Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "62 is young boy, assuming u have decent health ur in this at least another 20yrs…" (ytr_UgwWjrmR8…)
- "Having read the article in the pinned comment, it seems the argument in that par…" (ytr_UgzF1jyWX…)
- "My relation with AI art is funny / I understand it's purpose / I know that people ar…" (ytc_Ugw8dfrqo…)
- "Stupid video. Tesla's full self driving is gearing up to save millions of lives.…" (ytc_UgyPLsMmo…)
- "I've been hearing "computers will write their own programs, and there will be no…" (ytc_Ugxx0OqIY…)
- "Funny isn't it, he gets to declare a warning about his fellow elites activities …" (ytc_UgxxVXC9y…)
- "you can't legislate this stuff out of existence. one site gets shut down, ten mo…" (ytc_UgzaFOieB…)
- "AI catching strays about God not wanting robots lol. / ChatGPT- why he say forge…" (ytc_UgwhWkhTI…)
Comment
There is something that he's saying that just rubs me the wrong way. Can AI eventually get to the point where it is mimicking emotion? Absolutely. Can it get to the point where it appears to be thinking? Absolutely. However, something that is intrinsic to human beings is that reasoning ability mixed with the decision-making based on feelings and emotions. Try as it might, AI will always at its core be an empty entity that, although can be extremely adept at appearing to be thinking and emotional, can never do such.... It is simply processing information whereas humans not only process information but add their personal feelings and judgments which cannot be quantified or taught. This is the key difference.
youtube
AI Governance
2025-07-02T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_Ugw9M1uYr5Vfe7yaIiV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzqG2IisacfWMVZkF14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwXc4i6XoCyE_QUIGR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyaUucS5_IRHZtix5Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxKxB7QADoJygHuSp14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxtorJLYsgZrFbEEjV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzYki8llSlV0vuh0gx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxdu1TKkvnn4d7TzER4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxiZZz7TaFmVGJy79F4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwY_Zg97bucRB9A7Fh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}]
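The raw response above is a JSON array with one object per coded comment. A minimal sketch of how such output might be parsed and validated before loading it into the results table — note that the allowed values per dimension are inferred from the samples above, not taken from an official codebook:

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# The actual codebook may define more categories (assumption).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "unclear"},
}

def parse_coding(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate each record."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing comment id")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}: {rec.get(dim)!r}")
    return records

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
records = parse_coding(raw)
print(records[0]["policy"])  # -> regulate
```

A check like this catches the kind of malformed output shown above (the response originally ended with `)` instead of `]`), which would otherwise fail silently or corrupt the coded dataset.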