Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I think that AI can become dangerous because when AI advances further in the fut…" (ytc_Ugxnx1FQC…)
- "@Draco-ug7mv, thank you for sharing your thoughts on the video! Your comment had…" (ytr_UgxxSmN6-…)
- "One more, if the much better human mind created the self driving system, wouldn'…" (ytc_UgygEs_7G…)
- "AI can’t do decent prose nor physical work with anything but metal and plastics …" (rdc_j4zcgcx)
- "AI bros just genuinely don't seem to get the "passing it off as art" bit, you ar…" (ytc_UgxbebFgt…)
- "It is a great deception to state that AI can be conscious (in the way a human is…" (ytc_UgzkqL5bs…)
- "This article is maybe the biggest piece of fear mongering bs Ive seen in quite s…" (rdc_oi3wv1u)
- "Since AI is learning from human knowledge free of copyright.. Only fair that AI …" (ytc_Ugz6muggR…)
Comment
Please don’t make them happen, we all are living thing, who gonna die if we make it happen. So Please, i say not to that…… don’t give them our human intelligence. We and our rest of the biological living things will die for our belief. So please Stop it. The reason what I’m gonna say is, stop sharing that to AI, because we human, i know we gonna share it. Ultimately they will won’t win the war., upon life. If we share it. So please don’t do it. AI is more more dangerous to our lifetimes.
Source: youtube · AI Harm Incident · 2023-12-30T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
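Before a coded record like the one above enters analysis, it can be checked against the fixed label set. A minimal sketch in Python; the allowed values here are inferred only from the codes visible on this page and are likely a subset of the full codebook:

```python
# Allowed labels per dimension, inferred from the codings shown on this
# page (assumption: the real codebook may contain additional labels).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "approval"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems with one coded record (empty if valid)."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

record = {
    "id": "ytc_UgyXcHhAlC_om_Jiz0R4AaABAg",
    "responsibility": "developer",
    "reasoning": "consequentialist",
    "policy": "ban",
    "emotion": "fear",
}
print(validate(record))  # []
```

Records that fail validation can be flagged for re-coding rather than silently dropped.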
Raw LLM Response
```json
[
  {"id":"ytc_UgyXcHhAlC_om_Jiz0R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwAif2-Ho3T96Z5oFh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwUM_CyRkxPdcNiG8V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyBs4ndZAEXiZHgZHx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugw0W69Oy7RMb7b5MSt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugx2zHDOFlcFhteE9Bh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxvP8SVQz9WY6gshb54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz3DEsTRxwuSrIFipF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugyxr3cV28JVLNK7uc14AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwNeEETZvSXQYc4otF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
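The raw response is a JSON array of per-comment codings, so looking up the model output for a given comment ID amounts to parsing the array and indexing on the `id` field. A minimal sketch, assuming the model reliably returns valid JSON (in practice the output may first need stripping of markdown fences or repair); the two records are copied verbatim from the response above:

```python
import json

# Two records taken verbatim from the raw LLM response shown above.
raw = """[
  {"id":"ytc_UgyXcHhAlC_om_Jiz0R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugx2zHDOFlcFhteE9Bh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]"""

def index_by_id(raw_response: str) -> dict:
    """Parse a batch coding response and key each record by its comment ID."""
    records = json.loads(raw_response)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw)
print(codings["ytc_UgyXcHhAlC_om_Jiz0R4AaABAg"]["emotion"])  # fear
```

Keying on `id` also makes it easy to detect comments the model skipped or coded twice in a batch, by comparing the dictionary's keys against the list of IDs that were sent.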