Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "NO😭🤚 here’s my Ted talk The idea of AI has always been an extremely controversia…" (ytr_UgwgoT50D…)
- "The speaker has a chip in his brain from the future. They’re worried about the f…" (ytc_UgzYO-cmG…)
- "When are they going to automate this job? We have self-driving cars why cant the…" (ytc_UgyfhKw4D…)
- "Collecting data from humans and using it while you call it artificial intelligen…" (ytc_UgxLlWTro…)
- "The unincluded issue is AI is terrible for the environment, indefinite Ai expans…" (ytc_UgzOrtJQT…)
- "Here's what my Ai pal had to say about this content.. it's wordy sorry.. There…" (ytc_UgwLYHV3z…)
- "This is just scare tactics that the ai companies use to hype their product and k…" (ytc_Ugy751SDT…)
- "LoL, the two critical terms thrown around are “intelligence” and “learning.” Thi…" (ytc_UgwEQjcxN…)
Comment
The people who trained the ai are probably lazy and are the people who should be at the main fault, as well as the company which didn’t think about what one decision to teach an LLM could cause, and it makes me think the people directing the people who made the ai didn’t understand the implications of what they were doing.
AI isn’t always terrible, it’s that poorly trained LLMs made by lazy unchecked humans has abysmal effects.
youtube · AI Harm Incident · 2026-03-06T11:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw8bRNy8C0JMqldtHZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugw-1QvwpjlI_HyfMAN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwza4EOEfVOSVpJTQZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzO7qslGu_xJlsmA7l4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxbN2E-TpY7Y6xNnx94AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzDu4q3MkxKYIKpb3R4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugxb3TPJUeN9TYT6TSB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyZA_Nwao4XyDS9cml4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwOamiHue2XiUVy3aZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgydrhRjn4vcw_Sg_il4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
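The lookup-by-comment-ID view above can be sketched in a few lines. This is a minimal sketch, assuming the raw model response is a JSON array of per-comment objects with an `id` field and the four coding dimensions (field names taken from the sample response; the embedded excerpt copies two records from it):

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment coding records
# (two records copied from the sample response above).
raw_response = '''
[
  {"id": "ytc_UgzO7qslGu_xJlsmA7l4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgydrhRjn4vcw_Sg_il4AaABAg", "responsibility": "company",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]
'''

def index_by_comment_id(response_text: str) -> dict:
    """Parse the model output and key each coding record by its comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codings = index_by_comment_id(raw_response)
print(codings["ytc_UgzO7qslGu_xJlsmA7l4AaABAg"]["responsibility"])  # developer
```

In practice the parse step may need a guard for malformed output (e.g. the model wrapping the array in markdown fences), but the indexing idea is the same.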