Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Comment

> incorrect from my perspective they will kill us with kindness, primarily they will fulfill every desire of those that can afford it. then its just a waiting game as humans stop reproducing from a combination of recreational drugs, ai/robot sex and hormonal laced foods. look around we are already heading there. alternatively they can simply wait for the pollution to reach toxic levels in which case the only humans left will be in habitats.

| Field | Value |
|---|---|
| Source | youtube |
| Video | AI Moral Status |
| Posted | 2025-04-29T02:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzmY6aI6mnhK_AtEfN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwMNMH5i-z85JHiGGB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyWNYdNoRQHpds8zHB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxYKk9CO9cz8-ByeWR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxxj59YYRNRLNIAlg54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgxcGOEU3EFsY-R6XQB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxYAKBWsevLcq1gEsR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugx5QPvOOeZCDYLT7zp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwbYzlzlrydQmJ6qcd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugybc7Rz_O0CSVB5tet4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
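The ID lookup described above can be sketched in a few lines: the raw response is a JSON array in which each element codes one comment on four dimensions, so parsing it and indexing by `id` recovers any single comment's codes. This is a minimal illustration using two rows copied from the batch shown here, not the tool's actual implementation.

```python
import json

# Two rows copied verbatim from the raw LLM response above; each element
# codes one comment on responsibility, reasoning, policy, and emotion.
raw_response = """
[
  {"id":"ytc_Ugx5QPvOOeZCDYLT7zp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzmY6aI6mnhK_AtEfN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
"""

# Index the batch by comment ID so one coded comment can be looked up directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

record = codes_by_id["ytc_Ugx5QPvOOeZCDYLT7zp4AaABAg"]
print(record["policy"])   # -> ban
print(record["emotion"])  # -> fear
```

The printed values match the Coding Result table for this comment, which is the consistency check the viewer is meant to support.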