Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Comment
Ai is a plague and needs to be eradicated. At no point should anyone be talking to it to have a conversation, and asking questions, what happened to independent fucking research?? Ai is sloppy garbage and needs to go. If its not being used for scientific purposes like engineering, math, physics, advanced modeling, etc it should be illegal.
| Platform | Topic | Posted |
|---|---|---|
| youtube | AI Moral Status | 2025-12-16T07:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy_EsRwWhiHz5m_GPl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyq-o_mbQLSnC20AjF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxPmX5XJO4ENh8QJpt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy3XJnMjeu7eYVAhPB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz-ImwdEeQmxa99MKR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy4-7LE6AY4Gbe36pZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwHfg8wjoo7hh_83PN4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwkSY4TA5RCvMaHVbB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz1PDBCHiliNYw9F2F4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwa3tlM-fVklrrDAsN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
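Looking up a coding by comment ID amounts to parsing the raw response and indexing the rows. The sketch below does that, validating each row against the dimension values that appear in the examples on this page; the value sets are an assumption inferred from those examples, not the full codebook.

```python
import json

# Allowed values per coding dimension (assumed from the sample rows above;
# the real codebook may define additional categories).
SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer", "company",
                       "user", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "mixed"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded rows) and index the
    rows by comment ID, dropping any row with an unknown dimension value."""
    out = {}
    for row in json.loads(raw):
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            out[row["id"]] = {dim: row[dim] for dim in SCHEMA}
    return out

# One row taken verbatim from the raw response above.
raw = ('[{"id":"ytc_UgxPmX5XJO4ENh8QJpt4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"outrage"}]')
codings = index_codings(raw)
```

A malformed row (say, a misspelled emotion label) is silently skipped rather than raised, which matches how a batch inspection view can still render the rows that did parse.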