Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "Show a computer the first 3 Star Wars films. Ask it to create an entertaining pr…" (ytc_UgxEmlOce…)
- "Or.. hear me out. We could STOP developing AI robots for the good of our own spe…" (ytc_Ugx0kHENE…)
- "I'm pretty sure a scientist tried to copyright the art his AI made though it fel…" (ytc_UgzKUUqRV…)
- "Thanks mam. I'm am now a student and wanted to know the basics of artificial int…" (ytc_Ugycbt3j7…)
- "That's the point. By talking to AI you're building a legacy - creating a lasting…" (ytc_UgwoFtxIm…)
- "Artificial intelligence are taking over. Soon you will not need policeman or hum…" (ytc_UgycZcaSP…)
- "> I ask instead for a realistic path from here to there considering immediate…" (rdc_dcihhzj)
- "We put up a sign saying, \"This is a sentient-AI-free zone,\" so we don't have to …" (ytc_UgwRWZ2m3…)
Comment

> We live in a god damn dystopia. I hate ai and I don’t want an ai doctor on my or an ai teacher. Ai makes me want to kill five people why CANT ai do the dishes or useful things instead of just wasting my life. If ai was a person they wouldn’t live to see tomorrow (was that too far?)

youtube · AI Jobs · 2025-09-09T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxdv_isdGrZTs1dNtV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxJW3Ah5zT5X_MpKlx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwT3r2Zb4QlrqLingp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw7lOT_p_cEobL0FPJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzs8AkVKVuS_Wd0fzR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxkPEhMuMpMjgDAKFF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw5EcuDLNEOpiIY7aZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxPi_fZH1MjKy60ejp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz1Nq9btJi_gOh53cx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz_6tn55GgEgtaDGl94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
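The raw response is a JSON array with one object per coded comment, carrying the four coding dimensions (responsibility, reasoning, policy, emotion) keyed by comment ID. A minimal sketch of parsing such a response into a lookup table, as the "look up by comment ID" view implies; the function and variable names here are illustrative assumptions, not the dashboard's actual code:

```python
import json

# Illustrative two-row excerpt in the same shape as the raw response above.
RAW_RESPONSE = """
[
  {"id": "ytc_UgxkPEhMuMpMjgDAKFF4AaABAg",
   "responsibility": "ai_itself", "reasoning": "deontological",
   "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugzs8AkVKVuS_Wd0fzR4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]
"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict[str, dict[str, str]]:
    """Parse the model's JSON array and key each coding by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: {dim: row[dim] for dim in DIMENSIONS} for row in rows}

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_UgxkPEhMuMpMjgDAKFF4AaABAg"]["emotion"])  # outrage
```

With the codings indexed this way, rendering the per-comment result table is a dictionary lookup rather than a scan of the raw model output.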