Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response directly by comment ID.
Random samples:

- "I mean if an AI recommends you replacing a common food item with a random chemic…" (ytc_Ugx2C2Sad…)
- "The problem with people like Dr Yampolskiy and his doomsaying is they have no re…" (ytc_UgwRk77tH…)
- "I actually disagree, we might be missing a bigger point. Even if the work is pro…" (ytc_Ugwxdogz3…)
- "Dumb robots will end man kind someday and will replace us employment wise war wi…" (ytc_UgxF9Hjhn…)
- "Bro, the AI is not a database that gives you the same answer each time. Also you…" (ytc_Ugws0V8Lu…)
- "Do you realise that no gadgets are allowed during exams to cheat from? While AI …" (ytr_UgzE51Hc6…)
- "To all people saying its different beacuse”its just a machine”please learn what …" (ytc_Ugy1D9tEr…)
- "This from the jackass who taught his own AI to lie because the truth doesn’t sup…" (ytc_Ugxjr54cL…)
Comment

> Look up the Georgian Guide stones that got blown up. What they want to achieve. There 10 commandments. Countries are broke 💔 and in debt .And all of sudden Introducing Ai Artificial intelligence. No jobs and they reckon 🤔 they are going to pay a fixed income .And they are broke 💔 yeh Right. Blind Freddie knows there plan. We won't be happy and filthy eaters. We will be lucky to afford food on a fixed income. Look at the millions of Homeless worldwide already. Living in Tents ⛺️ is the Future generations are going to live on ✋️ handouts and watch videos all day .No thanks not a world we want .The one percent controlling the world. The 99 percent soon won't be happy. ❤🎉

youtube · AI Jobs · 2026-02-10T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwSZWnwalkWUBfPp_l4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwqYPY-EKyuvVT4XmR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz7zhCp4Gan7aTbMft4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyAeUlVMOU6iYRlnQp4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyxBu9HpYAauOZLFTB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugw9DpVIZmDf-w2CDbF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxEOb4oyzJZrJMHoKl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgzXRbznOWjUs3wgoHl4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxLYJvxLkQ2AItr4fN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugw9ZrsMsMtWIJiz-0t4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
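The raw response above is a JSON array of per-comment records, each carrying the same four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and validated before the codings are stored — assuming only the record format shown; `parse_raw_response` and the `DIMENSIONS` tuple are illustrative names, not part of the tool:

```python
import json

# The four coding dimensions, taken from the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: coding}.

    Raises ValueError if any record lacks an id or a dimension, so
    malformed model output surfaces instead of being stored silently.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record without id: {rec!r}")
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{cid}: missing dimensions {missing}")
        coded[cid] = {d: rec[d] for d in DIMENSIONS}
    return coded

# Hypothetical single-record batch in the same shape as the response above.
raw = ('[{"id":"ytc_x","responsibility":"government",'
       '"reasoning":"deontological","policy":"unclear","emotion":"fear"}]')
coding = parse_raw_response(raw)
print(coding["ytc_x"]["emotion"])  # fear
```

Keyed by comment ID, the parsed dict supports exactly the lookup the page offers: given an ID, return the coding the model produced for that comment.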