Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The idea that robots will eventually get rights doubles as a way of dehumanizing Black people: equating actual machines with the historical struggles of people who were treated as machines and believed to be less than human--and still are, by many of the same people promoting these concepts. It's also crazy that people can more easily imagine robot rights than equal rights among humans.
A robot can't be a "slave" or deserve "rights" any more than a shoe or a saucepan can.
Source: youtube | Posted: 2025-09-17T11:1… | ♥ 368
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxWVKMHttR5ZT595pd4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwDeolbZsztKSHGOQx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwvs2p2UPEuZCX98fN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzhYzfqqTaUqrcVej54AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwrDwkjqBvRr528k7R4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxDyWZfACN4QWlE2j14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy55HHIdO4VW6KJtBd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz5V759UR2eSKt_pMp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzCVBnuGoIp5T0OUFN4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugxi4BlP3Pebc2EiN1l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
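A batch response like the one above is only useful if every row parses and every value falls inside the codebook. Below is a minimal validation sketch in Python; the allowed values per dimension are inferred from the sample output on this page (the full codebook may define more categories), and the function name `validate_batch` is illustrative, not part of any real pipeline.

```python
import json

# Allowed values per dimension, inferred from the sample LLM response above.
# Assumption: the actual codebook may contain additional categories.
SCHEMA = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "approval", "indifference", "unclear"},
}

def validate_batch(raw):
    """Parse a raw LLM batch response and index codings by comment id.

    Raises ValueError on rows missing an id or carrying out-of-schema
    values, so malformed model output is caught before display.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing id: {row!r}")
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim}={row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded

# Hypothetical one-row batch for illustration.
raw = ('[{"id":"ytc_x","responsibility":"user","reasoning":"deontological",'
       '"policy":"liability","emotion":"outrage"}]')
print(validate_batch(raw)["ytc_x"]["policy"])  # liability
```

Validating at parse time, rather than at lookup time, means a single bad row fails the whole batch loudly instead of surfacing later as an empty cell in the coding-result table.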