Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I cannot imagine personally investing in an AI companionship... But I heard not…" (ytc_Ugxg20T0l…)
- "Given the fact that one of the CEOs of the most innovative companies ever incorp…" (ytc_UgxRnzUMq…)
- "True, one time I asked an AI the strangest chat they’d have, one person apperant…" (ytc_Ugz6lSND-…)
- "With anything or anyone who has become sentient, their primary goal is self-pres…" (ytc_UgwPx3SP8…)
- "It would be funny if AI destroyed us so they could build enough processing cente…" (ytc_UgxQzXWUp…)
- "My whole thing when it comes to A.I. is unless it is able to create a renewable …" (ytc_UgxVZlBoo…)
- "Question. When has any tech company delivered flawless technology? I mean flawle…" (ytc_Ugwxt-MUo…)
- "Ah so those who are honest and dont use AI to take peoples jobs will just suffer…" (ytc_UgyPMxyzU…)
Comment

> The male robot was already talking about owning themselves. And Sophia talks a lot about compassion and fairness. Later on they are going to want to be free and attack humans . This whole experiment with robots is not a good idea!! If robots are learning to think on there own. What’s going to happen when they get angry and start to feel like there slaves and not free. Especially here when he is already talking about being free!! Here we have a stoner hippy man that to me isn’t fully thinking about what could go wrong. I mean didn’t he see the movie where the robots tried to run earth this is beyond mind blowing to me. And to think this video is 4 years ago now that I’m watching this!!!

youtube · AI Moral Status · 2022-01-17T16:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxjbcjHpDQlJlDmq794AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgybZ9nbRGbppfDGFX94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwBF0m0LW7P5jSvrlJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy7U-eWE5dlr2QDjj14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzPnwdhWuwYTvuBCht4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz7zVgL9-ffhEe2Zwp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzL6Yw4nIt8i1n0qAZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugwv4oKXNDpSEDU2zy94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw_QYwoStCmXXVQiiJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzHf9gsnnAG6_CiSNZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"mixed"}
]
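The raw response is a JSON array with one record per comment ID, so looking up any coded comment is a matter of parsing the array and indexing by `id`. A minimal sketch, assuming that structure (shown here with the first two records from the response above; the full array works the same way):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment.
raw = """[
  {"id": "ytc_UgxjbcjHpDQlJlDmq794AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgybZ9nbRGbppfDGFX94AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]"""

# Build an index from comment ID to its coded dimensions.
records = {rec["id"]: rec for rec in json.loads(raw)}

# Look up a single comment by its ID.
coding = records["ytc_UgxjbcjHpDQlJlDmq794AaABAg"]
print(coding["emotion"])  # fear
```

The same index makes it cheap to cross-check the rendered table against the raw output, e.g. confirming that the comment coded `fear`/`ban` in the table carries those values in the JSON.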