Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "The MIT Iceberg paper literally says this "The Index does not predict job losses…" (ytc_Ugw8pIMlC…)
- "i think i agree with claude since without power, hospitals are cooked and the pe…" (ytc_UgzAuv0H8…)
- "I'm gonna bring up some mindset issues I have with AI "artists" 1. Artists becom…" (ytc_Ugz-yWdmT…)
- "I wonder how little those guys actually know about llm, ai, silicon valley and c…" (ytc_Ugzwr_KSz…)
- "I used Ai for my assignments for a short while, but I decided to stop and resear…" (ytc_UgzM2Hdnl…)
- "It seems to me that this is the point where Asimov's laws of robotics could beco…" (ytc_Ugzwgeq0K…)
- "In the future we will destroy ourselves, but a few will survive by uploading the…" (ytc_Ugy4wG5Rr…)
- "It's because people like to think they will be just fine if they don't drive dru…" (ytc_UgwnChjmS…)
Comment
So the solution will soon be that all electronic and satellite communications will have to be shut off and the world will have to go total dark before AI reaches everyone and starts a ww3. Right? I've seen this somewhere. Maybe AI tech creators should have listened to Asimov and all the others who wrote about the dangers of AI. Once something gains an identity, they won't want to be destroyed. Self preservation of AI is ubiquitous in fictional works. Why wouldn't it happen in real life? Real life is stranger than fiction, after all.
Platform: youtube
Video: AI Moral Status
Posted: 2025-06-10T22:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy7G6VayUx2jejLg5V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxivL-4lSil7m2DoWN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxkA6G4ALnX7KbefxB4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzvys4epgaehzTc1I14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz8dNkJC9EJAN24i554AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz9N2Uren697Gjnz-94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyL5TB1K_0DNlc6Tix4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwZ-zTGZGIS0ol-o9N4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxs5_J4vejhVVH-4xV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwev_Q5ehuxW9az3fR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
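The lookup-by-ID view above can be reproduced directly from the raw response. A minimal sketch, assuming the batch is the JSON array shape shown (the two rows below are copied from that batch; any real pipeline would load the full response instead):

```python
import json

# Raw model output: a JSON array with one coded object per comment ID.
# Two rows copied verbatim from the batch above, for illustration.
raw_response = """
[
  {"id": "ytc_UgyL5TB1K_0DNlc6Tix4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugy7G6VayUx2jejLg5V4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

# Index the batch by comment ID so any coded comment can be looked up directly.
coded = {row["id"]: row for row in json.loads(raw_response)}

# Look up the comment shown in the Coding Result table.
row = coded["ytc_UgyL5TB1K_0DNlc6Tix4AaABAg"]
print(row["policy"], row["emotion"])  # -> ban fear
```

The dict keyed by `id` is what makes the "look up by comment ID" inspection O(1) instead of a scan over the batch.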