Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I do oversized overweight loads with hydraulic RGN don’t think an automated truc…" — ytc_UgwOMgBQU…
- "Why do we not see the obvious? I have noticed first if I said something it woul…" — ytc_Ugy9izuDK…
- "Wow thinking getting the artists designers and writers fired would take 5 years.…" — ytc_UgyKZ9Zxw…
- "@frances4797 Uhhh do you think self-flying helicopters and planes exist? I’m sure…" — ytr_UgwmESAAa…
- "There was another post about how after the third try, chatgpt refuses. It can be…" — rdc_oeuo0s6
- "What a load of bollocks……A robot could not do a tradies job. Crawl under a house…" — ytc_UgxU93GyL…
- "Yeah, AI is preventing kids from having the incredible experience, once enjoyed …" — ytc_Ugx4nezgn…
- ""Musk has no moral compass" / > Does Sam (Altman) has a moral compass? / "I don't kn…" — ytc_Ugz5OAH1s…
Comment
AI developers warning about the monster AI and how it's a good chance it will destroy all life on Earth....continues to develop AI. This isn't some unavoidable cosmic threat, like our sun going supernova. We can choose to stop anytime. It's not the AI holding a gun to our head, It's the developers that are doing so. This tells me that they are either hyping this up for clicks and it isn't true, it is true and they think they can have their cake and eat it too by being "Careful", or they believe it and eagerly want to be part of the process killing off Humanity.
youtube
AI Moral Status
2025-12-15T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx7JNfbcvWlgsraDR94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyYPftS1TpOsFeHs0F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzF2eBZVfkglB_garB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwncipLIvZXpDP72fN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxjJEOrEoSXOjjCqqF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyyY1SMsloxpUoPCct4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzxYWbzmNW9IHlBD1J4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyqQ59snl980pwFDL54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzeeuIipDwmUxx84hd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyuEWDxovxc8xKaQpN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
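The "look up by comment ID" view above can be sketched as a small parser that loads a raw response like this one and indexes its rows by `id`. This is a minimal sketch, not the tool's actual implementation: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown above, while the function name and the key-completeness check are illustrative assumptions.

```python
import json

# Two rows copied from the raw LLM response above (truncated for brevity).
raw = """[
{"id":"ytc_Ugx7JNfbcvWlgsraDR94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyuEWDxovxc8xKaQpN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]"""

# Field names taken from the response format shown above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw_json: str) -> dict:
    """Parse a batch coding response and index each row by its comment ID.

    Raises ValueError if a row is missing any expected dimension,
    so malformed model output fails loudly instead of silently.
    """
    indexed = {}
    for row in json.loads(raw_json):
        missing = EXPECTED_KEYS - row.keys()
        if missing:
            raise ValueError(f"row {row.get('id')!r} missing keys: {missing}")
        indexed[row["id"]] = row
    return indexed

codings = index_by_id(raw)
print(codings["ytc_Ugx7JNfbcvWlgsraDR94AaABAg"]["emotion"])  # -> outrage
```

With the rows indexed this way, rendering the "Coding Result" table for a given comment is a dictionary lookup rather than a scan over the whole response.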