Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The thing is that we already have this. It's called a train. The same thing was …" (ytc_UgzwSTbGo…)
- "I would not say how dangerous Ai could be it is already, one of my main concern …" (ytc_UgxGTeCb2…)
- "Said this before: If they can com up with a robot or biological hybrid where the…" (ytc_UgzhBq02J…)
- "If the truck is not a full self driving vehicle and your job is to take over whe…" (ytr_UgxxxzyXt…)
- "Ai..is the Antichrist..the vax is the mark of the beast…the beast is the Europea…" (ytc_UgzFfjdZQ…)
- "if ai advancedment is causing problems like people losing their jobs or that…" (ytc_Ugycwbl2l…)
- "Poor Geoffrey, he is having a 'OMG, what have I done?' moment. Trouble is, none…" (ytc_Ugz_hgPq5…)
- "Damn i'm losing faith in humanity by day... i guess we really are just destined …" (ytc_Ugykh9Ct4…)
Comment

> AI fighting to not be turned off isnt rocket science
> You trained it on human vernacular
> Humans desire to not die, therefore AI trained on humans does not want to die

youtube · AI Moral Status · 2025-12-15T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
{"id":"ytc_UgzvMukBiWfqTBBElwh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzsAlkW12I9RQL72eh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxyu-9Ff3NNdWNEaV54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw5Rma_HKFD62WaM5p4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgzKpPVLAaHSwZopfj14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyN3XAmCRlFoJGXB2V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugx49nQXVsUp8fR5oeh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzzddeMaGBVorGZL6V4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwmWOg7d5n5mE0Ch4B4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyLr8wi7R-ff0TQejp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
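A response like the one above can be parsed into a lookup table keyed by comment ID, which is what "look up by comment ID" implies. Below is a minimal sketch, assuming the four dimensions shown in the Coding Result table; the allowed value sets are inferred from the sample output and are an assumption, as are the function name `parse_batch` and the example ID `ytc_example`.

```python
import json

# Dimensions from the Coding Result table; value sets are inferred from the
# sample response above and may be incomplete (assumption).
DIMENSIONS = {
    "responsibility": {"ai_itself", "company", "developer", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"indifference", "outrage", "fear", "resignation", "mixed"},
}

def parse_batch(raw: str) -> dict[str, dict[str, str]]:
    """Parse one raw LLM response and index the codings by comment ID.

    Raises ValueError when a record is missing a dimension or carries an
    unexpected value, so malformed model output fails loudly instead of
    silently entering the coded dataset.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in DIMENSIONS.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {value!r}")
        coded[cid] = {dim: rec[dim] for dim in DIMENSIONS}
    return coded

# Hypothetical one-record response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
coded = parse_batch(raw)
print(coded["ytc_example"]["emotion"])  # prints "indifference"
```

Validating against a fixed vocabulary at parse time is a deliberate choice: batch LLM coding occasionally drifts to out-of-schema labels, and rejecting those records early keeps the downstream tallies clean.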