Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `ytc_Ugwr-daCr…`: "This guy is obsessed with AI, his ideas are never coming true in his timescale…"
- `ytr_Ugy-OOqcu…`: "@dauchande LLMs are very clearly shown to deal better with confabulations the be…"
- `ytc_Ugw6uUggq…`: "I don't remember where I heard it, but I remember learning that its possible if …"
- `ytc_UgxpU3Ug_…`: "Im gonna order a shit ton of sex robots so that the robot workers eventually bec…"
- `ytc_UgwByvHDB…`: "It's actually incredibly sad that these types of pro-AI people are so adamant on…"
- `ytc_UgxlamBwX…`: "you wont stop the robots or A.I just by not buying from amazon. You would have …"
- `ytr_UgwQ3Mz8k…`: "I totally get where you're coming from! The idea of engaging with enhanced human…"
- `ytc_UgzoMyZi6…`: "One day, not far in the future, the streets outside the schools filled with rage…"
Comment

> I use to play around with this idea on pallafiumbooks heroes and I started out as a good guy. If this AI was not aligned to a moral compass then one of two things will happen, it will try to destroy the human race or coexist in some way or get off the planet.

youtube · AI Responsibility · 2026-02-13T23:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx2MWVJgiLmu3TbsIF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwWMSlcdWpK8H8ndp54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy6P3FMsCkkkNQXBuF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz7e5eK_nUj4xVDrBJ4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy0xXqLTVuv0R-fxiR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyeICJabngzF8RCF214AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzc6Hv9a4yY6pzafJd4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwgxTuxr5GshzUOltp4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyZqbe2lHcEq8QElIZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy0425joqmE211nPSZ4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
```
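A raw response like the one above has to be parsed and checked before the codings can be trusted. The sketch below shows one minimal way to do that in Python; the allowed value sets are inferred from the examples on this page (they are assumptions, not the project's authoritative codebook), and `validate_coding` is a hypothetical helper name.

```python
import json

# Allowed codes per dimension, inferred from the sample codings shown
# above -- treat these sets as assumptions, not the official codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "government",
                       "developer", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological",
                  "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval", "mixed"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Drop anything that isn't an object with a comment ID.
        if not isinstance(row, dict) or "id" not in row:
            continue
        # Keep the row only if every dimension holds an allowed code.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear",'
       '"emotion":"fear"}]')
print(len(validate_coding(raw)))  # 1
```

Rows with out-of-vocabulary codes are silently dropped here; a real pipeline would more likely log them for manual review, since each dropped row is a comment left uncoded.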