Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up by its ID.
Random samples:

- ytc_UgyIK9KIQ…: My ChatGPT gave the following answer: Hey there — great question, and I appreci…
- ytc_UgwSEG7B-…: The only argument for humanoid robots, that makes sense to me is that the world …
- rdc_da40weu: Actually, I haven't heard a single English speaker ever refer to left wingers as…
- ytc_UgxmoASKV…: Wasn't there an issue last year (or the year before) where a helpline tried to u…
- ytr_UgxJWqd6y…: That's Pandora's Box for you. But to not attempt to control something like AI wo…
- ytc_Ugyn-XZKe…: They do have a default role they'll try to play if you don't beat them up about …
- ytc_UgwZOKTN5…: oh it was great for the one guy to lose his job cos now he can pursue.... a care…
- ytr_Ugz0hquEk…: Without artists Ai art wouldn't even exist dumbass the Ai is trained on real art…
Comment
I really believe there’s a “crazy Epstein-class” person who wants to build the most powerful AGI, then upload their consciousness (or multiple Epstein-class people’s consciousness into one), and use that AI to control all other AIs in the world. They might even see it as a way to become one unified entity able to offer “eternal life,” with no illness and no need for food or water, to other Epstein-class people.
Source: youtube | Posted: 2026-04-23T02:2… | ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxl3P4G3mLxaYybEkN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzxCc_TEyfF2YKFYNR4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw1UHGuX7cHnjLqu-Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzAT7Qo4XTV24AtOl94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxWLd0oO5eAk5tucpV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzj3ivsiOK1ApRPLBl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzhcJvBKX2jFRUhPCt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzNIRt9mB9vvd19ML94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz1eZsLaC7bV2mBehZ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxHgM0sUkjuWuJuU2B4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
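The raw response is a JSON array of records, one per comment, each coding the same four dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and sanity-checked follows. The allowed-value sets here are only those observed in the sample output above; the actual codebook may permit more values, and the parsing helper is hypothetical, not part of the tool.

```python
import json

# Example response in the same shape as the raw LLM output above.
# IDs are shortened placeholders; dimension values come from the sample.
raw = """
[
  {"id": "ytc_example1", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_example2", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

# Value sets observed in the sample; the real codebook may allow others.
OBSERVED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"unclear", "contractualist", "deontological",
                  "consequentialist", "virtue"},
    "policy": {"none", "regulate", "ban", "liability", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "resignation", "mixed"},
}


def parse_codings(text):
    """Parse the JSON array and reject records with unexpected values."""
    records = json.loads(text)
    for rec in records:
        for dim, allowed in OBSERVED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    # Index by comment ID for quick lookup, as the page above does.
    return {rec["id"]: rec for rec in records}


codings = parse_codings(raw)
print(len(codings))  # 2
```

Indexing by `id` mirrors the page's own lookup-by-comment-ID behavior; a record that uses a value outside the observed set fails loudly rather than being silently stored.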