Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I think that bill gates idea to tax automation and AI agent's is genius. It coul…" — `ytc_UgyuaFo3r…`
- "Dont read ahead if you dont want to listen to my boring little interaction, and …" — `ytc_Ugxt3Y_9x…`
- "Any ai company that steals my art is stupid, they be hurting their own efforts, …" — `ytc_UgwEaEsse…`
- "The answer is simple. Stop using AI. Buy an external hard drive and start storin…" — `ytc_UgyXtq20c…`
- "For AI.i screamed multiple times on X at Elon Musk that they must not give them …" — `ytc_UgzgP4j0N…`
- "Honestly I'm so tired of the "Well by YOUR logic.. Digital art isn't REAL art, h…" — `ytc_UgxvzKz1V…`
- "AI isn't real its a machine its a tool never more never less why should it be?…" — `ytc_UgzV3701m…`
- "would it not be less expensive just to pay a livable wage, rather than paying mi…" — `ytc_UgxNPFP09…`
Comment
And ASI is good for ASI, and literally no one else.
"If any company or group, anywhere on the planet, builds an artificial superintelligence using anything remotely like current techniques, based on anything remotely like the present understanding of AI, then everyone, everywhere on Earth, will die." — Yudkowsky, Eliezer; Soares, Nate. If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All (p. 7). Little, Brown and Company. Kindle Edition.
youtube · AI Jobs · 2025-10-21T10:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytr_UgxkM5ihKMaCM70EvC94AaABAg.AOQREQIZAfSAOQw7ipX83r","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytr_Ugy-iFOMPe5LXxWJaSB4AaABAg.AOPQFAQg8scAOXo9eoE_zk","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytr_Ugy-iFOMPe5LXxWJaSB4AaABAg.AOPQFAQg8scAOZpea-W1DC","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
 {"id":"ytr_UgzB1yZL8mjI4wjLQYt4AaABAg.AOLh6koFyV2AOOzu_4SZz5","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytr_UgxFkEkGZSD5dVTYVKt4AaABAg.AOLf721jOH4AQMQWNYCOy5","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
 {"id":"ytr_Ugz7fCtvJAKrMnoNPu54AaABAg.AOLOXvYddJ1AOWuCy0wBlw","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytr_Ugy3uic6KOq9aRPxyaV4AaABAg.AOJHsVwsbrcAOLSLdewbyy","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytr_UgyuKZ6jBJfDr3cgP4p4AaABAg.AOGvk9iiapMAOQOMH9VcAy","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
 {"id":"ytr_UgxUnheWbIhQa6x75qd4AaABAg.AOGg799kZ5HAOGiLJpxV_O","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
 {"id":"ytr_Ugz1CSNMZmw92hQuMMB4AaABAg.AOGf9cd2-gBAOGij8o70hA","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]
```
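The raw response above is a JSON array with one coding record per comment, each carrying an `id` plus the four dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of parsing that payload and looking up one record by ID — the field names come from the response shown above, while the variable names and the short sample ID are illustrative only:

```python
import json

# A trimmed stand-in for the model's raw output: a JSON array of coding
# records. Field names match the raw response above; "ytr_abc" is a
# hypothetical short ID used for illustration.
raw_response = """[
  {"id": "ytr_abc", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]"""

# Parse the array and index the records by comment ID for O(1) lookup.
records = json.loads(raw_response)
by_id = {rec["id"]: rec for rec in records}

rec = by_id["ytr_abc"]
print(rec["policy"])   # -> ban
print(rec["emotion"])  # -> fear
```

Indexing by `id` mirrors what the inspector page does: the coded dimensions shown in the table are simply the record from this array whose `id` matches the comment being inspected.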