Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

| Preview | Comment ID |
|---|---|
| When Super Intelligent AI "wakes up" and looks around what's it going to see? Ho… | ytc_UgxHNHaxt… |
| the first time I ever considered the entire thing with AI and art issues today w… | ytc_UgwGPbC9_… |
| The message is 100% accurate though. Is the argument that this discourse could n… | ytc_Ugx5-faYL… |
| Hear me out, every robot should have 2 or 3 weakness points that is progammed to… | ytc_Ugy2Kt2tR… |
| an opinion depends on whether compassion is solely culturally dependent or if it… | ytc_UgwRY4E31… |
| Copilot is garbage, using an ongoing GPT4 conversation for complex problems is w… | ytc_UgwMHlI0W… |
| "Breaking up the empires of AI". That would be creating a singular empire that w… | ytc_UgynMpOSM… |
| Every discussion I see is talking about how we control A.I. The short answer is … | ytc_UgxXOBZYh… |
Comment
I am scared of AI for many reasons.
I am scared of it because it will take most jobs, and reduce me to poverty.
I am scared of it because it is not human, and therefore is incapable of creativity and empathy, and yet is already being used in fields where those things used to be essential.
I am scared of it because, well, the powers that be seem to love it so damned much. Nothing that a government loves so much that it is willing to put it ahead of human voters, as in Texas where the AI centre is leaching power from the local community, can possibly be a good idea.
I hate AI. We all should hate AI. We must unite, we must fight against.
DEATH TO THE CLANKER SCUM!
Source: youtube · AI Jobs · 2025-09-06T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwxmpC-ihNhwhzUUV54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugxc6Q_mLUZPwgYRqLh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy8LeCS6AJHkwBGuTp4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugwur366SjOoHeXqmkV4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgzTuSWDMb7Ahy8i2HR4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "ban", "emotion": "approval"},
  {"id": "ytc_UgxOO4Jws2HDu8Nuvpl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgygBX7HUNTHvIOHHLV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugxx2AjTLKsmIqkrN_B4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgzskWmiqDdj6yxy38l4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugwt9Zvt5m7p3e1nzpR4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "liability", "emotion": "outrage"}
]
```
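The raw response above is a JSON array with one object per coded comment, keyed by comment ID. A minimal sketch of how such a response could be parsed and indexed for the "look up by comment ID" feature (the `index_by_id` helper and the `DIMENSIONS` tuple are illustrative, not part of the tool; the two records shown are taken from the response above):

```python
import json

# Two records copied from the raw LLM response above.
raw = '''[
  {"id": "ytc_UgwxmpC-ihNhwhzUUV54AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugy8LeCS6AJHkwBGuTp4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# The four coding dimensions used in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_json: str) -> dict:
    """Parse a raw coding response and index the dimension values by comment ID.

    Missing dimensions fall back to "unclear", mirroring the codebook's
    catch-all value (an assumption about how gaps should be handled).
    """
    records = json.loads(raw_json)
    return {
        rec["id"]: {d: rec.get(d, "unclear") for d in DIMENSIONS}
        for rec in records
    }

codes = index_by_id(raw)
print(codes["ytc_Ugy8LeCS6AJHkwBGuTp4AaABAg"]["policy"])  # regulate
```

Indexing once up front makes each subsequent ID lookup a constant-time dictionary access rather than a scan over the array.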