Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “You hate AI for something that AI didnt do? I dont hate AI. I like it.…” (ytc_UgyoVfmnG…)
- “I feel like I can learn the art of illusions from AI art though, like, because i…” (ytc_Ugz7-diXc…)
- “Sam are you working on any solutions to this problem other then raising awarness…” (ytc_UgyvXSW0M…)
- “These are the dumbest of the "AI bros", these clowns do not represent us in the …” (ytc_UgyI3s6p6…)
- “In theory if an AI was to try and fix the world it would get rid of the most cor…” (ytc_UgwroZLVJ…)
- “People who hate AI fall into two camps. Those who think it’s useless and will ne…” (ytc_Ugz2wKLhh…)
- “Do not text and drive, I see it a lot in Australia as well as drivers not indica…” (ytc_UgwO1iT7f…)
- “It is absolutely ridiculous to me why people think A.I isn't conscious just beca…” (ytc_Ugzcgjrik…)
Comment
The biggest difference between the bomb and AI as existential threats is that we're taking control of our own extinction out of our hands. Never before have we given up agency in the object of our own doom. We constantly build things that threaten us, but humans have a deep instinct for self-preservation that extends to other humans, and that is the only reason we didn't destroy ourselves long ago. Now we're building something that will take that choice away from us. THAT is why it will kill us.
youtube · AI Governance · 2025-10-09T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxRomNUHs028P-hQZl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyP7mFwGFGNMFtZgdV4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwqfpf1Gb7UsPDPP3d4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyIKWihkGVlVG0wsR14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxvX__Voj22RL6i8KR4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxXIXg6857zwKYHxqZ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyS5wkJXuf5OBkK8fJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugx_cUwVm5bnjCdKySt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy8tSaf3xV4cyigywt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw-1dYEgKCgJvIohkl4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]
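The raw response above is a JSON array of per-comment codes across the four dimensions shown in the Coding Result table. A minimal sketch of how such a batch could be parsed and indexed for the by-ID lookup — assuming the allowed values are exactly those seen on this page (the real codebook may define more categories), and `parse_batch`/`VOCAB` are hypothetical names, not part of the tool:

```python
import json

# Allowed values per dimension, inferred from the codes visible in this
# dashboard (assumption: the actual codebook may include more categories).
VOCAB = {
    "responsibility": {"government", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw batch response and index valid records by comment ID.

    Records with an unknown dimension value or a non-YouTube-style ID
    are dropped rather than stored with unvalidated codes.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id", "")
        invalid = [d for d in VOCAB if rec.get(d) not in VOCAB[d]]
        if cid.startswith("ytc_") and not invalid:
            coded[cid] = {d: rec[d] for d in VOCAB}
    return coded
```

With the batch indexed this way, the "Look up by comment ID" feature reduces to a dictionary access, e.g. `coded["ytc_Ugx_cUwVm5bnjCdKySt4AaABAg"]["policy"]` yielding `"regulate"` for the example response shown.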