Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- `ytc_Ugyfpts3f…`: "ah you see, there's the kicker. AI isn't smart! it just optimizes for whatever y…"
- `ytr_Ugz64sbcT…`: "@coloradoliving18 i would not doubt AI would take away certain type of jobs even…"
- `ytr_Ugy4KRHJ8…`: "Title of the bug report: Robot crushed human skull on boot. Engineering manager…"
- `ytc_UgwVomsJS…`: "Or number three, people learn to use automation to be better at their jobs. That…"
- `ytr_UgzdjFU1J…`: "@ThymeFromti Yes, humans give AI the task. That's my point. An AI doesn't have i…"
- `ytc_Ugxav2dZC…`: "the best way to fight that is not an universal income but rather an inhumane pro…"
- `ytc_Ugxf1ljBq…`: "Maybe it till get to a point where some people own a few robots and rent them ou…"
- `ytc_UgySRHjel…`: "I can't bloody stand prompters, their supporters and all this ugly AI art includ…"
Comment

> Just read a great book (More Everything Forever) about how deep the rabbit hole of AI doomerism goes, as well as some wild plans from legitimate tech leaders who want 10 quadrillion humans colonizing the galaxy by literally harnessing the energy of every star and the atoms in every planet.
>
> Great read that points out a lot of these people are driven by a fundamental fear of mortality and decide AI is needed to invent immortality (and solve all other human problems).

youtube · AI Governance · 2025-10-15T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyhusB7AZC1eVN4bcB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzLFydI3wfRNar7-op4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxRekLlkKU-uXU70i14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx844MNZVkI1Ho0V8l4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz_SaTH9vRJtNNqrS54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy7-AwN5naVzFMIhal4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyAoCHKeaB4Wikdfs54AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxRAbWMwR1al9AGT_t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzsIWHq6WvXAEbFnAt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwOW0iYMCstbyV00aV4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"resignation"}
]
```
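A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal validator; the allowed category values are inferred only from the samples shown here (the real codebook may define more), and the record ID is a hypothetical placeholder:

```python
import json

# Category values inferred from the sample responses above;
# assumption: the actual codebook may include additional labels.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "company", "distributed"},
    "reasoning": {"none", "mixed", "consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"none", "indifference", "approval", "resignation", "outrage", "fear"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response (a JSON array of coded comments),
    rejecting any record with a missing ID or an unknown dimension value."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Hypothetical single-record batch for illustration.
raw = ('[{"id":"ytc_example","responsibility":"none","reasoning":"mixed",'
       '"policy":"none","emotion":"indifference"}]')
batch = parse_batch(raw)
print(batch[0]["emotion"])  # indifference
```

Rejecting unknown labels at parse time keeps a model that drifts off the codebook (e.g. inventing a new emotion value) from silently contaminating the coded dataset.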