Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- 14:53 Putting aside the stroke I had reading that comment... "your just stoppin… (ytc_Ugxr2SK6E…)
- 20 years?? Accelerating climate change and literally zero effort to now accelera… (ytc_UgzfHI6tG…)
- I literally got rejected by a university because apparently what I WROTE was AI … (ytc_UgzRI7zpP…)
- It's all because of MuskRAT's hubris and ego. He alone willingly made the decisi… (ytc_Ugw3JtDkh…)
- This AI only MASKS our PERCEPTUAL world.........It does not really think. It on… (ytc_UgzStfhW_…)
- Made up video, but make no mistake, AI will be the end of the human race.… (ytc_UgycSMCv4…)
- If AI Takes All Of Our Jobs... Who's Going To Buy Everything? They (the wealthi… (ytc_UgxZSOXpr…)
- People saying AI isn't conscious are missing the point. Go and look at what ants… (rdc_jmu6mrb)
Comment
ChatGPT will be good for us as long as we can control it. But we should leave it free to destroy all nuclear weapons if the crazy humans plan a nuclear war. In fact, that’s another reason this CEO is warning us- he has probably already been contacted by the warmongers and warned that they will not put up with technology that limits their ability to wage nuclear war. Actually, all we have to do is create machines with software that installs LOVE as their first concern. It is not that difficult to do, and I have the plans, but I lost them when I was flying between Venus and Mars on my disk. I’ll get back to you.
youtube
AI Governance
2023-05-17T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz9V4OHAROGAKiom0Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugz60Gho3GHoN7idMB14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxDBBNy2gR27gsQsH94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyGAHe0tauj1OsbyR14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzKGYlGuJ6okuIeZuJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgytykIe-b8DsmwskP14AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxydU-TzdSCpt_nDah4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy8Ew39V6gun_D87ep4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugyq1KsteNBOpHQiEF94AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwWQKOrY-3zd9n36KN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
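Because the model returns one JSON object per comment with four fixed coding dimensions, a batch like the one above can be checked programmatically before it is stored. The sketch below is a minimal validator; the `ALLOWED` value sets are only inferred from the labels visible on this page (the project's real codebook may include values not shown here), and `validate_batch` is a hypothetical helper, not part of the actual pipeline.

```python
import json

# Hypothetical allowed values per coding dimension, inferred from the
# labels visible in this page's results; the real codebook may differ.
ALLOWED = {
    "responsibility": {"none", "distributed", "company", "developer",
                       "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation",
                "indifference", "unclear"},
}

def validate_batch(raw: str) -> list[str]:
    """Parse a raw LLM response and return a list of validation errors.

    An empty list means every row parsed and every dimension held an
    allowed value.
    """
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError as e:
        return [f"response is not valid JSON: {e}"]

    errors = []
    for i, row in enumerate(rows):
        if "id" not in row:
            errors.append(f"row {i}: missing comment id")
            continue
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                errors.append(f"{row['id']}: bad {dim} value {value!r}")
    return errors
```

Running this over a response catches both malformed JSON (a common failure mode for raw model output) and out-of-vocabulary labels, so bad batches can be flagged for re-coding rather than silently written to the results table.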