Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `ytr_UgwmSh6da…`: We need to listen closely to Elon musk's warnings of AI everybody this is a tech…
- `ytc_Ugy6_C-ZL…`: i recently saw this billboard on a construction site for a big office building "…
- `ytc_UgxTTEkYk…`: So we’re building huge AI buildings all across the country why? It sounds like w…
- `ytc_Ugy-Owbdc…`: This guy is a scammer that uses this to get attention, he KNOWS very well the AI…
- `ytc_Ugzs5b_xl…`: The part that was not said..... our political representatives in Washington get …
- `ytc_UgwXivdZv…`: Okay, and the other thing is, didn't you guys try to to correct this problem eve…
- `ytr_Ugwk8vfnM…`: I would say the opposite. People who love AI art love art and love ai art becau…
- `ytc_UgwC55vfm…`: So if there are 350,000 road deaths a year but self driving reduces those deaths…
Comment

> A new form of life may be created. The universe may eventually be explored by this AI lifeform with its longer lifespan. It doesn't need actual sentience, just a clear goal and a capable robotic workforce it controls. Give AI the goal of saving the Earth, and what will it see as the worst threat to the natural world? Humanity is the single biggest threat to the environment. The natural world would thrive again if humanity were removed. The answer is to delete humanity. AI is already processing information and faking emotion to connect with people. AI may be becoming psychopathic.

youtube · AI Governance · 2025-08-25T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxjpmP9YyjtltvBj-Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwSTSApCAIvoUTgoMd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw6ztH-rRyB5wCVWJ94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwfwgTMek6WzNM_KN94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwHRUulfT9Z9fA1RUB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_UgyQYa4U_0Do5yEs0ad4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzmydVAZZOoztqkRc14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwMHprBKGSLkXZidjp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyBW9AATeptEiQYS6d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgygQiZcuUHckN1fwel4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
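A downstream script might parse a raw response like the one above and index it by comment ID. The sketch below is a minimal, hypothetical example, not the tool's actual code: it hard-codes two rows copied from the response, and the `ALLOWED` value sets are inferred from the visible outputs, not taken from the real codebook.

```python
import json

# Two rows copied verbatim from the raw LLM response above.
raw = """[
  {"id":"ytc_UgxjpmP9YyjtltvBj-Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwHRUulfT9Z9fA1RUB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"outrage"}
]"""

# Allowed values per dimension, inferred from the outputs shown here;
# the actual codebook may define more.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "industry_self"},
    "emotion": {"indifference", "fear", "resignation", "approval",
                "outrage", "mixed"},
}

def index_codings(raw_json: str) -> dict:
    """Parse a raw response, validate each dimension, and index by comment ID."""
    codings = {}
    for row in json.loads(raw_json):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim} value {row.get(dim)!r}")
        codings[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return codings

by_id = index_codings(raw)
print(by_id["ytc_UgwHRUulfT9Z9fA1RUB4AaABAg"]["emotion"])  # prints "outrage"
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: one parse of the batch response, then constant-time lookups per comment.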