Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples (click one to inspect)
The worlds simply not ready for beyond human level A.I. maybe one day in the fut…
ytc_UgzhNbZtr…
I disagree. I have been working as a freelance translator for close to 15 years.…
ytc_UgxINuAFX…
At 45 min in, Re : Mr. Hawley: The simple reason that AI will be able to predic…
ytc_UgwepCpDj…
We can't even get our government to arrest a bunch of criminal pedophile murdere…
ytc_UgyyRmEZL…
More money funneled to the corporation, and screw the common worker. When everyt…
ytc_UgwjnmROy…
Self-driving cars are not the future. They’ve been here and they work. While on …
ytc_Ugw08XGO7…
The real threat from AI is to people who decide not to think for themselves. Stu…
ytc_Ugz0K5g_V…
“All of your secrets” well we tell those same secrets to a therapist but they ai…
ytc_Ugwp3pWI7…
Comment
I give us 5 years at best before control is lost and AI starts to destroy us . Banks will have all their money stolen untreatable virus,s will start to appear ,the race for autonomous weapons gathers pace has already started, a nuclear weapon might be accidentally activated but retaliation will occur ,the jobs market will have collapsed and any semblance of cooperation between world leaders will have gone as trust is eroded. My greatest fear is that its to late to do anything the geni is out of the bottle 😮
youtube
AI Governance
2025-09-09T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugx5sSg3OJLqhWt16L54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw-2brree5pM1cEgal4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxXiw3mQZII-DBcjoV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw3UHNGTydNaDz7mwB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyISNFAmDQnB7fczkV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyNbfCtjVTFsPGdQDB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyxh2zZzeqqmNiOdCN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgyhiGcVqbAUgRml7tR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxMO88CNTyAfuPN4o14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxNNqEdFG60etPeaE14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"}
]
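The raw response above is a JSON array of per-comment records, one object per coded comment. A minimal sketch of parsing and validating such a batch before writing it back to the coding table. The allowed label sets below are inferred only from the values visible in this dump, not from the full codebook, and the function name is illustrative:

```python
import json

# Allowed labels per dimension, inferred from the values seen in this
# dump (hypothetical -- the actual codebook may define more labels).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed", "unclear"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs in this dump all carry the ytc_ prefix.
        if not str(rec.get("id", "")).startswith("ytc_"):
            continue
        # Every dimension must be present and carry a known label.
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgxMO88CNTyAfuPN4o14AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(parse_raw_response(raw))
```

Validating before ingestion matters here because the model occasionally emits labels outside the codebook; dropping (or flagging) such records keeps the coded table consistent with the dimension values shown above.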