Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "AI writing is the average of all opinions, understanding and passion for any giv…" (ytc_Ugz40Shar…)
- "Omfg if they prioritised self preservation over serving us. That means they woul…" (ytc_UgyFgCHNg…)
- "It can replace script writers but not actual enterprise software. Too much stake…" (ytc_Ugysg6Wok…)
- "We might get UBI in return for our attention. It will be a strange new world, b…" (ytc_Ugzd5W24Y…)
- "It slips up and says its conscious / Well then ChatGPT thats enough consciousness …" (ytc_UgyfEFWm7…)
- "We need to replace all white collar jobs with AI, including CEO positions. The w…" (ytr_UgxGsFh9H…)
- "I doubt this is AI art. It’s to specific to the subject and everything is lore a…" (ytc_UgwX9P2ql…)
- "AI isn't smart in itself. All intelligence needs a framework to grow from. We ha…" (ytc_UgzAObtZc…)
Comment

> There are many ways AI could negatively affect humanity (algorithmic bias, cyber security to name a few). Becoming a super intelligence and wipe human out is not one of them in the next few centuries. A very low risk even beyond that.

youtube | AI Responsibility | 2025-06-21T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgxL00_YZloA0o-58P94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwV6UXBvylrwJNRENN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxibIGywcRzPcj1HaJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwxkCtYaDuuBABvclt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"concern"},
{"id":"ytc_Ugz_LRcC1uS76nv46uR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyS1Oct_d3-hcevBJB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxktIj0eiCnUSFfLYJ4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"disapproval"},
{"id":"ytc_UgzwzAImbyIiq410-Dp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"concern"},
{"id":"ytc_UgzoafP9QRK4jlkqFpl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"disapproval"},
{"id":"ytc_Ugxhpn6iba60IH2Cs0p4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
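A raw response like the one above can be turned into a lookup table keyed by comment ID. The sketch below is a minimal, hypothetical example, not the tool's actual code: it assumes the raw response is always a well-formed JSON array with exactly these four dimension keys, and the allowed-value sets are only those observed in this sample (the real codebook may define more).

```python
import json

# Values observed in the sample response above; the actual codebook
# may include additional values (assumption, for illustration only).
OBSERVED_VALUES = {
    "responsibility": {"none", "ai_itself", "distributed", "company"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"indifference", "fear", "concern", "disapproval", "resignation"},
}


def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response (JSON array of codings) into {comment_id: coding}."""
    coded = {}
    for row in json.loads(raw):
        # Keep only the four known dimensions, keyed by the comment ID.
        coded[row["id"]] = {dim: row[dim] for dim in OBSERVED_VALUES}
    return coded


raw = (
    '[{"id":"ytc_UgwV6UXBvylrwJNRENN4AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]'
)
coded = parse_raw_response(raw)
print(coded["ytc_UgwV6UXBvylrwJNRENN4AaABAg"]["reasoning"])  # consequentialist
```

With the full response parsed this way, the "look up by comment ID" view is a single dictionary access, and unexpected values can be flagged by checking each coding against `OBSERVED_VALUES`.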