Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgzhqCapr…: "AI is pushed in the workplace harder than DEI. It's not politically correct to b…"
- ytc_UgxgnWPEt…: "I wanna take a picture of my butt and watch AI make it talk. You think you can h…"
- ytc_UgxK5Xzaq…: "Basically Indian startup are solving either niche problem or unnecessary problem…"
- ytc_UgyRe0G8P…: "Just a bunch of uneducated drivers complaining about things they don't understan…"
- rdc_ohmf7cv: "If we’re sending robots to fight robots / Lets just stop doing wars at this point…"
- ytc_UgySYFAK6…: "Its just sad that this comment section has boiled down to "orange man bad" and o…"
- ytc_UgxcojRUd…: "They will no longer need our money. They can let us die, and the AI and robots w…"
- ytc_UgyI0sfLy…: "AI is not dangerous. It can help solve US of how to extract rare minerals. And i…"
Comment

> If it took humans less than 100 years of technology to advance and build AI, something hundreds of thousands of times smarter than the smartest human, how long will it take for AI to create something thats hundreds of thousands of times smarter than them. 😮

youtube · AI Governance · 2023-04-18T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugxw1ARKXXpIZO6dxvl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzgXH8V7DPa1z5g8ch4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwNPsCU-6IKHR0jaUp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyX8blT0orS2OEOXjZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzhoEEPrlKEhTse7qR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwdQYLuGCk34JdKBLN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxTO8J7cuilpnHpDNN4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgzEDrkrK2wmJrw0qx54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgyP8Q_jk1ByJa5m5j94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxEPwEEQ0L8cgg7PLZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
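The raw response above is a JSON array of one record per coded comment, each with the four dimensions shown in the Coding Result table. A minimal sketch of parsing and validating such a batch, assuming only the field names and value sets visible on this page (the real codebook may define more categories, and `parse_raw_response` is a hypothetical helper, not part of the tool):

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# Assumption: the full codebook may contain categories not seen here.
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "government", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "unclear"},
    "policy": {"none", "industry_self", "liability"},
    "emotion": {"fear", "outrage", "mixed", "indifference"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a batch-coding response and index records by comment ID,
    dropping any record whose dimension values fall outside SCHEMA."""
    coded = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            coded[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Example lookup against the first record from the response above.
raw = ('[{"id":"ytc_Ugxw1ARKXXpIZO6dxvl4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = parse_raw_response(raw)
print(coded["ytc_Ugxw1ARKXXpIZO6dxvl4AaABAg"]["emotion"])  # fear
```

Indexing by `id` mirrors the page's "look up a coded comment" behavior; filtering out-of-schema values guards against the model emitting a label the codebook does not define.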