Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> Dogs can’t navigate our environment without any problems. Can a dog go to your counter and pour you a cup of coffee? The answer to that question is why the humanoid robot is of benefit. I don’t understand why he has such a hard time grasping this simple point. Also a centaur is a bigger more complicated piece of machinery. Plus, it’s ugly as fuck. Why would anyone want a big ugly heavy complicated robot when something simpler like a humanoid robot will do just fine? And actually better navigate our world and our vehicles? This guy is not as bright as I expected him to be lol

youtube · AI Governance · 2025-12-06T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxezesmSMPcQF3FEht4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw7qlOKQCaVE-O_jd94AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxlIUmop_aySlwFA714AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwK4qt3eHf9h_-nklp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzMXd1RTA_vVqKkJ7Z4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxwbLVs8sboawNy1s14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgxsGik4IBQgvvs8v5p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyGqcaO5jMkWR52cVd4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw9c-Sc0rU-NJjt7Lx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgymjetgbBorg5MJO_B4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
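A response like the one above can be parsed and sanity-checked before the coded dimensions are stored. The sketch below is a minimal example, assuming the value sets seen in this response (the full codebook may allow additional values, and the `ALLOWED` mapping and `parse_coding_response` helper are illustrative names, not part of the actual pipeline):

```python
import json

# Per-dimension values observed in this raw response; the real codebook
# may define more (assumption).
ALLOWED = {
    "responsibility": {"none", "company", "government", "ai_itself", "distributed"},
    "reasoning": {"unclear", "contractualist", "virtue", "consequentialist", "deontological"},
    "policy": {"none", "unclear", "regulate", "ban"},
    "emotion": {"indifference", "outrage", "approval", "fear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate each coded record."""
    records = json.loads(raw)
    for rec in records:
        # IDs in this dataset start with ytc_ (comments) or ytr_ (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgxezesmSMPcQF3FEht4AaABAg",'
       '"responsibility":"none","reasoning":"unclear",'
       '"policy":"unclear","emotion":"indifference"}]')
records = parse_coding_response(raw)
print(records[0]["emotion"])  # indifference
```

Failing loudly on an out-of-vocabulary value catches the common failure mode where the model invents a label outside the codebook.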