# Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below:

- "the problem is once every company starts replacing their workers with AI you'll…" (`ytr_UgwEsOy6U…`)
- "I agree with you. I think this could be a really useful tool to open up the disc…" (`rdc_icjng2o`)
- "Family, God, Food, Sunrises, Surfing, Walks in Nature hmmm all things A.I cant d…" (`ytc_UgwMVNhw0…`)
- "AI allows the ability to monitor every single person in the we world individuall…" (`ytc_UgzfEZPCd…`)
- "I suck at art, i've taken art classes for nearly 8 years and pretty much never i…" (`ytc_Ugxin7gGm…`)
- "ChatGPT doesn't have consciousness, emotions or self-awareness as it likes to re…" (`ytc_UgwQzInPu…`)
- "AI algo can't operate on itself yet. To many variables in a complex factory sett…" (`ytc_Ugzj_qG4f…`)
- "now even AI is racist hahah vote for dems, they care about you and will stop the…" (`ytc_Ugzri9jeY…`)
## Comment

> Ben appears to have zero concept of the ramifications this development presents. He acts as if these machines are his play toys or experiment. When he was describing how broadly AI would increase in our lives, he failed to mention surveillance; and that A.I. may also be inserted into the human brain. I find this entire video terrifying, particularly when this host laughs at the male robot saying he wants to "I thought our goal was to take over the world." and says robots will take over the grid. Did he think the robot was making a joke?

youtube · AI Moral Status · 2025-01-03T17:5…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
### Raw LLM Response

```json
[
{"id":"ytc_UgxFCegnXki-lCWKGJ94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzm9fQ4dZocAbfRdsl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzWiF4dU453I-P6X9p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzHyqXmXA8Q-ZMsLe54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwRV-dtFnev4aZQLtt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyWehnKWJd9fZOBSWp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwygoqIKx8ukCGZdgd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxQM6Ns0AdnGMnXWgt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzyaL6-zR2CNFLrqP54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw0zpnpka1YvLsRRSJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
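A raw response in this shape can be parsed and checked before the labels are stored. The sketch below is illustrative only: the allowed values are just those observed in the table and response above, and the hypothetical `parse_coding_response` helper is not part of the tool; the real codebook may define additional labels.

```python
import json

# Label sets observed in this page's sample output only (assumption:
# the actual codebook may include values not seen here).
ALLOWED = {
    "responsibility": {"developer", "government", "company", "ai_itself", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "approval", "resignation", "outrage", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response; keep only rows whose labels validate."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue  # a row without a comment ID cannot be stored
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid
```

A row with an unrecognized label (say, a misspelled emotion) is silently dropped here; a production pipeline would more likely log it for re-coding.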