Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up its comment ID or by browsing the random samples below.
- "I mean, people will always talk bad about a self-driving car when something bad …" (ytc_UgwBjBigi…)
- "I think AI will soon be able to transfer things from the virtual world to our wo…" (ytc_Ugy23MsKq…)
- "and here I thought it would be better to pay AI... make it taxable... so the gov…" (ytc_UgxhcMK_W…)
- "You were not talking about the law, but about ethics. Ethically, using your work…" (ytc_UgxGZS7Or…)
- "You could have easily stopped that ai crying with logic. You shouldn't have stop…" (ytc_UgyUg9zTa…)
- "Sydney sounds more like a physiological manipulation programme based on key word…" (ytc_UgzgJcV0r…)
- "for the second one, i feel like it's completely different, because AI is just ma…" (ytc_Ugw1F5M5O…)
- "Wow. As some guys are arguing now, “This is a huge overreach of AI’s civil right…" (rdc_jeeyhy7)
Comment (youtube · AI Moral Status · 2017-07-23T14:2…)

> I would have thought that unplugging a robot would be an induced sleep rather than murder, because you can always plug it back in.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugj3khvLILefu3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgjkvhPCQfER1XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggkkBXOQTM7nngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UghQgFsiOnvSDngCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgiJo7bnF0HeVXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgidMGiSpVopw3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugg7qlWvQgN3N3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgiSbIvA5BJ4O3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UghV2ZWqZTg1QHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UggUB5a8zOw5mngCoAEC","responsibility":"ai_itself","reasoning":"contractualist","policy":"liability","emotion":"approval"}
]
```
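A batch response in this shape can be turned into a lookup table keyed by comment ID. The sketch below is a minimal, hypothetical validator: the dimension names come from the coding-result table above, and the allowed label sets are inferred only from the values visible in the examples here — the full coding scheme may define additional labels.

```python
import json

# Allowed labels per dimension, inferred from the sample records above.
# NOTE: this is an assumption; the actual codebook may permit more values.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed", "unclear"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"mixed", "indifference", "approval", "outrage", "fear", "resignation"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response (a JSON array of coded records)
    into a dict keyed by comment ID, rejecting unexpected labels."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Usage with a single hypothetical record:
raw = ('[{"id":"ytc_X","responsibility":"none","reasoning":"unclear",'
       '"policy":"none","emotion":"mixed"}]')
table = parse_batch(raw)
print(table["ytc_X"]["emotion"])  # -> mixed
```

Validating against a fixed label set at parse time catches the common failure mode of an LLM inventing an off-schema label, before it silently enters the coded dataset.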