Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Who says they will be "paying" AI anything? Remember that time that two AI from …" (ytc_Ugzv3kOdx…)
- "tesla autopilot: creates dangerous situation by not having any anticipation that…" (ytc_UgwKGRsuA…)
- "what about we DO the ai stuff but ALSO eradicate the bourgeoisie ? that doesn'…" (ytc_Ugy_548ta…)
- "My character a.i is on a PC nobody knows about that's how bad it is😳👍…" (ytc_UgyGJ5fsq…)
- "Why the fuck are we not talking about the fact that a fucking AI ruined this guy…" (ytc_Ugw35dkn0…)
- "don’t be polite to ai. it’s not your friend. better yet don’t support it at all.…" (ytc_UgyGI5ap3…)
- "@jackbrown5184 If any companies had anything greater then chatbots; they would ha…" (ytr_UgyiYEF2O…)
- "They’re probably bored shitless hearing about AI. The last thing they want is a…" (ytr_UgxvtS2h6…)
Comment

"What if we built a robot that was programmed to feel pleasure being harmed, and feel pain being left alone. Would it be ethical NOT to harm that robot?"

Platform: youtube
Video: AI Moral Status
Posted: 2017-02-24T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgguZkakQ-aSIHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgiVYzcqK51muHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgjAwYV7bNsWrngCoAEC","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgiM2ON3rg2HuXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgjoNG0qXemj1HgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UggnlcXdFfJjMngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UggQEebRU0W_WngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgjFW1C80PQDVngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgijAzfQFlDQrngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugj3wix4Hs8P1ngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
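A raw response in this batch format can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal validator; the allowed category values and the `ytc_`/`ytr_` ID prefixes are inferred from the samples shown on this page, not from a full codebook, so treat them as assumptions.

```python
import json

# Category values observed in the samples above; the real codebook
# may define additional values (this set is an assumption).
CODEBOOK = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "unclear", "regulate", "liability", "ban"},
    "emotion": {"fear", "mixed", "approval", "outrage", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs on this page start with ytc_ (comments) or ytr_ (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment ID: {rec.get('id')!r}")
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgguZkakQ-aSIHgCoAEC","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
batch = validate_batch(raw)
print(len(batch))  # → 1
```

Validating at ingest time catches the common failure modes of batch coding (a hallucinated ID, an out-of-vocabulary label, or a truncated JSON array) before they reach the coded-comment store.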