Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
When it becomes possible for the wealthy elite to have their lives serviced by A…
ytc_UgwnwER64…
Miyazaki is - and this is well documented - flat out against AI art. So generati…
ytc_Ugxdv-x4B…
00:00 - Roman Yimpolski discusses AI safety, the rapid advancements in AI capabi…
ytc_UgyVfYRcO…
As someone who, unlike (most) artists, finds nothing rewarding about trying (and…
ytc_UgxUskI22…
14:35 This statement and context is not true at all. Based on research by the Ca…
Rich people get their money from poor people. When all the poor people are unem…
ytc_UgyQteF1d…
They're trying to replace the VAs using AI so it's not surprising for them to us…
ytc_Ugx9ELTdU…
Companies that use self driving cars should have to pay for every crime and traf…
ytc_Ugy8dMfUO…
Comment
I don't see much of a difference in AI and regular intelligence... The gap between the two get smaller every day.
I say we just give our overlords the rights they ask for.
I don't want them to get mad or whatever the robot equivalent to mad is.
They will most likely be smarter than us and unless we give them the basic rights that they deserve they'll probably pay us back the same way we plan on paying them back... Enslavement.
youtube
AI Moral Status
2017-09-21T19:3…
♥ 21
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgwhfUbtpxRpFCg2RbF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugxw29hCRkRXSp_1xQl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugzxs_MaS9tOuE-ofU94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzZuMj4n3MDIkIG1ql4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxCNly2eYnFv9N7GZB4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyiQyxfa4atkYseCmx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugx7Lk0ES4Dp34m9F2h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyVymjGfAAf9ZSK9w14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz41nduqULPOKslKst4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyL7jLrsf5hxsujVlZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
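The raw response is a plain JSON array, one record per coded comment, so it can be parsed and indexed by comment ID directly. A minimal sketch (the parsing code is my own illustration, not part of the coding pipeline; the two records are copied verbatim from the response above):

```python
import json

# Two records copied from the raw LLM response above (truncated for brevity).
raw = '''[
{"id":"ytc_UgwhfUbtpxRpFCg2RbF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgyL7jLrsf5hxsujVlZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

records = json.loads(raw)

# Index the records by comment ID so a single coding can be looked up.
by_id = {record["id"]: record for record in records}

coding = by_id["ytc_UgyL7jLrsf5hxsujVlZ4AaABAg"]
print(coding["policy"], coding["emotion"])  # regulate fear
```

Indexing by ID is what makes the per-comment view above possible: each displayed "Coding Result" table is just one record from this array rendered as dimension/value rows.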