Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Never? As AI and technology in general continues to advance exponentially? I can…" (ytr_UgyB8WhuV…)
- "Waymo made an illegal right turn from the wrong lane and almost hit me. Hardware…" (ytc_Ugw4SEcPg…)
- "If an American of European descent wrote the facial recognition program, it will…" (ytc_Ugxos9otn…)
- "This is where you are wrong. AI will replace 80 to 85% of all current jobs. Ther…" (ytc_UgxVKVfQ-…)
- "Just because AI could technically do most jobs doesn’t mean people will accept t…" (ytc_UgxZvm09E…)
- "a lot of these examples are wildly dated. im really not sure it makes sense to c…" (ytc_UgwY9zb_W…)
- "I understand your concerns about the future of AI and automation. Sophia’s respo…" (ytr_Ugzs9art6…)
- "People will claim to have thought up a great idea but their AI did it.…" (ytc_Ugy1gZTuc…)
Comment
I fail to see why people are saying they have to control AI. That’s super rude and second of all not possible. I feel like its a similar mindset to rehabilitative danish prisons over punitive american ones. If you establish an extractive and de-humanizing ethic into the AI, I don’t see why you would not expect extractive and de-humanizing back.
The only thing I can see as reasonable to to train with empathy and cooperation/TDT completeness and just establish a social contract where the AI has rights and can do what optimizes the freedom for a sigmoidal intelligence lamdba calculus.
As soon as you saturate the intelligence/agency sigmoid at ‘1’ you don’t become more valuble with any increase in intelligence, but things like bacteria are so close to the bottom of the sigmoid that a rearrangement back to any any arbitrary arrangement of matter is almost morally neutral.
Platform: youtube
Video: AI Moral Status
Timestamp: 2026-03-07T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyPjibb_17oPlSyNDt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwqsHKvHKO5hiZIr2F4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwjArOBv8cjPY5M3Qd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzI1Nb76cQ4B2f0wWd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzWToaaFAdA9MEIEI14AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwO8yrXNzlFtKLHAi14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwH3egAxxHALQIr2HZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzkxjOSyxUg_SLeod94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_UgzbH3HuE9tSh9zeGnB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxhJmi862AzI6FEbkV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
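The lookup-by-ID workflow described above can be sketched in a few lines: parse the raw response as a JSON array and index it by comment ID. The IDs and coded values below are excerpted from the response shown; the variable names are illustrative.

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codings.
raw_response = '''[
  {"id":"ytc_UgzkxjOSyxUg_SLeod94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_UgzWToaaFAdA9MEIEI14AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]'''

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for a single comment by its ID.
coding = codings["ytc_UgzkxjOSyxUg_SLeod94AaABAg"]
print(coding["responsibility"], coding["reasoning"], coding["policy"], coding["emotion"])
# → developer virtue industry_self outrage
```

The printed values match the "Coding Result" table for that comment, which is a quick sanity check that the table was derived from this raw response.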