Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Man writes a book about how superintelligence will kill us if it exists no matter what, but than tells us to be nice to it because maybe the thing I told you is definitely going to kill you will spare us because we were nice to an input output machine that predates it. ChatGPT only “remembers” because every message sends your entire chatlog with it. Disappointing to see the expert ignoring that because fear mongering gets more people to buy his book. I like the book and most of this chat but that moment really irked me
| Platform | Video | Posted |
|---|---|---|
| youtube | AI Moral Status | 2026-01-08T20:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxXvP06xB_rvHXU8nl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxB2lUMC10V2WCKMdh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzG6m5nNk-ZQp4yPdd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzdLgUpm0zqRww_36x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxXKB0Q9EOyb0TYAQ54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz9jWegCqJ5MLH9GXF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxjHKweqa7s6ZC0JHB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy-KZ4-7G2BKQOny894AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugy88yz9_C5B-z5vALJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx-G5YAEcxVcUZLiZt4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
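The raw response above is a plain JSON array, one object per coded comment, so it can be indexed by comment ID with a few lines of Python. This is a minimal sketch: the function name and the two embedded sample records (taken from the array above) are illustrative, not part of the tool itself.

```python
import json

# Illustrative excerpt of a raw LLM response like the one shown above.
raw_response = """
[
 {"id":"ytc_UgxXKB0Q9EOyb0TYAQ54AaABAg","responsibility":"developer",
  "reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgxB2lUMC10V2WCKMdh4AaABAg","responsibility":"government",
  "reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

# The four coding dimensions displayed in the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Map comment ID -> coded dimensions, dropping the redundant 'id' key."""
    records = json.loads(raw)
    return {r["id"]: {d: r[d] for d in DIMENSIONS} for r in records}

codings = index_codings(raw_response)
print(codings["ytc_UgxXKB0Q9EOyb0TYAQ54AaABAg"]["policy"])  # -> liability
```

Keeping the lookup keyed by the platform-prefixed comment ID (`ytc_…` for comments, `ytr_…` for replies) lets the coded dimensions be joined back onto the original comment table without ambiguity.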