Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
ah you see, there's the kicker. AI isn't smart! it just optimizes for whatever you tell it to, it's the trend line of an excel spreadsheet that got too ambitious. AI will become conscious because we give it one task in which consciousness is a convinent shortcut, and then promptly look down at it's mechanical hands and say _what the fuck dude_
Source: youtube · AI Moral Status · 2023-07-03T11:1…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          unclear
Emotion         mixed
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugy-P0EvcZYPiOSr3GJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxxIjCPOkl0-oT_Gqp4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwMCNsndG_EzQm0ZzV4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxQG6onATysv-_xZoF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyBG1vfGeiDFTIpUHh4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzivrBdKCRNSSvpEOd4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugw7BInOiKjcUk3m2e94AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugyfpts3f89Y1Cqka7N4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxVWiHHpnnppk1Q4iJ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugxf93kWXqMK9mfNLd54AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "resignation"}
]
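The per-comment coding result shown above is presumably recovered from this raw batch response by parsing the JSON array and looking the comment up by its `id`. A minimal sketch of that lookup (records abbreviated to two of the ten from the response above; the variable names are illustrative, not the tool's actual code):

```python
import json

# Raw LLM response: a JSON array of per-comment codes.
# Only two of the ten records are reproduced here for brevity.
raw = '''[
  {"id": "ytc_Ugyfpts3f89Y1Cqka7N4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugxf93kWXqMK9mfNLd54AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "liability", "emotion": "resignation"}
]'''

# Index the batch by comment id for O(1) lookup.
codes = {rec["id"]: rec for rec in json.loads(raw)}

# Fetch the coding result for the comment displayed above.
rec = codes["ytc_Ugyfpts3f89Y1Cqka7N4AaABAg"]
print(rec["responsibility"], rec["reasoning"], rec["policy"], rec["emotion"])
# → developer consequentialist unclear mixed
```

The printed values match the Dimension/Value table for this comment, which is how the batch response and the single-comment view stay consistent.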