Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
1:29:17 what if we find out that we can't co-exist happily with AI? Why don't you ask the follow up question? In that case should we stop developing it and accept that is the end of us, lock the box and throw the key?
YouTube · AI Moral Status · 2026-02-28T23:1… · ♥ 1
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           ban
Emotion          fear
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[ {"id":"ytc_UgxvZZLozLNsd92BVjd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"}, {"id":"ytc_UgyXiqHN7Li-joJ0r4d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgzUAL1zb3eH8lTMnpN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgwUJcjqddZVSfdLjn14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_Ugw-1qkvwbDcJSbiVH14AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgyMLUARr4PDKkev26x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}, {"id":"ytc_UgxPxTtUpBCeERRe23B4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgwibThJF_RpVwKOojV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_UgyssS2aQ_bRpSWxYcB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}, {"id":"ytc_Ugy9D1PkVXb3rf0qWap4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"} ]