Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Maybe an extinction level event? Put AI on the shelf for ,mmm, 500 years should be enough. To put another way, it's like teaching a 5 year old to drive a semi, then handing him a license and a carton of smokes on his way to his first gig.
youtube AI Moral Status 2025-07-26T17:3…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           ban
Emotion          fear
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgyLDnUZ6EASeji3-8x4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugzq0kRqPb0kSH5HNNJ4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyLmNsPgfE_92xP_xh4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwZ7XsJJJMZcVwxkCJ4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzVZLYnTJZauV-ycxV4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxeRMY7tU4miCo-VlJ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyBmeOPXFsHg05QK9J4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwU1NpUwqYges2nZN94AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwDetTJVW_Qlsal8-V4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzXKi4E6XTJsxH03Vp4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
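The raw response is a JSON array of per-comment records keyed by comment `id`. A minimal sketch of how the coded result for one comment can be recovered from such a batch response, assuming standard JSON parsing; the helper name `coding_for` and the truncated `raw` excerpt are illustrative, not part of the actual pipeline:

```python
import json

# Illustrative excerpt of a batch response (one record shown; real
# responses contain one record per coded comment).
raw = """[
  {"id": "ytc_UgyLmNsPgfE_92xP_xh4AaABAg",
   "responsibility": "developer",
   "reasoning": "consequentialist",
   "policy": "ban",
   "emotion": "fear"}
]"""

def coding_for(raw_response: str, comment_id: str) -> dict:
    """Parse a batch LLM response and return the record for one comment id."""
    records = json.loads(raw_response)
    by_id = {r["id"]: r for r in records}
    return by_id[comment_id]

result = coding_for(raw, "ytc_UgyLmNsPgfE_92xP_xh4AaABAg")
print(result["responsibility"])  # developer
```

Looking the record up by `id` rather than by position guards against the model reordering or dropping comments in its reply; a missing id would surface as a `KeyError` instead of a silently mismatched coding.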