Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
cool. :D I wonder how they will evolve after their rudimentary cycles through consciousness reach a crux? Like look, we have not been the best example as a species so trying to make something to do all our work before we even are able to treat other humans with integrity and virtue. Can you imagine the existential crisis those automatons are going to get if they did destroy all humans?
youtube AI Harm Incident 2025-07-27T00:1…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  developer
Reasoning       virtue
Policy          unclear
Emotion         mixed
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugwx-igzAXCytWy_XIx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwavOeThzofPRsTsVp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy01Jqzh8Ihotxc2yl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw0-_pYiD7PuqIjR_Z4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyxJsNlFUfQwKi5osp4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxU-fPfnaJrAdi6kQV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxfjFOCpXS4hSEflyd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxXWte_8jibpBtL3oN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwhZzKWWrYx9Sjawtd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw1OmSLa0qwfmgZ4Up4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"}
]
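Because the model codes comments in batches, recovering the coding for one comment means parsing the JSON array and matching on the comment id. A minimal sketch (assuming the raw response is valid JSON; `code_for` is a hypothetical helper, not part of the tool) — the id used here is the one whose coding matches the result table above:

```python
import json

# Excerpt of the raw batch response shown above; each element codes one
# comment on four dimensions (responsibility, reasoning, policy, emotion).
raw = """[
  {"id":"ytc_UgwhZzKWWrYx9Sjawtd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw1OmSLa0qwfmgZ4Up4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"}
]"""

def code_for(comment_id, response_text):
    """Return the coding dict for a single comment id, or None if absent."""
    records = json.loads(response_text)
    return next((r for r in records if r["id"] == comment_id), None)

coding = code_for("ytc_UgwhZzKWWrYx9Sjawtd4AaABAg", raw)
print(coding["responsibility"], coding["reasoning"])  # developer virtue
```

In practice a lookup like this is also where to catch malformed model output: if `json.loads` raises or an id is missing, the batch should be flagged for re-coding rather than silently skipped.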