Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
SO, May i ask a question, CAN WE FUCKING STOP AI DEVELOPMENT PLSSS, I RATHER HAVE LESS CHANcE OF DYING THEN 1% MORE CONVENIENCe
youtube AI Moral Status 2025-12-25T13:1…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          ban
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgzjiHOJ1VCWL51LOoh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwes-2GKmJ5xbuPWnZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwhtm_dnvzzYYa2FkJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxL4anxe_PD2mB_3rB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwaSvdEwOwlC4aqjtF4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"resignation"},
  {"id":"ytc_Ugz7tp2DQ1hyH67PKxB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyfHEP6jjrLliPjILZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzQZzvA60VcjeLdIf14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzz9VqpsIqLc5ef_vJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxeCmnWk8k1gFlveXp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}
]
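The coded values shown for a comment are simply the entry in the raw JSON array whose id matches that comment. A minimal sketch of that lookup in Python, assuming the raw response is held as a string (the `lookup_coding` helper is illustrative and not part of the pipeline; the string below is truncated to two entries from the array above):

```python
import json

# Truncated excerpt of the raw model output shown above (illustrative).
raw_response = """
[
  {"id":"ytc_Ugwhtm_dnvzzYYa2FkJ4AaABAg","responsibility":"developer",
   "reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxL4anxe_PD2mB_3rB4AaABAg","responsibility":"government",
   "reasoning":"deontological","policy":"none","emotion":"fear"}
]
"""

def lookup_coding(raw, comment_id):
    """Parse the raw JSON array and return the record for one comment id."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

# The comment shown above was coded under this id.
coding = lookup_coding(raw_response, "ytc_Ugwhtm_dnvzzYYa2FkJ4AaABAg")
print(coding["responsibility"], coding["policy"], coding["emotion"])
```

If the id is absent (e.g. the model dropped a comment from its batch), the helper returns `None`, which is worth checking before displaying a coding result.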