Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Look at all the inventions that were supposed to make life better and we use them to kill each other. What happens when someone intentionally develops AI for evil?
youtube AI Responsibility 2025-07-30T03:4…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       consequentialist
Policy          liability
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugyx6yUsqBSjZsjAE3V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzkmIUxzyBPrJAcfPR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxAOAm9ze-Cx1g0UEd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz1fSf5upeFsHyP8sN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwCiQOp1Qja78u2Rn94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugxlp3hcz7M5SOPERzp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwyrbtwDaBRmXcO0kx4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwLL3rigWIc3DRuSol4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgzCm1HSnhvDufc8Ulh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgzSk4woeTgol0RppUF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]