Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Yes , then why just u start a AI compy! Like we have control now but at some point we r going to make it so smart that it will grow a conscience and at this point we hope 🤞 it will help! Or we r F$#!+
youtube AI Governance 2024-05-11T01:1…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           unclear
Emotion          fear
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgwWWqOu0eEa0zdP1-l4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugxu64zk-AyWjIqhzC54AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugxe0Ltvw7Nn5PcvQMZ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugy0kIQr9ma-jXw7rdZ4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwo_4xL7_yPsfdHQNl4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyGWlyZSAAg7rjqcX94AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_UgxSt8sS3Yg7kvfdwJF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxjHyY3tOlbv1HCQSZ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgySu_Nt_O1t-xm6iyV4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugzu_J32bLC97dSVRIh4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
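A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal illustration, not part of the coding tool: the required field set is taken from the dimensions in the results table, and the array is truncated to the first two records for brevity.

```python
import json

# First two records of the raw LLM response shown above (truncated here
# for brevity; the keys match the coding dimensions in the results table).
raw = """[
  {"id": "ytc_UgwWWqOu0eEa0zdP1-l4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugxu64zk-AyWjIqhzC54AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"}
]"""

# One key per coding dimension, plus the comment id.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

records = json.loads(raw)
for rec in records:
    missing = REQUIRED_FIELDS - rec.keys()
    if missing:
        print(f"{rec.get('id', '?')}: missing {sorted(missing)}")
    else:
        print(f"{rec['id']}: ok")
```

Validating the field set on ingest catches truncated or malformed model output early, before a partial record is written into the coding table.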