Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Im staying with you guys, I've seen this pattern enough. I gave a document to an algorithm to "find" information in it and at the end the answer was wrong. Even though I gave it the data. AI is useful for grammar and web searching with steroids but not for research
youtube 2026-03-24T12:3…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          liability
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwmrGjyPFxxGS-bvtV4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgxZ78g8jCl4Jwg3-m14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwnS6EUWIQucbW8qsJ4AaABAg", "responsibility": "company", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxQTkO-qNY62UckrcJ4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw_7-xkugbhubKrJv14AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgyNg_xohMuaQj8rgqR4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzjxNK7AuSChzCqO1d4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwpopVK6VIpEm5sU4N4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "resignation"},
  {"id": "ytc_Ugz6g_rYfKTAe-uVkpZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugx35KOr4kEJjMiPFil4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "ban", "emotion": "outrage"}
]
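Because the model codes a whole batch of comments in one JSON array, recovering the result for a single comment means matching on its `id`. A minimal sketch of that lookup, using an excerpt of the raw response above (two of the ten entries; field names and ids are taken verbatim from the response, the full array follows the same schema):

```python
import json

# Excerpt of the raw LLM response shown above (first and ninth entries).
raw = '''[
  {"id": "ytc_UgwmrGjyPFxxGS-bvtV4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugz6g_rYfKTAe-uVkpZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]'''

# Index the batch by comment id so one comment's coding can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw)}

entry = codings["ytc_Ugz6g_rYfKTAe-uVkpZ4AaABAg"]
print(entry["responsibility"], entry["policy"], entry["emotion"])
# → ai_itself liability fear  (matches the Coding Result table above)
```

This is how the displayed coding result corresponds to one element of the raw batch response: the entry whose `id` matches the comment is the one rendered in the Dimension/Value table.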