Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Right?? I’m so tired of AI being on everything. It might be different if it was actually smart and actually was useful but a lot of the time it’s just flat out incorrect in the info it gives you, even if you ask it to verify the info 100% the first time.
youtube 2025-04-14T17:4…
Coding Result
Responsibility: company
Reasoning: consequentialist
Policy: none
Emotion: outrage
Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwdqiAxE-5TuJqFfTt4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzalz1lG-5cNI0HVQ54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx1Gve-dKpJE5dvXdR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxr0-pRDQl5NnWsISR4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugyo0k9togqdJZBavNp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzXt2YY_l53ELhxCNJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxBbGa1lBXBg4RPLup4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzimFm4u8bOABq16Rl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyiUAHMMct2c-9j_uF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxDbeLVg2KoqBTaFwd4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"disapproval"}
]
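The raw response is a JSON array with one record per comment, each carrying the four coded dimensions. A minimal sketch of how such a response could be parsed and summarized (assuming the model output is valid JSON; the `raw` string below is a two-record excerpt of the response shown above, used purely for illustration):

```python
import json
from collections import Counter

# Two-record excerpt of the raw LLM response shown above.
raw = """[
  {"id":"ytc_UgwdqiAxE-5TuJqFfTt4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugyo0k9togqdJZBavNp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]"""

codes = json.loads(raw)

# Index records by comment id so a single comment's codes can be looked up.
by_id = {rec["id"]: rec for rec in codes}
print(by_id["ytc_Ugyo0k9togqdJZBavNp4AaABAg"]["reasoning"])  # consequentialist

# Tally one dimension across the batch.
emotions = Counter(rec["emotion"] for rec in codes)
print(emotions["outrage"])  # 2
```

Keying the records by `id` makes it straightforward to join the codes back onto the original comments; the `Counter` gives per-dimension frequencies for the batch.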