Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The term "hallucination" is inappropriate for generative AI. Since AI is not conscious, it does not know anything. By the same token, it cannot make up anything either.
youtube AI Responsibility 2025-10-03T01:0…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       deontological
Policy          unclear
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwoaVqFvtXAVblR5Fh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyAkf89RDGB1kAxegt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugzqexx3mNNZKazL8dZ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxNth-YK5Lf2qq7P394AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyojNJPB3giiydQBTJ4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwnreeK-5juahsUnF14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "ban", "emotion": "disapproval"},
  {"id": "ytc_UgwLAVHSh-ByAiie4wd4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "industry_self", "emotion": "outrage"},
  {"id": "ytc_UgwkfZq9TiMMcg5Xspx4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyP-Eg4zJ5qjWaK3fx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgxOiRm8QRukAC0IulR4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "liability", "emotion": "outrage"}
]
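A raw response like the one above can be parsed and sanity-checked before the per-comment codes are stored. The sketch below is a minimal example, assuming only the field names visible in the response (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the function name `parse_coding_response` is hypothetical, not part of any pipeline shown here.

```python
import json

# Required coding dimensions, as seen in the raw response above.
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

# A one-record excerpt of the raw output, used here as sample input.
raw = '''[
  {"id": "ytc_UgwoaVqFvtXAVblR5Fh4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]'''

def parse_coding_response(text):
    """Parse a raw LLM coding response and reject malformed records.

    Returns a dict keyed by comment id so individual codes can be
    looked up when inspecting a specific comment.
    """
    records = json.loads(text)
    for rec in records:
        missing = REQUIRED - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')} missing fields: {missing}")
    return {rec["id"]: rec for rec in records}

coded = parse_coding_response(raw)
print(coded["ytc_UgwoaVqFvtXAVblR5Fh4AaABAg"]["emotion"])  # prints "mixed"
```

Raising on missing fields, rather than filling defaults, keeps silently truncated LLM output from entering the coded dataset unnoticed.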