Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
LLMs are hallucinating because no one on the internet said no. even if 50% of the internet were people saying no or being unsure. it would still hallucinate. Hallucinations happen when an LLM is asked a question outside the scope of its training and needs to still say something. We never programmed them to say they dont know, mainly because we dont really know how they know what they know.
youtube · AI Moral Status · 2025-11-14T15:3…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_UgwQOiyO3OCzih3F6Nx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgwZ8r2PkMx5xS5VR0Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_UgwRRDIxC8EnRV-xB1Z4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"}, {"id":"ytc_Ugx43e7kWLFGYaa3hQF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}, {"id":"ytc_UgwLoZ9ZwlSd0rYQOzl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugzh1aZjpluQuI8zrZB4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgzhjdP__n9vjBYJVOB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgzuFNU9Q7nla7Usgv14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"}, {"id":"ytc_UgzGIF4ab5tdnSxd3Th4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}, {"id":"ytc_UgxoIpBwJY9WVY8T0Rt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"} ]