Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This is irrational. ChatGPT hallucinates all the time. How do we know that the chatbot wasn’t simply toying with the guy? Chat bots are not a source of information. They are an external brain that we use to amplify our own. They’re not that useful as a substitute for Wikipedia.
Source: youtube, AI Moral Status, 2025-09-17T03:0…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwtmltJ-6n1DE2_y3N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxvTZiVqazmsVs6Hvt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzXjooHi2QqG7T7kHt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz67p6haSHq0Pfl-3F4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugzo_pOhNZ2Ff6PhKyJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw2k49p2ezdyl5u0qF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzGGLGDaZHPHxPQ26l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyttXwN1P6EANwY-NV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzlFUV9JUCSAbzPw314AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwlHQcVLxADPa9rOIZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
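To inspect a raw response programmatically, the JSON array can be parsed and each coding checked against the label vocabulary. The allowed label sets below are an assumption inferred only from the values visible in this response; the project's actual codebook may define additional labels.

```python
import json

# Label vocabulary per coding dimension — inferred from this raw response
# (an assumption; the real codebook may allow more values).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"indifference", "outrage", "fear", "mixed", "approval"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and flag any out-of-vocabulary labels."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
    return rows

# First object from the raw response above, as a minimal check:
raw = ('[{"id":"ytc_UgwtmltJ-6n1DE2_y3N4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
rows = validate_codings(raw)
print(len(rows), rows[0]["responsibility"])  # 1 developer
```

A hard failure on an unknown label makes silent schema drift in the model's output visible immediately, rather than contaminating downstream counts.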