Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's real interesting that they used ChatGPT and claim they had no idea it could produce incorrect information, because I've used it since before this court case and at the bottom of the page where you type in your request for ChatGPT to execute, it has a disclaimer, now granted it's changed wording here and there, but it used to say that ChatGPT can produce factually incorrect information about people, places or events.
youtube AI Responsibility 2024-05-30T23:5…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[{"id":"ytc_UgyCQjdFnecjaXyujUx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyRNkgGH9T5nhTB7RJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgzTxeNs0qdbQJER6Dd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"amusement"},
 {"id":"ytc_UgwM4I2tVKeWYgVuzWd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
 {"id":"ytc_UgyksZ1AaIEFnKncYHR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgwxCANAd_cL6ocR-Yd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzyUm5OHp3N3lE_uh14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgydEJ0CS3hXZAjalbV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzWd2ZwWN8pY64aGSt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"mixed"},
 {"id":"ytc_UgyboyUuuUl1_a8Zg1V4AaABAg","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"mixed"}]
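For inspection, the raw response above can be parsed and indexed by comment id. The sketch below is a minimal, hypothetical helper (the function name `parse_codings` is not from the source); it assumes only what the response itself shows: a JSON array of records with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields. Two records from the response are included for brevity.

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = (
    '[{"id":"ytc_UgyCQjdFnecjaXyujUx4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"},'
    '{"id":"ytc_UgzTxeNs0qdbQJER6Dd4AaABAg","responsibility":"user",'
    '"reasoning":"deontological","policy":"none","emotion":"amusement"}]'
)

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM coding response and index the records by comment id."""
    records = json.loads(raw)
    return {r["id"]: r for r in records}

codings = parse_codings(raw_response)
print(codings["ytc_UgzTxeNs0qdbQJER6Dd4AaABAg"]["emotion"])  # prints: amusement
```

Indexing by id makes it easy to look up the exact model output for any one coded comment, e.g. to check it against the rendered coding-result table.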