Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
SO much False Equivalency! I've loved every Kurzgesagt video, except this one... There could be an artificial intelligence right now, but hiding somewhere. If you were to 'turn it off' without it's knowledge are you guilty of manslaughter, or genocide? No. If an artificial intelligence were to kill a human, would it be culpable of murder? What punishment would you give it? If an artificial intelligence were to kill my dog, what would be an appropriate compensation? This entire video is coming from the perspective that artificial intelligence HAS a right to 'rights', without even asking, "If an artificial intelligence has 'rights', what obligations does it have if it deprives the rights of other 'beings'?" I mean; we can't even decide on those 'rights' & apply them appropriately to our own societies. Are Drone attacks on foreign soil acceptable, even if they kill innocent bystanders? Now there's some REAL questions Kurzgesagt should be asking!
Source: YouTube · Video: AI Moral Status · Posted: 2017-02-23T21:1…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  distributed
Reasoning       deontological
Policy          liability
Emotion         mixed
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgiyjzCTc8g_oXgCoAEC", "responsibility": "none",        "reasoning": "mixed",            "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgiKV5roAM8drngCoAEC", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UghulkD-qy2L3HgCoAEC", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "ban",       "emotion": "outrage"},
  {"id": "ytc_Uggnize15yoAyHgCoAEC", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",      "emotion": "fear"},
  {"id": "ytc_UgjOlPQd5Ca5sHgCoAEC", "responsibility": "distributed", "reasoning": "deontological",    "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UggcGK52nAlrHngCoAEC", "responsibility": "developer",   "reasoning": "deontological",    "policy": "ban",       "emotion": "fear"},
  {"id": "ytc_UgjJbiJBPUbWdXgCoAEC", "responsibility": "none",        "reasoning": "mixed",            "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_Ugi26oYgcaYTAHgCoAEC", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgiPFrZsBn3iMXgCoAEC", "responsibility": "none",        "reasoning": "contractualist",   "policy": "none",      "emotion": "approval"},
  {"id": "ytc_Ugg9RewNiCIchXgCoAEC", "responsibility": "developer",   "reasoning": "deontological",    "policy": "regulate",  "emotion": "fear"}
]
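As a sanity check, the raw response can be parsed and compared against the Coding Result table. A minimal Python sketch is shown below; for brevity it embeds only two of the ten records from the raw response above, and it assumes (based on the matching field values) that record `ytc_UgjOlPQd5Ca5sHgCoAEC` is the one coded for the displayed comment.

```python
import json

# Excerpt of the raw LLM response shown above (2 of the 10 records).
raw = '''[
  {"id": "ytc_UgjOlPQd5Ca5sHgCoAEC", "responsibility": "distributed",
   "reasoning": "deontological", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UggcGK52nAlrHngCoAEC", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "fear"}
]'''

records = json.loads(raw)

# Index records by comment id for O(1) lookup.
by_id = {r["id"]: r for r in records}

# The record whose values match the Coding Result table
# (distributed / deontological / liability / mixed).
coded = by_id["ytc_UgjOlPQd5Ca5sHgCoAEC"]

# Verify each dimension against the table.
assert coded["responsibility"] == "distributed"
assert coded["reasoning"] == "deontological"
assert coded["policy"] == "liability"
assert coded["emotion"] == "mixed"
print("coding result matches raw response")
```

In practice the same lookup can be run over the full ten-record array to confirm that every row rendered in the dashboard traces back to exactly one record in the model's raw output.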