Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
56:08 Edit: Later on y'all reference Elon saying he didn't want to create the Terminator but realized his choice was either to be a player or be on the sidelines. To address the question of why we would continue to build AI knowing there are at least 20% odds it kills us all, I would like to point out the context of the world we live in. AI is clearly an arms race, and every country that can play the game is playing. Choosing not to build AI is akin to choosing not to develop the atomic bomb because of the possibility of destroying the world. One person taking the moral stance not to build AI because they fear the possibility of killing everyone does not stop anyone else from choosing to develop AI anyway. A 20% risk that is in your hands seems like the better option.
YouTube | AI Moral Status | 2026-01-27T03:2…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_UgxSFCO02UrFuSpjfAZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgxF4kpWe0I7ZJJbpEx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgzLZBUlP_WkIYoHIGZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugzi8TOcX6LUuErsqEN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgyIv1-i443K6xz3Rg14AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgxBJBT7wHYm_0MkzVF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_Ugy1hEJ2nIIydyC3QnV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
 {"id":"ytc_UgyXB-fI3s41Zj-aXeJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgzDITD15vIXo4o9ADR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
 {"id":"ytc_UgxjBCORjkNJr1CjiCF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"})
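Note that the raw response closes with a stray `)` where JSON requires `]`, so a strict `json.loads` would fail on the whole batch, which would explain why this comment's coded dimensions all fell back to "unclear". Below is a minimal sketch of a tolerant parser; the dimension names and allowed code values are inferred from this output alone, and the repair-and-validate logic is an illustration, not the pipeline's actual code:

```python
import json

# Allowed codes per dimension -- inferred from values seen in this output
# (assumption; the real codebook may differ).
ALLOWED = {
    "responsibility": {"developer", "government", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "unclear"},
    "emotion": {"fear", "mixed", "indifference", "unclear"},
}


def parse_llm_codes(raw: str) -> list[dict]:
    """Parse the model's JSON array, tolerating a stray ')' where ']' belongs."""
    raw = raw.strip()
    if raw.startswith("[") and raw.endswith(")"):
        raw = raw[:-1] + "]"  # repair the malformed closing delimiter
    records = json.loads(raw)
    # Coerce any out-of-codebook value to "unclear" instead of crashing.
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                rec[dim] = "unclear"
    return records
```

Coercing unknown values to "unclear" mirrors the fallback visible in the coding result table; a stricter pipeline might instead log and re-prompt the model.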