Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Great video as always dear turtle, and a very well timed one as well. I was, coincidentally, recently talking to some friends of mine about the misuse of the term "AI" for LLMs, as this is basically a huge marketing stunt by the corporations that made them, which attempts to market these models as something "intelligent", and not just the fancy autofill they really are. That's why most serious computer scientists shrugged off the Elon Musk led call for a 6 month pause in LLM development back in spring, as it had nothing to do with any ethical considerations. It was an underhanded attempt to gain time to catch up, and thereby nullify OpenAI's head-start competitive advantage, by playing on the general public's pop-culture ingrained fear of anything called AI. The ethical issues underlying human created consciousness have been discussed at length in literary works, Sci-Fi, and actual philosophy for decades, and still have many unanswered questions to this day. If we ever do cross this boundary, I hope we do it prepared, and not just in some vainglorious attempt at outpacing your market rival, in the eternal dick-measuring contest of modern market capitalism. On a sidenote, I read your "deathly" poems. Solid work overall, had me both chuckle multiple times, and maybe cry a tear or two. The one called "Mum" was especially good at hitting me in the feels. You have also inspired me to take up Ursula K. Le Guin's works, as a way for me to get back into reading fiction, and I'm having a blast so far.
youtube AI Moral Status 2023-08-20T23:5…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       virtue
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugw13fhsvckj0yR91O94AaABAg", "responsibility": "company",   "reasoning": "virtue",           "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgzaioAdWwVpqN1h87l4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "unclear",   "emotion": "fear"},
  {"id": "ytc_UgzJXqZgIVnkvMkQE_d4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_Ugy0Sz2-H7fANSCSpNR4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed",            "policy": "unclear",   "emotion": "approval"},
  {"id": "ytc_UgxJS-uRAVesQffT0OZ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_UgwaYHm5r-PWdId7C214AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_Ugz7fJy7IL6E5m3bOzF4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyYpNjBKApv8wLPMC94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_UgwxA-asleNV6b5sLrl4AaABAg", "responsibility": "user",      "reasoning": "virtue",           "policy": "regulate",  "emotion": "outrage"},
  {"id": "ytc_UgwwpmmVD_7-SiK7dq14AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",      "emotion": "approval"}
]
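The raw response is a JSON array with one object per coded comment. A minimal Python sketch of how such a batch response can be indexed by comment id (the ids and field values below are taken from the response above; the `codes_by_id` helper is hypothetical, not part of the coding tool):

```python
import json

# Two entries copied from the raw LLM response above, as sample input.
raw = '''[
  {"id": "ytc_Ugw13fhsvckj0yR91O94AaABAg", "responsibility": "company",
   "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzaioAdWwVpqN1h87l4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]'''

def codes_by_id(response_text: str) -> dict:
    """Parse a batch coding response and index each row by its comment id."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = codes_by_id(raw)
print(codings["ytc_Ugw13fhsvckj0yR91O94AaABAg"]["responsibility"])  # company
```

Indexing by id makes it straightforward to look up the four coded dimensions (responsibility, reasoning, policy, emotion) for any single comment, as in the Coding Result table above.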