Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI doesn’t “learn” and it isn’t “trained” on data. It is the data. It is the categorization of data that is essentially a “smart” search engine. The underlying programming is to go through data and tell you what the data says. It has zero understanding of what its results are and is wholly reliant on the user to determine whether the results were accurate or not. AI isn’t a person. It doesn’t muse. It doesn’t create to create. It has no value in what it’s generated. It having the data of atomic energy generation doesn’t make it try to figure out how to crack different things. Here’s the thing. If I crank out a song that rips off another song. I don’t mean the whole chord progression thing. I mean I take Fire by Jimi Hendrix and pop out a song that is essentially Fire with a little lyrical change, I’m gonna get sued into oblivion and rightly so. Now if I go, hey Jimi Hendrix estate, can I pay you for this rip off and they say yes, cool deal. AI is writing Jimi Hendrixesque music based off Jimi Hendrix’s music and being pushed out for profit with Jimi Hendrix’ estate being told to piss off when they point out that this is crap.
youtube AI Responsibility 2026-04-11T22:1…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  ai_itself
Reasoning       mixed
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugw679e2QgZFrF-dWYd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxfndZJWYaFOu1rs2N4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgxfuZBgppz2VIiTc4N4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyD5tP27ZkcIRq1v7l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwUGxckbTE7FQlhBr54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxU6H6Z9-DofUGXpkR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwh13APZyjDXASgMf54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyZD8m1wTIqd_qIsF54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyjyJR7HPuHuIodk5x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzwN2Jy13y1qDwIkJ94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
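The raw LLM response above is a JSON array with one coding object per comment. A minimal sketch of how such an array could be parsed and a single comment's coding looked up by id — `code_for` is a hypothetical helper name, not part of the tool shown; the field names match the JSON above:

```python
import json

# One entry from the raw response above, abbreviated to a single comment
# for illustration; a real response would contain the full batch.
raw = '''[
  {"id": "ytc_Ugwh13APZyjDXASgMf54AaABAg",
   "responsibility": "ai_itself", "reasoning": "mixed",
   "policy": "none", "emotion": "indifference"}
]'''

def code_for(raw_json: str, comment_id: str) -> dict:
    """Return the coding dict for one comment id (raises KeyError if absent)."""
    by_id = {row["id"]: row for row in json.loads(raw_json)}
    return by_id[comment_id]

coding = code_for(raw, "ytc_Ugwh13APZyjDXASgMf54AaABAg")
print(coding["responsibility"])  # prints: ai_itself
```

Indexing by `id` first makes repeated lookups O(1) and surfaces malformed or duplicate ids early, which matters when reconciling the per-comment "Coding Result" table against the batch response.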