Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Back when I was an undergraduate student in 2007, I told everyone in my Sociology class that AI/robots would take our jobs. I was laughed at. Guess they aren't laughing now. The next item on the list, after AI and AGI, is BCI. In the coming decades, you will begin to see non-invasive BCI introduced in the medical field and other areas as it gains adoption. Like Elon Musk said, you will need to merge with AI using BCI. Then, invasive forms will eventually be adopted. This would be required for you to be relevant as a human being, as you won't be able to keep up with other humans or AI without it. Voluntary augmentation becomes economically essential. The Internet and computer user interfaces we know today will become archaic as these systems go from being external hardware to fully integrated systems within the human brain. Eventually, we will function like a Gestalt consciousness or a single organism, with a governmental AI singularity serving as the controller; however, that reality is very distant. Unfortunately, most of humanity will become biological processors for AI systems, and human agency will be a thing of the past. Humanity is the frog being slowly boiled alive. Most of the public isn't aware of the biotechnologies that already exist or are currently being developed. They are all converging onto one thing, one kind of future for humanity, and it doesn't look pretty for the bottom feeders. The competitive nature of human institutions, whether governments, corporations, or militaries, makes many technological trajectories inevitable, even if they're not considered by us to be probable or desirable. And that forms the primary driving force behind this likely trajectory. Humanity won't be hunted by AI; it will be slowly subverted by it, absorbed into it.
Source: YouTube, AI Harm Incident, 2025-07-17T02:5…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        unclear
Policy           none
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwqdmPSvccejnw-5mp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxT6FkFXYEUO_sWxwR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy7APR8V4AeCWuY2dx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw1p1jT0cIEx0gRzhB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxKQgMYhdzX4uqpiC14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyAIn1gcqOC5iBL_ph4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzTc4_0eKohjTb5eu94AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw2adesEFcP-_-fCnB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzoNcIbYUkadr6z9lh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx9NWhzscmSkau98MN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"}
]
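The Coding Result table above is recovered from this raw response by parsing the JSON array and looking up the entry for the comment being inspected. A minimal sketch of that lookup, assuming the displayed comment corresponds to the id whose values match the table (here `ytc_Ugy7APR8V4AeCWuY2dx4AaABAg`; the raw string is truncated to that single entry for brevity):

```python
import json

# Raw LLM response, truncated to the entry whose values match the
# Coding Result table above (values copied verbatim from the page).
raw = '''[
  {"id": "ytc_Ugy7APR8V4AeCWuY2dx4AaABAg",
   "responsibility": "none",
   "reasoning": "unclear",
   "policy": "none",
   "emotion": "indifference"}
]'''

# Index the coded rows by comment id for direct lookup.
codes = {row["id"]: row for row in json.loads(raw)}

entry = codes["ytc_Ugy7APR8V4AeCWuY2dx4AaABAg"]
print(entry["responsibility"], entry["reasoning"],
      entry["policy"], entry["emotion"])
# none unclear none indifference
```

Indexing by `id` makes the lookup robust to the model returning entries in a different order than the comments were submitted.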