Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think a greater cause of concern would include how we as a society, how we as parents as friends as siblings have allowed each other to take AI serious, to communicate with it, as if it were a person. It’s a resource for sure, but I also recognize it’s limitations. If people were too harm themselves with a kitchen knife, should we sue the creators of the kitchen knife? Ultimately these frivolous lawsuits are going to lock AI behind some kind of wall that can only be accessed after showing a certain level of critical thinking to allow you to use the program. I am sure that young people have googled things in an effort to do similar self harm, but we don’t necessarily go after the Google search search engine. I believe that other developing countries with a little bit more common sense would probably think to themselves why would anyone listen to a computer program? At some point we have to take responsibility for ourselves as individuals for our children for our parents because ultimately, when one of us fails or makes a mistake it is those people who we ultimately impact.
YouTube AI Harm Incident 2025-11-08T01:4…
Coding Result
Dimension        Value
Responsibility   user
Reasoning        consequentialist
Policy           none
Emotion          resignation
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[ {"id":"ytc_UgwHWmKVArrbBzNSDjR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_UgwFlJ4ZAsJf5spd9il4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytc_UgzdECtLb4JgAsb4IGx4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgwnsHj2UryVRe1jTNp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}, {"id":"ytc_Ugys0TIGpgjHPPXCit14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgxB_JDqFtoY8ForzF54AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"}, {"id":"ytc_UgxMbvndbrGSxaWtl5d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}, {"id":"ytc_UgxRyB2HyjZMmt_-XXl4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgwRMMi4xtxCxLq4pIZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"}, {"id":"ytc_UgzDi5uz-uMjn3iE8fF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"} ]