Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What no body is talking is this For AI and robots to function all the time they need to build data centers and those data centers take tons of electricity, then you also need 6g towers that is a lot radiation and radioactive activity that will kill the planet look those cell towers they kill trees, birds, and other things around, the second thing is no human will work so will be slaves, and what happen if AI determine that we are useless and see as a solution to eliminate every human so AI will hack military systems so next thing you know bombs everywhere and that will be human extinction. Self driving cars are killing people who are a target aka Christian’s o people who wake other people against the system in many countries we hear that the car breaks down stop, that is on purpose because that person was targeted but the person didn’t know, other times the self car run away people the same situation people don’t know that they are targeted because is so sad that people are not able to see but as usual they lie in the news telling a big lie like oh the system was hacked or the computer malfunction. Is 2025 and many people already died because they were targeted.
youtube AI Governance 2025-09-05T11:2…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgylxqDoG9wdRPPmVK94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyS8yvg_Lm8dREVhAR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwf4ElBImpxocNGRfV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyT5zXWKJrAnLS40kR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwxU5aXmYlnpDDuwn14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzmy3aLUAgoLfYBVf14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzhCfv-dfDt4cS3PC54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzP8YpZ-VG36mLW_t54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxmStraMEHV6Ip4xbJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzxiRfeR4CXzycqW8F4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
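A raw response like the one above is a JSON array of per-comment codings, so inspecting the exact output for one coded comment amounts to parsing the array and matching on the comment `id`. The following is a minimal sketch of that lookup; the function name `lookup_coding` is a hypothetical helper, not part of any real pipeline, and the field names simply mirror the raw response shown above.

```python
import json

def lookup_coding(raw_response, comment_id):
    """Parse a raw batch response (JSON array of codings) and return the
    coded dimensions for a single comment id, or None if it is absent."""
    entries = json.loads(raw_response)
    for entry in entries:
        if entry.get("id") == comment_id:
            # Return only the coding dimensions, dropping the id key.
            return {k: v for k, v in entry.items() if k != "id"}
    return None

# Illustrative one-entry response matching the coding result above.
raw = ('[{"id":"ytc_UgxmStraMEHV6Ip4xbJ4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]')

print(lookup_coding(raw, "ytc_UgxmStraMEHV6Ip4xbJ4AaABAg"))
# → {'responsibility': 'company', 'reasoning': 'consequentialist', 'policy': 'regulate', 'emotion': 'outrage'}
```

Matching by `id` rather than by array position keeps the lookup robust even if the model returns entries out of order.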