Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You think this is the end? Drones are going to be in the air by the thousands patrolling instead of human cops because it's cheaper and no one wants cop jobs anymore anyway. When the AI decides you are wanted, or that you are engaged in "criminal behavior patterns" such as spending too much time unlocking your car door, some judge working from an office will thumbprint a warrant and THEN the cops will come just to grab you. Only they won't have on-scene discretion...they'll just be an armed muscle crew in a paddy wagon with a court order to grab you, more like a cell extraction team in a prison. If you run, the drones will follow you and a warrant will issue for whatever building you enter...now SWAT will come to do a dynamic entry. Robocop isn't going to be a big robot...it's gonna be tens of thousands of flying cameras with nerds in air conditioned offices pushing buttons to send the armored goons. I bet bail or no bail will be determined by the same nerds without a hearing. "Oh, he ran...no bail. Send him to the private jail for a hearing by monitor to set his first court date...which will also be on a monitor from the jail...and his trial as well if we cant bully him into a plea bargain". Not science fiction, the near future I promise you.
YouTube · AI Harm Incident · 2023-08-14T00:3… · ♥ 5
Coding Result
Dimension        Value
Responsibility   government
Reasoning        consequentialist
Policy           liability
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugy6ba_mY-MUxtJCz8N4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwplGDy1OEujTgVn6N4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugz7DMMZwD6GEzylG9V4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwNxixz2GJChTZlaUd4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzwNDGIcWR1c528PIx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxhaGqTb79ZEl3jAwl4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwJSUpIyTSwME6TJjB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwtgpUjzU0Hd79D0oN4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwTHTw8Kz1vJlxmhqt4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxVI16I62JUmc0z4nF4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"}
]
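The raw response is a JSON array of codings keyed by comment id, and the coding result shown above is simply the entry whose id matches this comment. A minimal sketch of that lookup in Python (the helper name `coding_for` is illustrative, not part of the tool; the ids and field values are excerpted from the response above):

```python
import json

# Two entries excerpted verbatim from the raw LLM response above.
raw_response = """[
  {"id": "ytc_Ugy6ba_mY-MUxtJCz8N4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwplGDy1OEujTgVn6N4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]"""

def coding_for(raw: str, comment_id: str):
    """Parse a raw batch response and return the coding dict for one comment id."""
    for entry in json.loads(raw):
        if entry.get("id") == comment_id:
            return entry
    return None  # the model may have omitted or mangled this id

result = coding_for(raw_response, "ytc_UgwplGDy1OEujTgVn6N4AaABAg")
print(result["responsibility"], result["emotion"])  # government fear
```

Returning `None` on a missing id, rather than raising, makes it easy to flag comments the model skipped so they can be re-queued for coding.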