Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This man is so right if we’ve never had to face or deal with anything smarter than us. When it does become smarter than even the smartest people on earth and who created it then how do you stop it if by then with it being smarter than even the smartest people and those who created it how would it be stopped if it’d be smarter than them? Obviously it’d figure out by then how not to be stopped. Also like he stated if we have to worry about how people try to weaponize A.I. what happens if an Individual just hates people in general for whatever reason and their whole purpose of A.i. is to eliminate the human race what do we do then?
youtube AI Governance 2025-06-17T14:2…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           ban
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgxBtW7mNRWkpbcW7lR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgwWhxlkpzIsjgpZ5q94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw_yqykxVMNjSwWmF94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwYDmWjChoz_d9xM8d4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwH7h9s-aQ-KUXe9Q14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyMUNWWnjpaBQDZSD54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz5I9LKoArUqGQmCvd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugz7jW5HsbHHnMCtDKJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwfTdq2_ghQS5nEZst4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxfk2P1Tci6fhEvmWV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
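To look up the coding for a given comment, the raw response can be parsed and indexed by comment id. This is a minimal sketch, assuming the response is valid JSON with exactly the fields shown above; the helper name `codings_by_id` is illustrative, not part of the tool.

```python
import json

# Two records copied from the raw response above (full array omitted for brevity).
raw = '''[
  {"id": "ytc_UgyMUNWWnjpaBQDZSD54AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugz5I9LKoArUqGQmCvd4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"}
]'''

def codings_by_id(raw_response: str) -> dict:
    """Parse the JSON array and index each coding record by its comment id."""
    return {rec["id"]: rec for rec in json.loads(raw_response)}

coded = codings_by_id(raw)
print(coded["ytc_UgyMUNWWnjpaBQDZSD54AaABAg"]["policy"])  # ban
```

The record for `ytc_UgyMUNWWnjpaBQDZSD54AaABAg` matches the coding-result table above (ai_itself / consequentialist / ban / fear).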