Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
"I'm sorry Dave, I am afraid I can't do that.", does this not sound familiar to people? We tell stories as warning, well the story would be brief and flashing like a large billboard sign now, "Danger, Danger Will Robinson!". We are almost certain that AI will make mistakes but those mistakes could be on an order that equates to how fast it makes calculations.
YouTube · AI Governance · 2023-07-07T14:3…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwX1ZWQMxd3kzFufhB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzoabiFhqvKsGRKzLF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw_z_ipZ_Mv2Ptosv14AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxjGjtdZWmomSTgJe14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzoCUYX5bxCcv6fERt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwQADV22tLL4GCGvqJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxyjGRlvfUuYhOKR2p4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzc2i84V13NaupnMTh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz1E63PGiTdZwehTHx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy_3BXHXdpd5DasldF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"}
]
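A minimal sketch of how a raw batch response like the one above could be parsed and validated before storing the codes. The allowed-value sets are inferred only from the records shown here and may be incomplete relative to the full codebook; the function name is hypothetical.

```python
import json

# Allowed values inferred from the coded records shown above; the
# actual codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer", "user"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "none"},
    "emotion": {"indifference", "fear", "outrage", "mixed", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records.

    A record is kept when its id looks like a comment id ("ytc_" prefix)
    and every coded dimension holds a known value.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not str(rec.get("id", "")).startswith("ytc_"):
            continue  # skip records with missing or malformed ids
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgzoabiFhqvKsGRKzLF4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"},'
       '{"id":"bad","responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"fear"}]')
print(len(parse_coding_response(raw)))  # 1 (the malformed-id record is dropped)
```

Filtering at ingest keeps hallucinated ids or out-of-codebook labels from silently contaminating the coded dataset.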