Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I asked the AI we have now with certain parameters (only human self destruct or AI takeover) by the law of averages and time the chances of either…. Every time it says if it’s no absolute zero chance over time it’s almost certain to happen given time.. if it was 0.1 percent chance for 100 years eventually in time the probability grows.😂
youtube AI Governance 2025-10-01T16:5…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugx1JfCbDGm1zCrARFt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwYCE4nGISnj_A7LUx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwhZ7uC11M3CRFm-3p4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugwj15ll5eMBkS88EfF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_Ugy4VbNejayazxnK5zp4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugy5qN8nprR5-5nbCpR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyhjWDkCrUy9POfxR14AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyPNV4ixsXI9fgSMZN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyRMzwJP0OOrfqoFLx4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugy2YQ_EYEIHYCFEDz54AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "mixed"}
]
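The raw response is a JSON array with one object per coded comment, keyed by the comment's `id`. A minimal sketch of how such a batch response could be parsed and a single comment's coding looked up (the variable and function names here are illustrative, not part of the coding tool; the string below excerpts two records from the full response above):

```python
import json

# Raw model output: a JSON array of coded comments.
# Two records are reproduced here as an excerpt of the full response above.
raw_response = """[
  {"id": "ytc_UgyPNV4ixsXI9fgSMZN4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx1JfCbDGm1zCrARFt4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]"""

# Index the coded dimensions by comment id for quick lookup.
codings = {record["id"]: record for record in json.loads(raw_response)}

# Retrieve the coding shown in the result table above.
coding = codings["ytc_UgyPNV4ixsXI9fgSMZN4AaABAg"]
print(coding["emotion"])  # fear
print(coding["responsibility"])  # ai_itself
```

Indexing by `id` makes it straightforward to join the model's codings back to the original comment records, which is how the per-comment result table above can be rendered from the batch response.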