Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If Humanity knew it were going extinct it would still build AI. Humanity -- there is a certainty you are going extinct: Evolution has an eventual plan to kill you. So the question is do we build a superior intelligence as a gift to the universe beyond ourselves before we go extinct or not.
youtube AI Governance 2025-12-06T02:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwqXQkkWYZGaAelJZ54AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "regulate",      "emotion": "fear"},
  {"id": "ytc_UgyOyNy4gDWTwAqQW6t4AaABAg", "responsibility": "none",        "reasoning": "mixed",            "policy": "none",          "emotion": "resignation"},
  {"id": "ytc_UgyFesVlT5XDKHGdSx14AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",          "emotion": "approval"},
  {"id": "ytc_Ugy89kWQT5Yk2cUZ6_94AaABAg", "responsibility": "ai_itself",   "reasoning": "deontological",    "policy": "liability",     "emotion": "fear"},
  {"id": "ytc_Ugxm_cyesVJlMTuOmV54AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",          "emotion": "approval"},
  {"id": "ytc_UgxhuyGA2TR9dILxETp4AaABAg", "responsibility": "none",        "reasoning": "mixed",            "policy": "none",          "emotion": "resignation"},
  {"id": "ytc_UgyjTow-SWVzcguO8Et4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "regulate",      "emotion": "outrage"},
  {"id": "ytc_Ugx5nVBu8JWhj3rb0HR4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",          "emotion": "approval"},
  {"id": "ytc_UgzepNdgbaOuvL9uY_Z4AaABAg", "responsibility": "distributed", "reasoning": "mixed",            "policy": "liability",     "emotion": "fear"},
  {"id": "ytc_UgwA2ILGwZrBAPHc-D14AaABAg", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"}
]
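The raw response is a JSON array with one coding record per comment, keyed by comment id. A minimal sketch of how such a batch could be parsed and matched back to an individual comment, in Python (the variable names and the truncated sample are illustrative, not the tool's actual code):

```python
import json

# Two records excerpted from the raw LLM response above, for illustration.
raw = '''[
  {"id": "ytc_UgwqXQkkWYZGaAelJZ54AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyOyNy4gDWTwAqQW6t4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]'''

# Index the batch by comment id so each comment's coding can be looked up.
records = {rec["id"]: rec for rec in json.loads(raw)}

# The coding result shown above comes from the record whose id matches
# the comment being inspected.
coded = records["ytc_UgyOyNy4gDWTwAqQW6t4AaABAg"]
print(coded["emotion"])  # resignation
```

The lookup reproduces the Coding Result table: responsibility "none", reasoning "mixed", policy "none", emotion "resignation".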