Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Fascinating and important conversation. The only place it lost its footing was on the subject of consciousness. GH should start with reading Chalmers and keep reading. Maybe try meditating. That said, I'm not sure that whether or not the lights are on is relevant to the danger AI presents. Might even be worse if it isn't conscious.
youtube AI Governance 2025-10-11T15:0…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgzQ4RGEUJgDecRT-Q14AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugwse2BSGvGB4VC2lop4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyO0LsHcEgpOm38xtx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxSOS30Vi0qjHrfBfp4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugyu6is673FTdLIqiCx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyOcEbm3pwa4Biz-4F4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugy9SnJAHbpsYZQsuH14AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwsChnsQvUskJo6Stx4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyVGUnjHTtm9rXWwsZ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyO8Zaf5qg_3lfXjPN4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "resignation"}
]
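Since the model returns one JSON array of per-comment codes for the whole batch, inspecting the output for a specific coded comment amounts to parsing the array and indexing it by comment id. A minimal Python sketch, assuming the raw response has been captured as a string (only a subset of the entries above is inlined here for brevity; the variable and id usage are illustrative, not part of any particular pipeline):

```python
import json

# Raw batch response as emitted by the model: a single JSON array
# of per-comment code objects (truncated to two entries here).
raw_response = '''
[
  {"id": "ytc_UgzQ4RGEUJgDecRT-Q14AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyO0LsHcEgpOm38xtx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"}
]
'''

# Index the codes by comment id so any coded comment can be looked up.
codes = {row["id"]: row for row in json.loads(raw_response)}

row = codes["ytc_UgyO0LsHcEgpOm38xtx4AaABAg"]
print(row["reasoning"], row["emotion"])  # -> unclear mixed
```

The same lookup pattern applies to the full ten-entry response: parse once, index by `id`, then read off the four coded dimensions for whichever comment is under review.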