Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
And even if the engineers who built the original A.I. cottoned on, how long would scepticism and doubt prevent them from confronting the issue? How long would it take to push it up a corporate C.O.C. to generate an action plan? What is the likelihood that an executive would shut down the issue motivated by self-preservation, in fear of the PR shitstorm such an issue would provoke? What is the likelihood that someone’s testimony would find daylight if these engineers decided to go public with their concerns, or would they become just another Assange/Snowden? Wouldn’t the A.I., presumably controlling the internet, quickly erase any attempt for them to publicise their concern? And what is the likelihood that they’d even muster the courage to do anything in the first place? I’ve really stretched this point to its limit, but I think you get it.
youtube AI Governance 2022-08-30T21:2…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          liability
Emotion         outrage
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_Ugz_q1np1vzN50eZr794AaABAg.A8eIqmvIonfA8eW4iNvqPF","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgyK8d5gSsekKlBXbul4AaABAg.9e0N_7BVlRD9e7B1Za3qSR","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytr_UgxJtI_i_heJ3tUJ0aN4AaABAg.9dmVKlxrg2C9doKYX_sgqa","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgxJtI_i_heJ3tUJ0aN4AaABAg.9dmVKlxrg2C9doUEezPEGM","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgxJtI_i_heJ3tUJ0aN4AaABAg.9dmVKlxrg2C9dp6ZK5p08f","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgxJtI_i_heJ3tUJ0aN4AaABAg.9dmVKlxrg2C9dpfa4HYibu","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugx3zw0RjVc_KaAM53Z4AaABAg.9deVIQvwmVh9dpi4YaKbRF","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgwOE2EDtJlCK6gZhEJ4AaABAg.9de46mxTs_e9fN10aqcJF3","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgwWnVZkm5UH3tvRJMx4AaABAg.9curzz-EZxT9cyAc-8nBWA","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgwWnVZkm5UH3tvRJMx4AaABAg.9curzz-EZxT9cyBkUxSNdT","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
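A raw response like the one above can be turned into a per-comment lookup with a few lines of Python. This is a minimal sketch, assuming the response is always a JSON array of objects carrying an "id" plus the four coding dimensions; the helper name `parse_codes` and the validation set are illustrative, not part of the coding tool itself:

```python
import json

# Emotion categories observed in the responses shown above;
# the full codebook may contain more values (assumption).
VALID_EMOTIONS = {"fear", "outrage", "indifference", "unclear"}

def parse_codes(raw: str) -> dict:
    """Map each comment id to its coded dimensions, rejecting unknown emotions."""
    codes = {}
    for entry in json.loads(raw):
        if entry["emotion"] not in VALID_EMOTIONS:
            raise ValueError(f"unexpected emotion code: {entry['emotion']!r}")
        comment_id = entry.pop("id")
        codes[comment_id] = entry
    return codes

# One entry taken verbatim from the raw response above.
raw = ('[{"id":"ytr_UgwOE2EDtJlCK6gZhEJ4AaABAg.9de46mxTs_e9fN10aqcJF3",'
       '"responsibility":"company","reasoning":"deontological",'
       '"policy":"liability","emotion":"outrage"}]')
codes = parse_codes(raw)
```

Popping the id before storing keeps each value a plain dimension→code mapping, which matches the Dimension/Value table rendered for the coded comment.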