Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I spoke about this on our podcast this week, but here's my theory: it has less to do with the ability of the system and more to do with the perceived safety issues by internal and external parties.

CONSPIRACY TIN FOIL HAT TIME

My assumption is that the sycophantic thing was a way bigger deal privately than it felt to the larger user base - seeing as we got two blog posts, multiple Sam tweets, and an AMA - but the *reason* it was bigger is that all the AI safety people were calling it out. Emmett Shear, the guy who was CEO for a day when Sam was fired, was one of the loudest voices online saying what a big deal it was.

I think (again, this is all conjecture, zero proof) that the EA-ers saw in this crisis a chance to pounce and get back at Sam, whom they see as recklessly shipping stuff without any safety-first mentality. I think they used this sycophantic moment to go HARD at all the people who allowed Sam to have control before, and raised their safety concerns to the highest possible levels. I'm pretty sure the Fiji thing (bringing in someone to be in charge of product) has nothing to do with this, BUT it 100% could be related as well.

Meantime, the actual product we use every day is now under intense scrutiny, and I assume we'll continue to see some degradation over time until they right the ship. Hard time to go through all this while Gemini is kicking ass, but that's how the cards fall.

AGAIN, this is all conspiracy stuff, but it keeps feeling more and more like something big was happening behind the scenes throughout all this. Don't underestimate what people who think the future of humanity is on the line will do to slow things down.
Source: reddit · AI Harm Incident · 1746997204.0 · ♥ 7
Coding Result
Dimension       Value
Responsibility  company
Reasoning       mixed
Policy          unclear
Emotion         fear
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_msa18gf", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "rdc_mrt5tdr", "responsibility": "company", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "rdc_mrt10o3", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_mrt7jqk", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_mrvqbf5", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
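A minimal sketch of how a raw response in this shape could be parsed back into per-comment codings. This assumes only what the response above shows: a JSON array of objects keyed by a comment `id`; the id used for lookup (`rdc_mrt5tdr`) is the one whose coding appears in the table above.

```python
import json

# Raw LLM response: a JSON array of per-comment codings, as shown above
# (trimmed to one record here for brevity).
raw = '''[
  {"id": "rdc_mrt5tdr", "responsibility": "company", "reasoning": "mixed",
   "policy": "unclear", "emotion": "fear"}
]'''

records = json.loads(raw)

# Index the codings by comment id so a specific comment's coding
# can be looked up directly.
by_id = {r["id"]: r for r in records}

coding = by_id["rdc_mrt5tdr"]
print(coding["responsibility"], coding["emotion"])  # company fear
```

Indexing by id rather than scanning the list keeps the lookup constant-time when a batch response covers many comments.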