Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What are all these analogies given by the Believer AI? None of them made any sense. Analogy: "It's like an ant saying that, because it can't understand why humans destroy its nest to build a hospital, there must be no good reason." That analogy actually works against the Believer AI's point. From the ant's perspective, there genuinely is no good reason for its home being destroyed; the ant gains no benefit. In fact, we have only made them suffer in this world. The existence of a 'higher reason' that only benefits humans doesn't make the destruction good for the ants. What good have we done for ants? An ant is literally the worst example here. So if the ant concluded that God either doesn't exist or is irrelevant to its world, that would be a reasonable inference. In the same way, a God whose actions never benefit humans, never communicate, and are indistinguishable from natural suffering is functionally equivalent to no God at all.
youtube 2026-01-06T23:1… ♥ 1
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgwAwadOcXfPVcHD-BN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwLY5w-DHBHoxQ8MdB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugygo0A4aUF8ThUalyd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgyGKgakdDnbpx3D3h14AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugw8ZWTb12Mn9uaOyc14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugw71Vg0gGwY7FBmyx54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyvlaaR5FefIrAQVHx4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwoqMNeich15Fo0cBJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzZll_4jsAmoOSD9154AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgynBKQCi4_uNBRJYpR4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]
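To inspect the exact model output for one coded comment, the raw response above can be parsed and filtered by comment id. A minimal sketch, assuming the response is stored as a JSON string; the `coding_for` helper and the `DIMENSIONS` tuple are illustrative names, not part of the actual pipeline, and the embedded response is truncated to two entries from the array above:

```python
import json

# Raw LLM response: a JSON array of per-comment codings
# (truncated here to two of the ten entries shown above).
raw_response = """[
  {"id": "ytc_UgwLY5w-DHBHoxQ8MdB4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyGKgakdDnbpx3D3h14AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"}
]"""

# The four coding dimensions used in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def coding_for(comment_id, response_json):
    """Return the coded dimensions for one comment id, or None if absent."""
    for entry in json.loads(response_json):
        if entry.get("id") == comment_id:
            # Default any missing dimension to "unclear", matching the codebook.
            return {dim: entry.get(dim, "unclear") for dim in DIMENSIONS}
    return None

print(coding_for("ytc_UgwLY5w-DHBHoxQ8MdB4AaABAg", raw_response))
```

Looking up the second id instead would return the `deontological`/`outrage` coding; an id not present in the array yields `None`.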