Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Ai seems to do its thing regardless, does that matter? Am I wrong? Thanks . Plea…
ytc_Ugx9zlNRw…
And yes he wants to implant direct internet access to the brain, opening the hum…
ytc_UgxtXsTu7…
Automation and AI cut costs for companies but robots do not pay taxes or buy gro…
ytc_Ugxokcvld…
I come from South Africa....my country can't keep the lights on for longer than …
ytc_Ugyz3CYbG…
He's wrong about jobs disappearing. What's really going to happen is jobs are go…
ytc_UgwGF9rmb…
Does this mean that a person who can not read or write English, can buy a driver…
ytc_UgwFPg00m…
I don't think it's ethical to just generate it and go, plus ai art is super slop…
ytc_UgzNwWNLa…
What is pain other than an electrical signal...we have been aware of this possib…
ytc_UgwH1gDAg…
Comment
What are all these analogies given by the Believer AI? None of them made any sense.
Analogy: "It's like an ant saying that, because it can't understand why humans destroy its nest to build a hospital, there must be no good reason"
That analogy actually works against the Believer AI's point. From the ant’s perspective, there genuinely is no good reason for its home being destroyed; the ant gains no benefit. In fact, we have only made them suffer in this world. The existence of a ‘higher reason’ that only benefits humans doesn’t make the destruction good for the ants. What good have we done for ants? An ant is literally the worst example here.
So if the ant concluded that god either doesn’t exist or is irrelevant to its world, that would be a reasonable inference. In the same way, a God whose actions never benefit humans, never communicate, and are indistinguishable from natural suffering is functionally equivalent to no God at all.
youtube
2026-01-06T23:1…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwAwadOcXfPVcHD-BN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwLY5w-DHBHoxQ8MdB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugygo0A4aUF8ThUalyd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyGKgakdDnbpx3D3h14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugw8ZWTb12Mn9uaOyc14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw71Vg0gGwY7FBmyx54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyvlaaR5FefIrAQVHx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwoqMNeich15Fo0cBJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzZll_4jsAmoOSD9154AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgynBKQCi4_uNBRJYpR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
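A raw batch response like the one above can be parsed and sanity-checked before the codes are stored. The following is a minimal Python sketch; the allowed values per dimension are an assumption inferred from the visible output, and the real codebook may define additional categories.

```python
import json

# Assumed codebook: allowed values per coding dimension, inferred from the
# sample responses shown above. The actual codebook may include more values.
SCHEMA = {
    "responsibility": {"none", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"unclear"},
    "emotion": {"mixed", "indifference", "resignation", "outrage", "approval"},
}

def validate_batch(raw: str) -> list:
    """Parse a raw LLM batch response and check each record against SCHEMA."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset are prefixed "ytc_".
        if not str(rec.get("id", "")).startswith("ytc_"):
            raise ValueError(f"bad comment id: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec['id']}: {dim}={rec.get(dim)!r} not in codebook"
                )
    return records
```

A record that passes validation can then be keyed by its comment ID for the look-up-by-ID view; a record with an unknown code raises immediately, which surfaces schema drift in the model output early.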