Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- One thing you know the AI trucker won’t do is be on the radio spewing racist tal… (ytc_Ugw6cJ6TG…)
- Damn right "train to be a plumber" its going to be a bit longer before a robot c… (ytc_UgwcjMAJL…)
- Well done on letting the debate play out, it can't have been easy. I watch/liste… (ytc_UgzRLhb6Y…)
- chatgpt is out for 32 months and people still want to become software engineers … (ytc_UgzdTUcvk…)
- I hope that super intelligence does the math and decides to help humanity instea… (ytc_Ugx1NWlJ8…)
- It’s nuanced. Those who say AI is a fad like crypto and it’ll “pass” also have t… (rdc_moz0wg3)
- To defend ourselves from AI killing us, all we need is a real life Overwatch Som… (ytc_UgwbY1zdd…)
- I have carpal tunnel, back problems, and an extremely hard time making money tha… (ytc_UgzhMOGuK…)
Comment
Eliezer, at the least, has been advocating towards provably aligned AI well before the big AI trend started. And, really, this hypothetical SAI doesn't need to be ultimately intelligent or rational or whatever, all it needs is to be able to reach its goals better than humans can, and not have the exact same ethical views and values as what humans would want it to. This is totally possible. We did it to every other animal out there, when we became better at achieving our goals than other animals are at theirs. To think that it's a sci-fi apocalyptic fantasy for anything smarter than us to ever exist is just... well, arrogant.
youtube · AI Moral Status · 2025-10-31T16:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_Ugw8t2pJuvDSdk7NpZ94AaABAg.AOwTx350hytAOxD-cCZhcw","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugw2dejtxDMfqDtsFHx4AaABAg.AOwRqxCGf4DAOwZSv2F5cP","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugw2dejtxDMfqDtsFHx4AaABAg.AOwRqxCGf4DAOw_AgBIoZB","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytr_UgyQ6cX3vzGK0IYWCip4AaABAg.AOwLp8faSPLAOwW8P-SdxB","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyQ6cX3vzGK0IYWCip4AaABAg.AOwLp8faSPLAOwZ5xvXRQX","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgzsZXVqHuryCnOFNR54AaABAg.AOwLAKvGibKAOxDWb50YJx","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzsZXVqHuryCnOFNR54AaABAg.AOwLAKvGibKAOxED2njSyV","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzsZXVqHuryCnOFNR54AaABAg.AOwLAKvGibKAOxmxMIv2GS","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugzgpt1tdS4toFzLxIZ4AaABAg.AOwKrJFF_pzAOwzcEtaNIK","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"disapproval"},
{"id":"ytr_Ugzgpt1tdS4toFzLxIZ4AaABAg.AOwKrJFF_pzAOx47Esi_NE","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"disapproval"}
]
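The raw response above is a JSON array in which each element carries a comment `id` plus the four coded dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion). A minimal sketch of the "look up by comment ID" step over such a payload, assuming only that structure (the `RAW` string and its IDs here are abridged, hypothetical stand-ins, not real data):

```python
import json

# Abridged stand-in for a raw LLM response payload; a real response
# contains one object per coded comment, as in the array above.
RAW = """[
  {"id": "ytr_abc", "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_def", "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"}
]"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM response and index the coded dimensions by comment ID.

    Missing dimensions fall back to "unclear", matching the value the
    coding scheme above uses for uncodable fields.
    """
    rows = json.loads(raw)
    return {row["id"]: {d: row.get(d, "unclear") for d in DIMENSIONS}
            for row in rows}

codings = index_by_id(RAW)
print(codings["ytr_abc"]["emotion"])  # -> fear
```

Indexing once into a dict makes each subsequent ID lookup O(1), which is what an "inspect the exact model output for any coded comment" view needs.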