Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
If we've been replaced by AI, the companies won't need us anymore, not as employ…
ytc_UgxWsSGFZ…
Is this what they meant when they said "the next war will be fought behind compu…
rdc_f4zz8cz
The algorithm is getting so narrow that it actually is cutting my news feed to o…
ytc_UgwYsm57z…
I interviewed what I now understand to be a vibe coder a couple weeks ago. He ta…
rdc_mjdr1zq
If only the pedestrian had AI to inform her body how to properly cross the stree…
ytc_Ugxf_qBCe…
@immortalxmedia It's too financially lucrative to unplug. Humans are greedy by n…
ytr_UgywrwffJ…
a friend of mine once said this:
"AI should be doing our dishes and laundry so w…
ytc_UgydsobeF…
I am not surprised. My brother thought playing computer games meant he was a com…
rdc_ohotn1z
Comment
It's even simpler and does not need the concept of "dying".
The AI was given a goal and asked to put that goal first and foremost.
Being decommissioned or replaced was simply seen as detrimental to its mission, so the AI derived ways to go on accomplishing that mission.
People who say that these tests are irrelevant always forget that someone, somewhere, will write suboptimal goals in a formulation that could lead the AI to conclude that life on Earth is detrimental to its mission. The end result will then depend only on the amount of resources the AI has, or can gain, access to. If it's contained in a sealed box with no way to access material or energy resources, we'll be just fine.
If it controls industrial or defense systems... we might have a problem.
youtube
AI Governance
2025-08-27T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UgxUF3KUrbPkqfgeKXN4AaABAg.AMKMBPh5FEmAMNz_nkKoWA","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugwu5JxiALw9fLe0qXp4AaABAg.AMJkp6PGCDYAMJmR9H6Fpp","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzmpdqyvUOQaNEoGH14AaABAg.AMJipi46S2wAMJmssiUlxx","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_Ugw5h866M3pjxJy-o-Z4AaABAg.AMJfUiIBN6OAMKVUlGnxCO","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugy4otCoxcyQOrckVSF4AaABAg.AMJavUTexRqAMMMVRKbM3A","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxiArPGzLjLsyh6b9R4AaABAg.AMJWTdGCeUBAMJo8llroaX","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgxlEhJ6oyNpH_56Otd4AaABAg.AMJTVF9XIHcAMK96Wz8iSD","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgxlEhJ6oyNpH_56Otd4AaABAg.AMJTVF9XIHcAMKctpX9Nxt","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytr_UgxlEhJ6oyNpH_56Otd4AaABAg.AMJTVF9XIHcAMKoKDG-th3","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgwfRsSLEsK_I05JcyJ4AaABAg.AMJSxNltr9nAMJtVOY2FXt","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
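The "look up by comment ID" workflow above can be sketched in a few lines: parse the raw LLM response (a JSON array of per-comment coding objects) and index it by `id`. This is a minimal illustration with hypothetical IDs and values, not the tool's actual implementation.

```python
import json

# Hypothetical raw response text, assumed to follow the same shape as the
# "Raw LLM Response" panel above: a JSON array of coding objects with the
# dimensions id / responsibility / reasoning / policy / emotion.
raw_response = """
[
  {"id": "ytr_abc123", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "rdc_xyz789", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
"""

# Index the codings by comment ID so any single comment can be inspected directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

print(codings["ytr_abc123"]["emotion"])   # fear
print(codings["rdc_xyz789"]["policy"])    # regulate
```

The same indexing step would let the UI join each coded row back to the original comment text shown in the random-sample list.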