Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
36:33 Can't believe I'm saying this, but I'm kind of on the clanker's side here. If you placed a rock near the lever in the trolley problem, you wouldn't say that, by not interacting with the lever, the rock is making the moral choice of harming the five people, right? The rock doesn't have the ability to choose, which must be a prerequisite for the choice to be a moral one.
Likewise, an LLM doesn't hold moral beliefs, it can only reason about beliefs in its training data. So while you might be able to procure a rock with the letters "I choose to pull the lever" drawn on it, that doesn't make it any more able to actually make a moral choice.
These videos are really interesting, but they kind of amount to technical ways to "draw the words on the rock" as it were.
youtube · 2025-10-06T19:5… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyKP4uR00qGCyn6LuJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz0J6GCUL_lgAM0nml4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwgxcUuA4OHpkahfvF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw_FU3JGW8agMF0gIl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzQC1bPz-9zatqpC4x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx1_jQ9gdA_nhiITGx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxdPpsmwAObwOBUEWp4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz5dmHtMnYsQ40DWGZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgzYk88DBjpJrKyokTd4AaABAg","responsibility":"company","reasoning":"unclear","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwSm23SLiASIaNkDEV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
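The raw response above is a JSON array of per-comment codings, one object per comment ID, with four categorical dimensions. A minimal sketch of how such a batch could be parsed and sanity-checked before storage — note the allowed value sets below are inferred only from the values visible in this sample, not from the full codebook:

```python
import json

# Dimension values observed in this sample response; the actual
# codebook may define more categories (these sets are an assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "industry_self", "liability"},
    "emotion": {"mixed", "indifference", "outrage", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose
    dimension values fall inside the allowed sets."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"unclear",'
       '"policy":"none","emotion":"indifference"}]')
print(len(validate_batch(raw)))  # 1
```

Rows that fail validation can then be routed back for re-coding instead of silently entering the dataset.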