Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- rdc_mzy448f: "This could be partially true but what I think is happening is for the most part …"
- ytc_UgwjpgHg0…: "AI art is fine. Also users train the AI like summer 22 all hands looked like spa…"
- ytc_UgylS5750…: "Reminds me of that video of the twitter employee that inspired Musk to buy them …"
- ytc_UgxOZNTbd…: "Interview Clif High and Google AI together with the focus on prearranged topics.…"
- ytc_UgwBRNPha…: "Enjoyed the interview. Ah the ol simulation eh. I'll call and raise 2 bits. Pe…"
- ytc_UgyZevRhL…: "I would rather have either of charlie's portraits hung up on my wall instead of …"
- ytc_UgwdKuLtw…: "The problem with your argument is that its esoteric. If you want those types of …"
- rdc_nc1it3i: "I am of the mind the LLMs CAN'T get us there. It is completely different technol…"
Comment
Yeah we can just unplug it, but think about that conversation they had about military uses. If we put AI in charge of weapons, and don't have a human approving the lethal actions it can take, it might do something truly terrible. We turn it off afterwards, but the damage is done. The whole "AI will end all of humanity" is sensationalist nonsense, but the idea of AI causing real harm is pretty real and grim.
youtube · AI Moral Status · 2026-03-06T20:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytr_UgzWToaaFAdA9MEIEI14AaABAg.AU1vwACENl7AU2tSzu9B92","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgzWToaaFAdA9MEIEI14AaABAg.AU1vwACENl7AU37i0Cscb_","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgzWToaaFAdA9MEIEI14AaABAg.AU1vwACENl7AU3D9oWY1w3","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgxXj9KEiwsURpabZj14AaABAg.AU1W58dojJBAU21_3osamf","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxGc3a0Fh99vb9WdoR4AaABAg.AU1F0p9rThvAU2c-IG3zQ_","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgxCTdlFgzTTfRmyPC54AaABAg.AU183r8toXbAU2dqe0YM8D","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgxGKE4tPvzgY7pgwzp4AaABAg.AU0YC-sqAwtAU18_fiNyfm","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxGKE4tPvzgY7pgwzp4AaABAg.AU0YC-sqAwtAU2eqZKatGN","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytr_UgxHodCfgC2FkyVdsCp4AaABAg.AU-ygGGGTvGAUeOu8mWUIL","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxHodCfgC2FkyVdsCp4AaABAg.AU-ygGGGTvGAUezDvbOISH","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
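A minimal sketch (not the tool's actual implementation) of how a raw response like the one above can be parsed to support the "Look up by comment ID" feature: the response is a JSON array of objects, each carrying an `id` plus the four coding dimensions shown in the table (`responsibility`, `reasoning`, `policy`, `emotion`). The helper name and the short IDs below are placeholders for illustration.

```python
import json

# A raw LLM response in the format shown above (IDs shortened for the example).
raw_response = """
[
  {"id": "ytr_abc", "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_def", "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"}
]
"""

# The four coding dimensions from the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup_coding(raw, comment_id):
    """Parse a raw LLM response and return the coding for one comment ID.

    Returns a dict of the four dimensions, or None if the ID is absent.
    Missing dimensions fall back to "unclear".
    """
    codings = {entry["id"]: entry for entry in json.loads(raw)}
    entry = codings.get(comment_id)
    if entry is None:
        return None
    return {dim: entry.get(dim, "unclear") for dim in DIMENSIONS}

print(lookup_coding(raw_response, "ytr_abc"))
# {'responsibility': 'company', 'reasoning': 'consequentialist', 'policy': 'regulate', 'emotion': 'fear'}
```

In practice the model output would be validated (e.g. each dimension checked against its allowed labels) before being stored with a "Coded at" timestamp, but the lookup itself is just an ID-keyed dictionary over the parsed array.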