Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
i dont actually think it matters that much how smart the damn thing gets. it matters what power they give it, and that theyve already shown theyre trigger happy with. united healthcare already used an algorythm to kill people before ai could have a half believabe conversation. an intelligence far more powerful than all of humanity combined on some old abandoned network couldnt do half of what three strings of code prompted to try and "achieve peace" could with nuclear codes.
youtube
AI Moral Status
2025-11-03T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
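Each coded comment carries the same four dimensions shown in the table. A minimal validation sketch for one coded record — note the allowed category sets here are inferred only from the sample values visible on this page, so the real codebook may define additional values:

```python
# Allowed categories per coding dimension, inferred from the sample
# output on this page -- the actual codebook may include more values.
CATEGORIES = {
    "responsibility": {"none", "developer", "company", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "indifference", "mixed", "approval", "outrage", "resignation"},
}

def validate(record: dict) -> list:
    """Return a list of problems with one coded record (empty if valid)."""
    problems = []
    for dim, allowed in CATEGORIES.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

# The coding shown in the table above passes cleanly.
print(validate({"responsibility": "company", "reasoning": "consequentialist",
                "policy": "regulate", "emotion": "outrage"}))  # []
```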
Raw LLM Response
```json
[
  {"id":"ytc_UgxY0I70Rka2xXceaP54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyPAxwLJP-9G-tb_KJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxNy79pB1c3aKBpOAJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxccUaNrjegxSJu29N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz-ltEH3nQq-ZaUE-R4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxd7Br7dyRJyWRdUst4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx69kz3yfBMpjscTrN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyVQ0oJ16qEHJrE8_x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw4EmDDbTyb4uXLpcF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzpaeXUqr1HKp5PVfx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
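The model codes comments in batches and returns one JSON object per comment, keyed by `id`. Looking up the coding for a given comment is then a matter of parsing the array and indexing by ID — a minimal sketch, with only two of the ten records above reproduced:

```python
import json

# Raw model output as shown above (truncated to two records here).
raw = '''[
 {"id":"ytc_UgyVQ0oJ16qEHJrE8_x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgzpaeXUqr1HKp5PVfx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"}
]'''

# Index the batch by comment ID for constant-time lookup.
codings = {rec["id"]: rec for rec in json.loads(raw)}

rec = codings["ytc_UgyVQ0oJ16qEHJrE8_x4AaABAg"]
print(rec["policy"], rec["emotion"])  # regulate outrage
```

Indexing by ID rather than scanning the array is what makes a "look up by comment ID" view cheap even when a batch holds many records.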