Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgzOFo2oF…`: "I would trust Waymo, because the cars have a LIDAR system and that can scan area…"
- `ytc_UgzKf5dU1…`: "Elon Musk wasn't fucking around when he said AI is far more dangerous than nucle…"
- `ytc_UgzCnZrf-…`: "as much as i love artists standing up for themselves, i feel like this trend was…"
- `ytc_UgwFJinDQ…`: "Police are using stingrays on resident homes with informants. They lie and say t…"
- `ytc_Ugz7C3zPY…`: "Then they grow up and go to work and find out theres no teamwork puzzles before …"
- `ytc_UgxufJ7tD…`: "I Think ai is here for solve problems, some day ai will see us as problem…"
- `ytc_UgwY55Qs8…`: "The thing is that Chat GPT is a Large Language Model and NOT AI !!! People keep …"
- `rdc_ohz3n7l`: "It's a little more than convenience. When I used to go jogging with wired headph…"
Comment

> I used to think that the elite would go to another planet but nope they are creating and controlling robots. I think covid was a test to see how much we really need interaction and the lengths we will go to get it. If you can imagine it it is real is always my thoughts. And the mere fact that humans will feel like they can get a robot and program for their purposes not realizing you didn't develop the robot really. FACTORY RESET

youtube · AI Moral Status · 2022-04-27T04:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwqXuEd1zPCGrgzKxp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxu4U1IbFZr8rW_2L94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzGbVXxbwzOiozTNTl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwKMfjx5Y_zszQ5WJp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwfqKGD_YuLh0QYrWB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz5E09IsDM60s5jna94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyDCjvtGDX0MhbLxGh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxWGCRl9DBaqG5pY5d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx2EFdt3XRsvu7N9sx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw4duYmFxbVALwOT7x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
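A raw response in this shape (a JSON array of per-comment codings) can be parsed and indexed to support the lookup-by-comment-ID view above. A minimal sketch, assuming only the field names visible in the JSON; the `index_by_comment_id` helper is illustrative, not part of the tool:

```python
import json

# Two codings copied verbatim from the raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_UgyDCjvtGDX0MhbLxGh4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxWGCRl9DBaqG5pY5d4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw model response and index each coding row by its comment ID."""
    codings = json.loads(response_text)
    return {row["id"]: row for row in codings}

lookup = index_by_comment_id(raw_response)
print(lookup["ytc_UgyDCjvtGDX0MhbLxGh4AaABAg"]["responsibility"])  # company
```

The indexed form makes each dimension (responsibility, reasoning, policy, emotion) retrievable by the same `ytc_…`/`rdc_…` IDs used throughout the page.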