Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- “@babybatbailey03 it wont be because I’m pretty sure you’re allowed to make ai ar…” (`ytr_Ugz97ObOC…`)
- “But unlike Teslas in self-driving mode, they didn't crash and burst into flame, …” (`ytc_UgwH9nQqU…`)
- “But let’s make AI good for us normal people I’m not sitting around making video …” (`ytc_UgyQy14kB…`)
- “How was Isaac Newton correct with the theory of gravity? / Ai: apple / How did the d…” (`ytr_UgwCGaKjE…`)
- “It's a written exam, the clinical exam is arguably more important and something …” (`ytc_Ugx73hvfw…`)
- “If you hit someone and they die, you go to prison. If self driving does, they pa…” (`ytc_UgwSTkXLP…`)
- “All of these can be avoided if humans can just stop being so lazy. You should ke…” (`ytc_Ugz-YzlNc…`)
- “@2Bluzinmore fun fact for you. Both camera & Lidar use active photon generatio…” (`ytr_Ugxx5Nvv2…`)
Comment
What I am scared of is a group dropping a 100 or more robot that shoots everything that move in an area. Maybe have crude recognize system that will not shoots at something with the right collor on it.
Put a hundred or more in a area you are going to attack. Anyone trying to defend has the risk of being shot. The the aggressors will not usually be shot. But it is worth the risk because the moving things without the right colors will be shot.
I am not the smartest brick in the pile. If I can think of this, I know other people have thought of this.😢
Source: youtube · 2026-03-11T10:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwxjFYESy6tomjwN4d4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwhSkA-8HdrOjj_7dB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugy0gQLgLOo6OkcUUVt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwaPyWNZQ6ha3J2eg94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgymZ5qYdd5I8ml6IAh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgxTT4_0jF2ORMzHTDR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwTiWKgZdR_o3XQXdJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy72D3wTYuNj6RjDVt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgynN6nUrxQ7PRQKWVd4AaABAg","responsibility":"government","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugza79EGRXrOH4F_VZ14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
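A raw response like the one above has to be parsed and validated before its codings enter the dataset. The sketch below shows one way to do that, assuming the model returns a JSON array of objects keyed by comment ID; the allowed value sets are only those observed in the responses shown here, and the real codebook may contain additional labels. The function name `parse_batch` is illustrative, not part of any existing tool.

```python
import json

# Dimension values observed in the raw responses above.
# Assumption: the actual codebook may define more labels than these.
ALLOWED = {
    "responsibility": {"government", "developer", "ai_itself", "user",
                       "company", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "industry_self", "ban", "liability",
               "none", "unclear"},
    "emotion": {"fear", "indifference", "mixed", "resignation", "outrage"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response and index codings by comment ID.

    Raises ValueError on a missing ID or a value outside ALLOWED, so
    malformed model output fails loudly instead of polluting the dataset.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing id: {row!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim}={row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

With the response above, `parse_batch(raw)["ytc_UgwaPyWNZQ6ha3J2eg94AaABAg"]["policy"]` would return `"liability"`, matching the coding-result table for that comment.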