Raw LLM Responses
Inspect the exact model output behind any coded comment: look a comment up by its ID, or pick one of the random samples below.
- "Anyone waiting with baited breath for AI to be a 'wonderful' part of our society…" — `ytc_UgyY6I6MO…`
- "It’s easy to say “there is something very wrong with the AI.” But that probably …" — `rdc_kp0xtnq`
- "Logitics will be the first and biggest hit initially. Postmen and delivery pers…" — `ytc_Ugynk6CPU…`
- "It really does NOT tho, we don't see anything to suggest such an extreme result,…" — `ytr_UgztvDQ9S…`
- "But thing is robot's cannot feel pain, they can not be sad and they cannot have …" — `ytr_UggwPCXEg…`
- "No you replace a them with AI they will treat the workers like worthless drones …" — `ytc_UgyghbmPL…`
- "This is amazing work, thank you for the detail and the openness. This seems li…" — `rdc_jdl0fbv`
- "Has an ai ever willfully murdered a person yet? The answer is \"no\". When a fac…" — `ytc_UgxtaMXr7…`
Comment

> I’m excited about the day the autonomous trucks kills 1000 people a month then let’s see how they respond to that one. They’ll probably blame the stupid moron that would take the right seat just to monitor the computer at minimum wage. This is the whole reason for the 18-year-olds going interstate is they need to have a pizza face kid watching the truck as it hits the old lady or the young lady with her baby carriage. Can’t wait!!

Source: youtube · Posted: 2019-08-22T20:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxpM5gfdXb8J3dKU5l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx09WPyQfQ6ANOu4ox4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwZ3qfaIQ5zR7CGV-d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzmLuLSmiGO1L4CUQ94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgztaoeAbEc6MSsP6Il4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyaesDzzN0D7-Ledx14AaABAg","responsibility":"government","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyjs9ub3LaiP30j00N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyMHIW1CGYkXgVokkJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzzA61XULuu-JuI5Xt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwT27UTMmVd0xdo21h4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
```
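Because the model returns a JSON array with one object per comment, looking up a coding by comment ID takes only a few lines of standard-library code. The sketch below is a minimal illustration, assuming the raw response is well-formed JSON with the fields shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the helper name `index_by_comment_id` is ours, not part of any tool.

```python
import json

# Two rows copied from the raw batch response shown above.
raw_response = """[
{"id":"ytc_UgxpM5gfdXb8J3dKU5l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx09WPyQfQ6ANOu4ox4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]"""

def index_by_comment_id(response_text):
    """Parse a raw batch response and key each coding by its comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codes = index_by_comment_id(raw_response)
print(codes["ytc_Ugx09WPyQfQ6ANOu4ox4AaABAg"]["emotion"])  # outrage
```

In practice a real lookup tool would also want to validate that each row carries all five dimensions before indexing, since a malformed model response would otherwise surface as a `KeyError` at query time.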