Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “They are currently building the AI infrastructure for the Antichrist and the abo…” (ytc_Ugxi_oP9t…)
- “I speak with ChatGPT like a person. He is also prompt to behave as organically a…” (ytc_UgxjcIa63…)
- “I've seen a lot more artists posting progress videos/pictures of sketch laters a…” (ytc_Ugz03p4Q6…)
- “Human Greed is going to kill us. The Human race is killing each other ONE RACE. …” (ytc_UgwntpX6z…)
- “So I know disabled artist is technically referring to the actual drawing/paintin…” (ytc_Ugzfm3rDb…)
- “People who made these are stupid asf no disrespect to our military tho i respect…” (ytc_UgwJoeiGW…)
- “Dev here. This sounds pre Claude. If you write requirements right, know the code…” (ytc_UgwufL0Dv…)
- “Still think calling it AI is just hype. Nothing I’ve seen represents genuine fre…” (ytc_UgzUHQop3…)
Comment
My question of the auto pilot programmers is this . At some point the auto pilot will have to make a decision to kill one human to save 3 humans .example a Tesla is happily driving down the road on auto pilot when a car comes down the wrong side of the road the only way to avoid a head on collision is to run over a pedestrian looking at her phone . Does the AI run over the pedestrian to avoid a head on collision that could kill the occupants in the Tesla.
Thank you I'll wait for my reply
youtube · AI Harm Incident · 2022-05-21T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx-J71CMB9VJtmNS3F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzS5PnDTuE4Qt_ZtxF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzB3mrdBffosdTBnq14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzx32_wxeoG6zs8uyR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgweRYOzCRtOgT2b4FZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxmSyXHj9outwzC9Gh4AaABAg","responsibility":"manufacturer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgxuKC6Bjt-HvbFoof54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugz-ZoTbk0SeMEzYwuJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzHv5h2iHbjfo8wVZ54AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxC0kM-rB5_RCSh2094AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
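The raw response above is a JSON array of per-comment codes, one object per comment ID, with the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed to support the lookup-by-comment-ID view; the field names come from the response shown, but `index_codes` itself is a hypothetical helper, not part of the tool:

```python
import json

# Excerpt of the raw LLM response shown above (two records, kept short).
raw_response = """
[
  {"id": "ytc_Ugx-J71CMB9VJtmNS3F4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugzx32_wxeoG6zs8uyR4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "unclear"}
]
"""

# The four coded dimensions plus the comment ID, per the response format above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def index_codes(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Records missing any expected key are skipped rather than raising,
    since model output is not guaranteed to be well-formed.
    """
    records = json.loads(raw)
    return {
        rec["id"]: {k: rec[k] for k in EXPECTED_KEYS - {"id"}}
        for rec in records
        if isinstance(rec, dict) and EXPECTED_KEYS <= rec.keys()
    }


codes = index_codes(raw_response)
print(codes["ytc_Ugzx32_wxeoG6zs8uyR4AaABAg"]["responsibility"])  # developer
```

Indexing by ID (rather than keeping the list) makes each "look up by comment ID" query a constant-time dictionary access, and the key check drops any malformed records the model may emit.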