Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `rdc_mzz751n` — "AI can do the work of a dozen radiologist in seconds. Whilst it is certainly a s…"
- `ytc_UgwkHSt2s…` — "Ya kind of left out a small detail...I love that you are so positive about the b…"
- `ytc_Ugx6S20oa…` — "By all means, we (humans) shouldn't become so dependent on creating easy problem…"
- `ytc_UgzZRtkaD…` — "Once I broke the a.i filter. ..William A tried stuffing me in a suit and the cha…"
- `ytc_Ugz_6e-gV…` — "The way AI generate pictures (i will not call this art) is by starting with nois…"
- `ytc_Ugz7zL9vq…` — "9:54 only point I would quibble with / refine. Sonic sources don’t have to be pa…"
- `ytc_UgxZIb6tX…` — "I’d wager everything I own that not a single truck driver stood in true solidari…"
- `rdc_ohlicbp` — "I actually think the tech trend is cycling down on the personal side. I think it…"
Comment
Source: reddit · Topic: AI Harm Incident · Timestamp: 1573271865.0 (Unix) · ♥ 1

> I've been meaning to ask: Does anyone know if self-driving cars have an "emergency stop" switch/button that the passenger can press to compensate for AI/system oversight? I feel like cases such as this warrants one, if there isn't one already.
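The timestamp shown is a raw Unix epoch value. A quick sketch to render it as a readable UTC date, assuming the value is seconds since the epoch:

```python
from datetime import datetime, timezone

# Convert the raw epoch-seconds timestamp from the comment metadata to UTC.
posted = datetime.fromtimestamp(1573271865.0, tz=timezone.utc)
print(posted.isoformat())  # 2019-11-09T03:57:45+00:00
```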
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response

```json
[
  {"id":"rdc_f6z46qk","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_f6xab7q","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"rdc_f6xae4f","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"rdc_f6y9jng","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"rdc_f6z6x0i","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
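Each raw response is a JSON array of per-comment codes keyed by `id`, which is what makes the look-up-by-comment-ID view possible. A minimal sketch of that lookup, assuming only the field names visible in the response above (the `index_codes` helper and the two-row sample are illustrative, not part of the tool):

```python
import json

# Sample of a raw model response: a JSON array of per-comment codes.
raw_response = """[
  {"id":"rdc_f6z46qk","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_f6y9jng","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

def index_codes(response_text: str) -> dict:
    """Parse one raw LLM response and index the coded rows by comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codes = index_codes(raw_response)
print(codes["rdc_f6y9jng"]["emotion"])  # fear
```

Indexing by ID rather than scanning the array each time keeps lookups O(1), which matters once many batched responses are merged into one store.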