# Raw LLM Responses

Inspect the exact model output for any coded comment, or look up a comment by its ID.

## Random samples
- "I said many years ago AI is going to destroy us. But it's really greed, the love…" (ytc_UgzzdAWLM…)
- "You can maybe tell her why it upsets you, maybe because it steals other people's…" (ytr_Ugwsq2RiL…)
- "How about literally not using this completely asinine technology? There is liter…" (ytc_Ugzpcwtgi…)
- "I think he is spot on. Sure, AI may do a breathtakingly good job of _simulating_…" (ytc_Ugw_DOPoq…)
- "Governments can’t even be trusted to balance a budget. No way can they get in fr…" (ytc_UgzanOQPB…)
- "Even if AI art worked---that'd be MORE REASON not to use it! God, as human being…" (ytc_UgxSJ4Ynd…)
- "A company sold a bunch of AI enabled cameras to the Baltimore school system. The…" (ytc_UgzpPbY5k…)
- "The issues with the system people are talking about are not any issues with the …" (ytc_UgzDdU8V9…)
## Comment

> This is very interesting to think about. It also raises an assumption here: In this scenario, if all cars were self-driving, could they sense everything around them or not? If so, the other cars may be able to 'see' this event happening and account for it to move out of the way to make a gap for the oncoming car that would otherwise hit the object. However that seems to assume a perfect overseeing system, probably too difficult to create at this stage. Awesome to think about tho

Platform: youtube · Topic: AI Harm Incident · Posted: 2017-01-20T20:5… · ♥ 105
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response

```json
[
{"id":"ytc_UggAC0mV8oC9jngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjpcS32Uc2yJngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggfHBar3vbNengCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiwvgjYZIffAngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgjxDEIZXTjr23gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghcPoA1NFGlengCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgitDjAIO4MRV3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjY90-a_EZ8FHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugiew_Ebk3iMfngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiX854HF1O3sHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
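A response like the one above can be parsed and checked before its codes are accepted. The sketch below is a minimal example, assuming the model returns a JSON array of records with the four dimensions shown in the Coding Result table; the allowed value sets are inferred from the samples on this page and the real codebook may define more categories.

```python
import json

# Allowed values per coding dimension (assumed from the samples shown;
# the actual codebook may include additional categories).
SCHEMA = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"indifference", "approval", "outrage", "mixed"},
}

# Two records copied from the raw response above.
raw = """[
  {"id":"ytc_UggAC0mV8oC9jngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgiwvgjYZIffAngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]"""

def validate(records):
    """Keep only records whose coded values all fall inside SCHEMA."""
    return [
        rec for rec in records
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

records = validate(json.loads(raw))
by_id = {rec["id"]: rec for rec in records}  # index for lookup by comment ID

print(by_id["ytc_UgiwvgjYZIffAngCoAEC"]["emotion"])  # outrage
```

Indexing the validated records by `id` mirrors the "look up by comment ID" workflow: each coded comment resolves to exactly one record, so a dict lookup is all that is needed.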