Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click any entry to inspect)
So if AI replaces our congressmen, they will be more intelligent, be capable of …
ytc_UgwlnczxX…
@JimCarrey2005 I’d rather bust down AI developers doors then have their AI event…
ytr_Ugwfkk2Dq…
I don’t know how it won’t look at humanity as competition for resources, and as …
ytc_Ugwl6vQw9…
I work in the automotive manufacturing industry as one of the guys implementing …
ytc_UgyuCitL1…
AI will allow the ruler class to be less dependent on the common folk. Eventual…
ytc_UgwjBXv0C…
Instead of trying to deal with AI, use AI to facilitate students learning.
Ai, a…
ytr_UgzCN-c28…
To be fair.... Our current Government got in for a second term. They are a bunch…
rdc_da4ccp8
So what are you celebrating? Nothing has changed. AI will continue to grow, and …
ytc_UgytFEi-j…
Comment
This is only a problem due to human error in the first place; if all the cars on the road were driven by cars, then it would be simple to make sure that each car will always get enough space or time it needs to stop/turn. "Boxing in" would be prohibited.
This ignores the original human error in the very start - not securing that cargo tightly enough. That too is something that can be taken into account if the world drives self-driving cars.
youtube
AI Harm Incident
2015-12-08T22:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UghzuMMpBbsZkngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ughhc-RnxMS1LXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjY6HJikXmw-HgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghIQezVaUOb-3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgicExh_IjSyAXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjRcySEHSlNsngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugha7FPAvBu3AngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UggqSiIbUqJPI3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UghJ2QECp_kzO3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UghQd27Kawk0s3gCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
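A raw response like the one above can be turned into the per-comment lookup the inspector offers. The sketch below is a minimal, hypothetical parser: it assumes the response is a JSON array of objects, each carrying an `id` plus the four coding dimensions shown in the table (`responsibility`, `reasoning`, `policy`, `emotion`), and indexes the records by comment ID. The function name and validation rules are illustrative, not part of the tool.

```python
import json

# The four coding dimensions from the result table, plus the comment ID.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_coded_batch(raw_response: str) -> dict:
    """Parse a raw LLM batch response and index records by comment ID.

    Raises ValueError if the response is not a JSON array of objects
    each carrying an id and all four coding dimensions.
    """
    records = json.loads(raw_response)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    indexed = {}
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"record missing fields: {sorted(missing)}")
        indexed[rec["id"]] = rec
    return indexed

# Example lookup using one record from the response above.
raw = ('[{"id":"ytc_UgjY6HJikXmw-HgCoAEC","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
coded = index_coded_batch(raw)
print(coded["ytc_UgjY6HJikXmw-HgCoAEC"]["responsibility"])  # user
```

Failing loudly on malformed records keeps a single bad LLM output from silently dropping comments out of the coded dataset; a batch that does not parse can then be re-queued rather than partially ingested.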