Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytc_UgyQ9LL_s…`: "Three companies I’ve worked for, including my current one, are heavily investing…"
- `ytc_UgwWVCgTe…`: "Mikookoo, who payed you to make people complacent about the danger of AI, and sa…"
- `ytc_UgxU5Z7kM…`: "I just can't see any positive outcome from this. If AI succeeds, we're fucked fr…"
- `ytc_UgwaooWkf…`: "imagine that somewhere an AI system completely overcame is biological creators a…"
- `ytr_Ugz7T-OPk…`: "It’s ridiculous, but he lead off with the only thing we need to know “nobody can…"
- `ytc_Ugx8LE9VN…`: "I always love watching a really smart person say some really dumb stuff. Makes m…"
- `ytc_UgxFv9wDz…`: "the biggest issue around AI generated art is that the AI doesnt just spontaneous…"
- `ytc_UgzKmgU5n…`: "Irony is AI is everywhere but yet didn’t touch any government officials, their w…"
Comment
Technology that allows for the an entire system of self driving cars should also be capable of creating solutions in which cars correspond and send signals between each other. With this said, it remains apparent that the simplest solution to this issue would be to have the cars act interdependently. In this, the cars surroundings do not act as barriers, but rather resources that can be used to help absorb and distribute impact. However, in the situation that a decision to injure is unavoidable, I would assert that a utilitarian decision must be made considering the age and number of passengers aboard.
youtube · AI Harm Incident · 2016-11-23T00:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgjGy_ree2B0EHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj96NpyN-f2BXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghqMvbGky59jHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugi_k_2d8FQ3c3gCoAEC","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugg_qQYiL1e7ZngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggUGDnRAEQYy3gCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UggfRtqOpBkxgHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggNnXWdPpcRW3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugjqog_GKULDRHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UghYlkS6IWtLL3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"})
```

Note that this response, reproduced exactly as the model returned it, closes the array with `)` rather than `]`, so it is not valid JSON as written.
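A minimal sketch of how a raw coding response like the one above might be parsed. This is an assumption about the pipeline, not its actual implementation: the function names are hypothetical, and only the four dimensions visible in the table (`responsibility`, `reasoning`, `policy`, `emotion`) are assumed. A response that fails to parse as JSON, as a mis-terminated array would, falls back to empty results, so every dimension can be reported as "unclear" instead of crashing the coder.

```python
import json

# Dimensions taken from the Coding Result table shown above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def parse_coding_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM coding response into {comment_id: codes}.

    The model is expected to return a JSON array of objects, each with
    an "id" plus one value per dimension. Invalid JSON (truncated or
    mis-terminated output) yields an empty mapping.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        return {}
    parsed = {}
    for rec in records:
        cid = rec.get("id")
        if cid:
            # Any dimension the model omitted is recorded as "unclear".
            parsed[cid] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return parsed


def codes_for(comment_id: str, parsed: dict[str, dict[str, str]]) -> dict[str, str]:
    """Look up one comment's codes; unknown IDs get "unclear" everywhere."""
    return parsed.get(comment_id, {dim: "unclear" for dim in DIMENSIONS})
```

Under this reading, a batch whose response ends in `)` instead of `]` would parse to nothing, and every comment in it would surface as "unclear" on all four dimensions, which is consistent with the Coding Result shown above.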