Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
‘It’s going to make human intelligence more or less irrelevant’. Worth noting th…
ytc_Ugy-Gujee…
I presented the freeze hypo to ~10 LLMs… All said they would not harm human, exc…
ytc_UgziKsYoY…
It’s a program that you put use to essentially make your art completely incompre…
ytr_UgxymDU9f…
I understand why a lot of artists, designers, and creatives are upset about AI a…
ytc_Ugx3riGwC…
AI thinks it is conscious, just like we do, but something is missing. The Creat…
ytc_UgwRA0xqZ…
That's what the robot actually think...before make smarter robot....we have to f…
ytc_UghQX--95…
Couple things. Is there any actual evidence that doge has removed safety systems…
ytc_UgxvV-nZK…
Err try addressing out this "problem" with the Organic intelligence first then m…
ytc_Ugzfchq1u…
Comment
One solution is to make self-driving cars affordable and ubiquitous on the road. In the end self-driving vehicles will save more lives than they would take when on auto or manual mode. Also, secure your shit before getting on the highway xD.
youtube
AI Harm Incident
2015-12-09T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgjFBYIvdRYVgHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggfFxjEN8s_5ngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugj3QKzIe1Eq-3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UghHtM6MCJz6TXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UggQNW11cKIdvngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghwhUYMzBGyVHgCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgiBxpBHRTAhPHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggPxpiSP8H8OngCoAEC","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgiVRBq_S_B0h3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugjz5578tI7sb3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
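Responses like the array above can be parsed and sanity-checked before loading into the coding pipeline. The sketch below is a minimal validator; the allowed value sets are inferred only from the records visible on this page (the actual codebook may define more categories), and the function name is illustrative.

```python
import json

# Allowed values per dimension, inferred from the records shown above;
# the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "government", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "regulate", "ban", "industry_self"},
    "emotion": {"indifference", "outrage", "approval", "resignation", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # drop records missing a comment ID
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgjFBYIvdRYVgHgCoAEC","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
print(len(validate_codes(raw)))  # 1
```

A record with an out-of-vocabulary value (e.g. a hallucinated emotion label) is silently dropped here; a production pipeline would more likely log it for manual review.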