Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "The most striking thing about this is how much better ai art now looks in just t…" (ytc_Ugxf5_xGu…)
- "The beautiful thing about artists is that they can create under any circumstance…" (ytc_Ugzyo1lLu…)
- "prompter is such a better term than ai "artist" it also sounds like a slur so bo…" (ytc_UgwBUhU0J…)
- "@Vorobiov_Evgeny AI will cause a lot more pain. This is not just horses --> cars…" (ytr_UgyNingHk…)
- "A.i technology is just the extension and automation of what humans can already d…" (ytc_UgyMVlarJ…)
- "I like your presentation. Can you really write from right to left with flipped l…" (ytc_UgyHtMNaL…)
- "@deejay1216, Breaking = falling apart / Braking = slowing down / If something isn'…" (ytr_UgwmnS1uu…)
- "@ihavetostoparguingonline ummm no, i don't listen to these people and have never…" (ytr_Ugw3vJ1b-…)
Comment

> The solution is that we need a world of ALL self driving cars. That way, they can communicate in a split second and do the necessary braking/speed adjustments. Or, it could be a world with MAINLY self driving cars. If one of the cars on the side was self driving, it could communicate with your car to allow it to swerve out safely

youtube · AI Harm Incident · 2017-07-13T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugjnw_pI28jYpXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghwOGDepVXCWngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugj3YY9osWlB4HgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UghifWP6y7_ogXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugg4ldklSPeo8XgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgjywnlXpJLqFHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgjokRxbpwiSqHgCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghBFWCU-Fp7bngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgjOOT8Vua498ngCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgirnfINQGNpP3gCoAEC","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]
```
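The raw response above is a JSON array, one object per coded comment, keyed by comment ID with one field per coding dimension (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and looked up by comment ID — `index_codings` is a hypothetical helper, not part of the tool, and the two sample records are copied from the response above:

```python
import json

def index_codings(raw_response: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding objects)
    and index it by comment ID for fast lookup."""
    codings = json.loads(raw_response)
    return {c["id"]: c for c in codings}

# Two records taken verbatim from the raw response shown above.
raw = '''[
  {"id":"ytc_Ugjnw_pI28jYpXgCoAEC","responsibility":"none",
   "reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgirnfINQGNpP3gCoAEC","responsibility":"company",
   "reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]'''

by_id = index_codings(raw)
print(by_id["ytc_UgirnfINQGNpP3gCoAEC"]["emotion"])  # -> outrage
```

Indexing by ID mirrors the "Look up by comment ID" view: one parse of the response, then constant-time retrieval of any comment's coding.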