Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "Sam Altman is the biggest bullsheeiter to have ever lived and is solely responsi…" (ytc_Ugy1vVKna…)
- "Every A.I. that has been programmed so far has literally *nothing* to do with ou…" (ytc_UghG18WWY…)
- "Governments instituted summer and winter time change to boost factory work hours…" (ytr_UgzDYd8zy…)
- "@kataliyun226 AI is not an alien it is a construct we've created that (in this ca…" (ytr_Ugx6pdXkE…)
- "Neil, you’re comparing apples to oranges. From fire to flight, tools relied on o…" (ytc_Ugy5_yt6z…)
- "I think property rights already cover robots. It is stupid to think that I shoul…" (ytc_Uggq3RZgj…)
- "So... big question... if your alignment says "no chemical weapons" - how useful …" (ytc_UgxAts-yK…)
- "Sigh. It isn't though. I'm an ML engineer and AI is already starting to automate…" (ytr_UgzqAFwsA…)
Comment

> I suppose we could have automated driving which is as bad as a human driving to which I say to such drivers: fuck you. Drive the car yourself.
> Because if YOU fuck up you can be held responsible. If tesla fucks up, I need to hold a multi billion dollar corporation responsible? We have liability insurance, the worse you drive the more you pay until you decide "I don't like paying this much, I'll try to be better driver or at least drive less".
> Autopilot can't be mediocre, it has to be excellent.

youtube · AI Harm Incident · 2022-09-03T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw8IPKUTQvdgs9wllJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzfdZMldAwNJldlQg94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzb2h3vbVR5aGZ2mwN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgylZdEHWFL2WufZ7wR4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxOfLX-xbJRG3AFZ4Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxiOLMBRwSvvP8ysxF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyIp_6-d-6urDdW6U54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxxG07_7OUjZAyyPSN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx1RaRxo6z43Wjt58J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw8ZnnajI9ZgOKcIfx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"outrage"}
]
```
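Each row in the raw response carries the same four dimensions as the Coding Result table above. A minimal sketch of how such a batch could be parsed and screened for out-of-codebook values — `validate_batch` is a hypothetical helper, and the allowed value sets are assumptions inferred only from the values visible in this sample, not the project's full codebook:

```python
import json

# Assumption: allowed values per dimension, inferred from this one sample
# batch. The real codebook may define more categories.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"liability", "regulate", "ban", "none", "unclear"},
    "emotion": {"outrage", "indifference", "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response (a JSON array of coded comments) and
    keep only rows whose every dimension holds a known codebook value."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in values for dim, values in ALLOWED.items())
    ]
```

Filtering rather than raising keeps a single malformed row from discarding the whole batch; rejected rows can then be re-queued for recoding.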