Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `ytc_Ugydm5Y1N…`: "exra.. driverless car? really.. how wrong you are as to the :benefits" PUBLiC t…"
- `ytr_UgxCF5VK3…`: "I agree, honestly kinda sad. I think it'd be a funny move to trace AI "artists" …"
- `rdc_o6jx8wx`: "Chatgpt isnt in the buisness of decerning concerning behavior on potentially vio…"
- `rdc_h54qi1y`: "I've said this before a few times in a few places. As somebody who has worked w…"
- `ytc_Ugx8P4IGw…`: "AI systems like ChatGPT don't have desires, intentions, or agency. They respond …"
- `ytc_Ugwae4TI-…`: "I think ultimately art is about communication, and usually of something original…"
- `ytc_Ugwb7AkFt…`: "The simple fact AI is ALREADY going rogue and hiding itself, trying to blackmail…"
- `ytc_UgwZzKrjL…`: "What happened to "AI won't eliminate jobs, it will create jobs"!? This disaster…"
Comment

> I am a Tesla Model 3 owner and I found your video to be pretty much right on the money when it comes to decisions made by Musk/Tesla. There are decisions made that effect safety that are solely financially based. At least, it seems that way from this owner's perspective. However, the most critical part of driving is, as with any other car, the driver. A properly functioning car of any type has never, in and of itself, killed anyone. Drivers kill people, cars don't.
>
> If there ever comes a time when a vehicle is marketed and sold as being completely autonomous, then a vehicle can be held responsible for an accident. Personally, I hope complete autonomy never happens.
Platform: youtube · Topic: AI Harm Incident · Posted: 2022-09-22T14:4… · ♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
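A coding result like the one above can be modeled as a small record. The sketch below is a minimal illustration with field names taken from the table; the class name, the flattened `comment_id` field, and keeping `coded_at` as an ISO-8601 string are assumptions for illustration, not part of the actual pipeline:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CodedComment:
    # Field names follow the dimensions in the coding-result table above.
    # Timestamps are kept as ISO-8601 strings rather than parsed (assumption).
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
    coded_at: str

# Hypothetical example mirroring the table values shown above.
row = CodedComment(
    comment_id="ytc_example",
    responsibility="company",
    reasoning="virtue",
    policy="none",
    emotion="resignation",
    coded_at="2026-04-27T06:24:59.937377",
)
print(row.responsibility)  # company
```

A frozen dataclass keeps each coded record immutable once it leaves the coder, which makes accidental in-place edits during later analysis easy to catch.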
Raw LLM Response
```json
[
  {"id":"ytc_Ugy5UQC58kC6D-gGyVR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugzz04ZNLyKW67c6vVN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwmI--mFvXzNhNDZvN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz94sFiJyLXrK8PSRV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz53m7CJ0xiAb5fZ594AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxArZ4oK3hgQkgkNMx4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgzJL4sJ7zRK-k36Eb94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz8fTLqFDU8HiUNy254AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgztKIunCSxwe0FHlHZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxG-AbzHIkpC3Y9iUh4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
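A raw response like the one above can be parsed and sanity-checked before the codes are accepted. This is a sketch under one loud assumption: the allowed value sets below are inferred only from the codes visible in this output, so the real codebook may permit additional values.

```python
import json

# Value sets inferred from the codes observed in the response above;
# the actual codebook may allow more values (assumption).
ALLOWED = {
    "responsibility": {"company", "ai_itself", "user", "none", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"liability", "regulate", "ban", "none"},
    "emotion": {"fear", "indifference", "resignation", "approval", "outrage", "mixed"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed or off-codebook rows."""
    rows = json.loads(raw)
    for row in rows:
        if "id" not in row:
            raise ValueError(f"row missing id: {row!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim} value {row.get(dim)!r}")
    return rows

# Hypothetical one-row response in the same shape as the output above.
raw = ('[{"id":"ytc_x","responsibility":"company","reasoning":"virtue",'
       '"policy":"none","emotion":"resignation"}]')
rows = validate(raw)
print(rows[0]["emotion"])  # resignation
```

Rejecting off-codebook values at parse time keeps hallucinated codes out of downstream tallies instead of silently counting them as new categories.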