Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- wth? using AI to make a "drawing" and actually drawing a picture are 2 different… (ytc_UgyW8p5cx…)
- the only thing is, why would he have to remind AI to say apple? for me that mak… (ytc_UgzyKGIIz…)
- Using AI for making “art” is not “saving time”, it’s outsourcing it to an algori… (ytc_UgyVUEKu5…)
- I like AI, but not when it tries "creative" stuff. I like the idea that my car w… (ytc_UgxEi1jYm…)
- I think a scenario like that is impossible, because of economic reasons and soci… (ytc_Ugy2Dm3zo…)
- My Dutch boss asked me once: "If the driving of a truck is automated and you jus… (ytc_UgwVl9zMC…)
- I do appreciate that you all touched that altmans "AI too scary and powerful" sh… (ytc_Ugy0_a3pl…)
- I appreciate the AI content recently. It’s the most important challenge we face … (ytc_UgzrdzLzW…)
Comment
One of the problems with most of these cars all the warning beeps are the same. How about you given audio difference by the car saying what’s wrong? And if you’re not allowed to use the auto drive suite of products can you use the individual products on their own like cruise control autonomous safety breaking or these features all disabled making the car not as safe After.
What is safety and what does safety mean to each individual person involved in an accident? This is a question you must start to answer safety to the road juices safety to the passengers safety to the driver of each individual on the road. Who is Tesla responsible to the safety? This is that question?
Source: youtube
Category: AI Harm Incident
Date: 2025-08-15T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxWWyBn6n6G3LSZWm94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxgSbnhGhn5jljvGz94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwOIY9-DDAfetIgUC94AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz4Wv9FgboFZm_4QAZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxmLQTnVEWeEVulOt14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxvqUU-rfxHzSPR-R54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxFtU7bSXsjMDIctMl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxAntYtc1sV3QQm8Ih4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw-6ID5ni9LkCFqgxN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxZWlEUFoWuYLVY7ix4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
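The raw response above is a JSON array of per-comment codings. A minimal sketch of parsing such a response and indexing it by comment ID, assuming the label sets visible on this page are the full coding schema (the real pipeline may permit more labels, and the `parse_codings` helper is hypothetical, not part of the app):

```python
import json

# Allowed values per dimension, inferred from the responses shown on this
# page (assumption: the production schema may define additional labels).
ALLOWED = {
    "responsibility": {"company", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"mixed", "outrage", "fear", "resignation", "indifference"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID,
    rejecting any record whose dimension value is outside the schema."""
    indexed = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        indexed[cid] = {dim: rec[dim] for dim in ALLOWED}
    return indexed

# Illustrative record with a made-up ID (ytc_x is not a real comment ID):
raw = ('[{"id":"ytc_x","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}]')
codings = parse_codings(raw)
print(codings["ytc_x"]["policy"])  # regulate
```

Validating against the allowed label sets at parse time catches the most common LLM coding failure, an out-of-vocabulary label, before it reaches the lookup view.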