Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Comment
Autopilot didn’t kill anyone. The Tesla driver did . Failure to monitor is inexcusable. Charge driver. Don’t blame Tesla. Tesla clearly warns ( as does GMs super cruise) that it’s drivers responsibility to pay attention while on auto pilot. Autopilot clearly augments, not replaces drivers attention and control.
I do agree not to call it autopilot nor full self driving capability. That’s an accident waiting to happen.
youtube
AI Harm Incident
2022-09-03T15:4…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwXq3HjQzB2quHhhkt4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwvHBsrIpyOZYvPAyh4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugy67ZbxeNoCBQfq5-54AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxUfRHoPubgJTkiHkp4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxH5fJDcTsxiGO-UDd4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzz84wo-J31GrMQ03x4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugw45s8wYvl_u4RiUt14AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzvT7QXBAILjw5TX1t4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugz1Zc3DzI9F4YNWIT54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwOouGOdK2zx3_88lp4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"}
]
```
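The ID-keyed lookup described above can be sketched as follows. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the raw response; the parsing and indexing logic is a minimal illustration, not the tool's actual implementation.

```python
import json

# Raw model output: a JSON array of per-comment codings, one object per
# comment (two entries from the response above are reproduced here).
raw_response = """
[
  {"id": "ytc_UgwXq3HjQzB2quHhhkt4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugy67ZbxeNoCBQfq5-54AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]
"""

# Index the codings by comment ID so any coded comment can be
# looked up in O(1) by its ID.
codings = {c["id"]: c for c in json.loads(raw_response)}

coding = codings["ytc_Ugy67ZbxeNoCBQfq5-54AaABAg"]
print(coding["responsibility"], coding["emotion"])  # user indifference
```

The `ytc_` prefix marks top-level comments and `ytr_` marks replies, so a single ID-keyed dictionary covers both.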