Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Me Personally, I know we cant stop ai entirely. But I think human creativity sho…
ytc_UgzHg7IlA…
The more I am watching this the more I want to take hit over this red guy, I mea…
ytc_UgwN-evor…
"Great insights, thank you for sharing! I recently uploaded a conversation showc…
ytc_UgxX9BZkm…
lol as a artist I am so thankful for ai it has opened up more creativity. People…
ytc_UgxsJinDV…
I couldn't even get through this.... Wow people are worried about morals and rac…
ytc_Ugw1FwtkX…
i'd seen the general "criticizing AI is ableist" thing before but i had no idea …
ytc_Ugxx6dpFo…
The problem is, most car crashes by humans can be prevented (except if you're a …
ytc_UgzkM0yBf…
So this chatbot company managed to program a very toxic relationship personality…
ytc_UgwMtwwMi…
Comment
The implications of the crash are worse than you think, because if the driver didn’t have his destination in the GPS, how will a self-driving car know which way to turn?
Even if the car stopped and didn’t hit them, it could easily have turned the wrong way, and when the distracted driver notices, they’ll presumably try to turn around immediately, even if they don’t realize another car is heading their direction
youtube
AI Harm Incident
2025-08-17T16:2…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyUvURAXPpt_LnbY6B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx9mnxS1OupQtXq9bF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxtiWOqFpInUS9L3PB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxV9xQvtQpFpyioxOl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugxt374a4jhPLUocwwp4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzOQxNYoBpW0ClLqgF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwocRcvg5U9DkT4FK94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgynGXyWljYnCbeX9EN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgybR10bTgx_lzaRWhV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugy7L8oGz1H-x8rtS9R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
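The raw response above is a JSON array with one coding object per comment, each carrying the four dimensions shown in the result table. A minimal sketch of how such a batch might be parsed and validated before use, assuming the allowed label sets are exactly those observed in this sample (the real codebook may define more values; `ALLOWED` and `validate_codes` are illustrative names, not part of the tool):

```python
import json

# Label sets observed in the sample response; the actual codebook
# (an assumption here) may permit additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"unclear", "industry_self", "none", "ban", "liability", "regulate"},
    "emotion": {"fear", "approval", "outrage", "resignation", "indifference"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject any row with an unknown label."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: unexpected {dim!r} value {row.get(dim)!r}"
                )
    return rows
```

Rejecting out-of-vocabulary labels at parse time keeps hallucinated categories from silently entering the coded dataset.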