Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "People (including ones who don't really understand the code they're asking for) …" (ytr_UgyEggkvP…)
- "@AmbitionBurnt dude I literally read an article about an Ai “artist” receiving b…" (ytr_Ugxqk2avt…)
- "AI should be stopped immediately! and all humans engaging in its development sho…" (ytc_UgyCn4T4Y…)
- "I'm so done with this already it's starts to become really entertaining haha. te…" (ytc_UgwKlm3uw…)
- "I asked OpenAi to do a painting that represents its emotions and it crashed 💀💀💀…" (ytc_UgxRGj5r1…)
- "Mad world, reality often surpasses fiction; we don't need fake AI videos. These …" (ytc_UgyUUOcnq…)
- "Elon Musk has been promising autonomous driving for over a decade... and they th…" (ytc_UgwwPos9M…)
- "For those who think ai can NEVER replace programmers. Just try chatgpt. Yes it c…" (ytc_UgzxD8oCb…)
Comment
I have had the adaptive cruise control/emergency braking in my Mazda 6 trigger ONCE on the highway where it was completely unexpected. 80mph in the express lanes, topped an overpass, and roughly a third of a mile away traffic had stopped. That Mazda hit the brakes HARD, even as my foot was repositioning to begin braking. Both the AI and the human (me) recognized that it was time to drop out of warp, but while mine was a more modulated analog response, the AI decided to go full binary "Brake Now!" with a third of a mile of empty road between me and the stopped traffic. I don't fault the system for seeing the issue ahead, and to a point, I would rather it panic-brake than run headlong into the jam. I chalked it up as an edge case, and also a reason why humans should stay in control of vehicles. My worry is that humans following such a car would not have that kind of reaction time, and rear-end it... or because it came out of nowhere, they'd assume it was a brake-check and go full road rage on the vehicle/driver.
youtube
AI Harm Incident
2023-02-17T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_UgzpL35m0Dv0A1cZFsB4AaABAg.9i5dVPvjdii9mFJ9Z1OWI6","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugycg_xd886XfDIBGON4AaABAg.9huDX6cAbhX9hueWuMLNZm","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_Ugx-htXtuA8TYKqwh5B4AaABAg.9hc-RPfNC2M9qk8yk6g1vn","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytr_UgxJxJ1wepaQNeFhIyJ4AaABAg.9hWWH1oIe149hYKAymEy4m","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwmwRH51pnPLU6lCQh4AaABAg.9gUIk1BKSAm9lWyDbPpk10","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgyWsERcY3qOsYIVrpV4AaABAg.9gMvJztQ-2i9qFlsLM6WHl","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_UgwYfEXyjtlZZ-mp9mV4AaABAg.9gHvIbPvxI69mFM4cux0nD","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugzz04ZNLyKW67c6vVN4AaABAg.9gHXv1w7AiT9gWfjgxKMhX","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugwr1sF6E0YCZk24OaF4AaABAg.9g5yu2n1VbZAJEWe-e87rp","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytr_UgymT5rwDhTO8TO-nM14AaABAg.9g0_nrL9V0f9gwL68ASfFI","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"mixed"}
]
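A batched response like the one above can be checked before the rows are loaded into the coding database. The sketch below is a minimal validator, assuming the per-dimension value sets inferred from the sample rows and the result table (responsibility, reasoning, policy, emotion); the real codebook may allow additional categories, and the `parse_coding_response` helper name is hypothetical.

```python
import json

# Allowed values per dimension, inferred from the sample response shown
# above; the actual codebook may define more categories.
ALLOWED = {
    "responsibility": {"none", "user", "company", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "ban", "industry_self", "regulate"},
    "emotion": {"approval", "outrage", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a batched LLM coding response, keeping only well-formed rows.

    A row is kept if it is a dict with an "id" field and every coding
    dimension holds one of the allowed values.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # drop rows without a comment ID
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid
```

Rows that fail validation (e.g. an out-of-vocabulary emotion label, which LLM coders occasionally emit) are silently dropped here; a production pipeline would more likely log them for re-coding.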