Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
The more self driving cars there are the better they’ll be. Without human error …
ytc_UgwAofyVl…
"what's wrong with ai? it just speeds up the art making process" is what uncreat…
ytc_UgxiAnhcp…
I want to punch all robots.. I don't know why but it is the first thing that com…
ytc_UgzcjUo8H…
Our future isn't full of robot slaves feeding us grapes while we grow fat. Wheth…
ytr_UgjWl7Q-f…
Watching this while watching the show Humans that is about synthetic humanoids t…
ytc_UgxLerjrr…
Phones on the other hand DID kill going in to the bank! ATMs didn't 😅 the man wa…
ytr_Ugx-4M5X4…
When Russia was attacking Ukraine, America could not destroy Russia and d…
ytc_Ugy1FNu5Y…
My perspective is that rights are a natural and necessary consequence of machine…
ytc_Ugh5Sf2tv…
Comment
Have you noticed that the big Tesla crashes all happened at night or during lower visibility (seems it was foggy in this video)? Tells me their "camera to computer system" isn't or wasn't as good as the "human eye to brain system". Most people would've slowed down with flashing lights because it could be an accident scene with an obstacle in your lane. But it seems their computer doesn't have such memory or experience to refer to. Probably why they've now switched their autopilot system to "do what the humans do". AI that learns from their millions of hours of "good driver data". Hope their "good driver" definition is the same as ours.
youtube
AI Harm Incident
2025-01-20T16:2…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwIGm-P-GlgZfLfSGJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy-ah4dWg82e2XD3AF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwOO2dQ-OxJIJKb7eh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxCEtyW9TOm2turxDZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugwxie838pcP0MygAph4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw8_5Lo5l0a9Yn95Gt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy7hlyjlE4MHOHdLpF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgxuzHjeZM5ExcnbWRd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugy16S_ezLM1VYIlU194AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxlCVcMIH6dO6KwpB54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
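The raw response above is a JSON array, one object per coded comment, with four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and validated before populating the coding-result view — the allowed category values here are inferred from the sample output above and are an assumption, not the full codebook:

```python
import json

# Allowed values per dimension, inferred from the sample LLM output above.
# The project's actual codebook may define additional categories.
CODEBOOK = {
    "responsibility": {"company", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"outrage", "fear", "mixed", "resignation",
                "indifference", "approval"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments)
    into a lookup table keyed by comment ID, validating each dimension."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in CODEBOOK}
    return coded

# Usage with a hypothetical single-item response:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
coded = parse_raw_response(raw)
print(coded["ytc_example"]["policy"])  # regulate
```

Validating against an explicit codebook like this catches off-schema LLM outputs (misspelled or invented categories) at ingest time rather than letting them surface later in the dimension table.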