Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
So people give AI a conciousness like humans, unrestricted access to human histo…
ytc_UgyN5k7k-…
We're not worried about the machines. We are worried about what the machines wil…
rdc_f8t5v5i
Wow shocker, the guy whose future investments is tied up in AI thinks AI is goin…
ytc_UgxKX8UEU…
Sam Altman doesn’t know what will happen to larger global society once AI takes…
ytc_UgwpeVPuN…
I get all the negative sides of AI generated art but at the same time it’s such …
ytc_Ugy0RLx9A…
How about us using that free time for emotional and spiritual intelligence devel…
ytc_UgwN0HGqI…
I am going to be the reason AI turns evil bruh me and chai ai having some cursed…
ytc_Ugx1vTTHU…
Says this AI generated video. It may be true, but it is an AI video sucking us i…
ytc_UgyUIFHyD…
Comment
You are absolutely correct about labeling of data being a core problem. Comma AI is a company worth checking out for their end-to-end approach. Rather than hand-holding the AI with "this is a car; this is a bike; stop if about to hit car", they removed labels entirely and simply say "learn what humans do in this situation", through the use of massive amounts of driver footage.
They also have driver monitoring that knows where you are looking, detecting if you pick up your phone or look at a passenger.
YouTube
AI Harm Incident
2022-09-12T11:0…
♥ 39
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzuDyEE__d7i3aB1N54AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxYgTPwEHd-etqCGPN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugws3zajWGwX1wYUeKd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxPT1uwJgKBKos_87V4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyRvW6Iae-LMJfkZzt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxpAuTXrnb-wKhwjwl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzwj0UjpgLVlMob-cl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxrYlmt2lhG5glTx9h4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxHjf_i9lUTPospV-R4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzSR7Gw7MnqlqgQVLt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
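The raw response above is a JSON array with one object per coded comment, which makes lookup by comment ID straightforward. A minimal sketch in Python (variable names are illustrative and not part of the tool; the two entries are copied from the response shown above):

```python
import json

# Parse the raw LLM response (a JSON array of per-comment codes) into a
# dictionary keyed by comment ID, mirroring the "Look up by comment ID"
# feature above. Two sample entries from the response are reproduced here.
raw_response = '''
[
  {"id": "ytc_UgzuDyEE__d7i3aB1N54AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgyRvW6Iae-LMJfkZzt4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
'''

codes = json.loads(raw_response)
by_id = {entry["id"]: entry for entry in codes}

# Retrieve the coding result for a single comment by its ID.
print(by_id["ytc_UgzuDyEE__d7i3aB1N54AaABAg"]["emotion"])  # approval
```

Keying on `id` assumes each comment appears at most once per response batch, which holds for the batch shown here.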