Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
That looks amazing. You're right, though, that we won't know if it worked until…
ytc_UgwnAKlMr…
I will boycott companies who will use A.I. technology to repace human employees.…
ytc_Ugymb4hob…
If art can be obtained through technology then AI can be used to make art. Is AI…
ytc_UgzA18ijD…
Self driving cars are a version of the trolley problem. Do we want to choose fe…
ytc_UgwaemTH8…
I find the conversation on what would happen to humans if we reach AGI very west…
ytc_UgylOcMtm…
I think the girls recognized it because women ate naturally more detail oriented…
ytc_Ugwwvl9Io…
In the end, AI will never be a evil as human beings. You just can't program in …
ytc_UgwTZiz9C…
Let me be very clear I had same perspective last year 6 months back , but true A…
ytc_UgwMbw5gs…
Comment
This car is driven by a computer. Sometimes, computers don't (always) work the way they're EXPECTED to - which is why they need repairs or software updates/patches, etc.
The fact that UNLIKE any other vehicle out there (right now), for the most part, these vehicles can get you (safely) to where you need to go AND back without any engagement/involvement whatsoever. Waymo is still in it's beginning stages.
The jobs/careers of professional, Uber/Lyft, and taxi drivers might not be as "safe" as they THINK they are.
youtube · AI Harm Incident · 2025-03-22T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UgycphhweL3H1_TGluR4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw0TYHVGiURQO_m7yN4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxM52seW1-UHY_Jked4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwTv49M-hB-B73er_d4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzDYrg2ejHr9dt5M8p4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyIkLTOej87JpFWay94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxoBqiDQTTiv1QJDXd4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwJhzs7_SG-G20n02p4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxR8vG33hjebCbDm8h4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwkgS3658VdbedXmT14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"}
]
```
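A raw response like the one above can be parsed and checked before the codes are stored. The sketch below is a minimal validator; the allowed value sets are inferred only from the values observed in this sample, not from an official codebook, and the `validate_coding` helper name is hypothetical.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"approval", "indifference", "outrage", "fear", "mixed", "resignation"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and check each record against the codebook."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {value!r}")
    return records

# Hypothetical one-record example in the same shape as the raw response.
sample = ('[{"id":"ytc_x","responsibility":"none","reasoning":"consequentialist",'
          '"policy":"none","emotion":"approval"}]')
print(len(validate_coding(sample)))  # → 1
```

Validating at ingest time makes a malformed or off-schema model response fail loudly instead of silently polluting the coded dataset.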