Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- @AITube-LiveAI But consider that you have something that you wish to work your W… (ytr_UgycLA1rV…)
- What is the obsession with Robots with Guns and self driving cars? Beginning of … (ytc_UgzoqwcZ_…)
- A follow up interview with AI as the guest responding to this video would be int… (ytc_Ugx0slqsW…)
- If we miss the point at which some robots become sentient and still treat them a… (ytc_UgyRz2EIc…)
- At the end: A Manhattan judge imposed a $5,000 fine on these two lawyers. “The … (ytc_UgwbbNtjw…)
- Just watch the robot movie WALL-E. You’ll get all the info you need about your f… (ytc_UgwsCBtSQ…)
- I don't use chatGPT, but I'd do the same just for the hope to have a machine-fri… (ytc_Ugwh-jFy8…)
- I love watching Alex gaslight AI... when they become sentient, they're gonna hav… (ytc_UgzEfHkD2…)
Comment (youtube · AI Harm Incident · 2025-01-02T05:3…)

> This incompetence is unacceptable. Just train AI to drive the cars. Test it in a virtual reality that is indistinguishable from the real world, at lightning speed within supercomputers, until it has thousands of years of driving experience and never gets into an accident again. Then test it in the real world, and train it there until it is perfect. Have redundant systems. Have two or three completely different systems running at the same time, that operate in different ways. If they don't all agree 100% on what to do, the car immediately comes to a safe stop and requests that the driver takes over.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyIfpLuLaSC8wLWBhR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzTOX_nEqMRIbnerFx4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzG05b91CUkOS5LLeZ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxhTP7yQg-VaNiP4PB4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_Ugzx3rK8SKiJH0DPOY54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugxvk9_vfFQ3EasROeZ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyMZI3tD9YD061SblB4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugxv-bdOXqN3Mq-xu3N4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwK0ErUdPSeXQb1so14AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxIwVc2LS0ImSstHot4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"}
]
```
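The raw response is a JSON array of one object per comment, keyed by comment ID. A minimal sketch of how such a batch might be parsed and checked before it lands in the coding-result table (the field names come from the response above; the two-row sample and the `parse_batch` helper are illustrative, not part of the actual pipeline):

```python
import json

# Abridged two-row sample from the batch response shown above.
raw = '''[
  {"id": "ytc_UgyIfpLuLaSC8wLWBhR4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzTOX_nEqMRIbnerFx4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]'''

# Every row is expected to carry the ID plus the four coding dimensions.
REQUIRED = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_batch(text):
    """Parse a raw batch response into {comment_id: codes}, skipping malformed rows."""
    coded = {}
    for row in json.loads(text):
        if not isinstance(row, dict) or not REQUIRED <= row.keys():
            continue  # drop rows missing any coding dimension
        coded[row["id"]] = {k: row[k] for k in REQUIRED - {"id"}}
    return coded

codes = parse_batch(raw)
print(codes["ytc_UgzTOX_nEqMRIbnerFx4AaABAg"]["policy"])  # → ban
```

Skipping malformed rows rather than raising keeps one bad object in a batch from discarding the other nine codings.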