Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "A robot does not care for justice. A robot cares for what its programming says. …" (ytc_UgjywnlXp…)
- "to all fellow traditional and digital artist don't battle ai image it's not wort…" (ytc_Ugyz9jl2M…)
- "Because every useful task can be performed without it. Therefore, if we put stro…" (ytr_UgxAkrhfd…)
- "Skynet became self aware 8/29/1997 at 2:14am (ask Siri) We’re still here. I agre…" (ytc_UgxBrVoI6…)
- "Until AI learns to produce the best possible genetically modified meat in the l…" (ytc_Ugw3m8WYO…)
- "Chatpt doesn’t “ try” to do anything, it doesn’t know “ try” it’s just an algori…" (ytc_UgyfT_MdO…)
- "On the other hand Human kind is as dangerous as a robot and yeah it’s another sp…" (ytr_UgyZsa_Pv…)
- "thank you for talking about this! its absolutely disgusting for someone to even …" (ytc_UgyjytgHi…)
Comment
If at least most of the cars are self-driving, and have the ability to communicate, then why doesn't the car either quickly stop or suddenly stop? All the cars behind our car will stop (up to a point that depends on traffic), and no one gets harmed except maybe for those people without seatbelts behind us. So, the solution is simple: Cars communicating with all the others instantly, which we already do with instant messaging. Why not program it into a car to maximize safety?
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Harm Incident |
| Posted | 2017-07-18T21:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UggJqTTxAgQpuHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggSR5TBSlFvAngCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugg3Ooi6amKBaHgCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiIyXvapa3ghngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugg6S0mO_XY-BHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugj2e2pqV-vGsHgCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghPxCznJQ-7DHgCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UggvYPle2Wkb6XgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgjlhktnJUrllHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggvUq6OXIbKO3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
```
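A raw response like the one above is a JSON array of coding records, one per comment, which supports the "look up by comment ID" view directly. Below is a minimal sketch of that lookup, assuming the field names shown in the sample (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the `index_codings` helper name is illustrative, not part of the tool.

```python
import json

# Two records excerpted from the raw LLM response above, as the model returns them.
raw_response = """[
  {"id": "ytc_UggJqTTxAgQpuHgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggSR5TBSlFvAngCoAEC", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "approval"}
]"""

def index_codings(response_text):
    """Parse a batch coding response and key each record by its comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codings = index_codings(raw_response)
print(codings["ytc_UggJqTTxAgQpuHgCoAEC"]["reasoning"])  # consequentialist
```

Keying by `id` makes it cheap to join the model's codings back to the stored comments when rendering the per-comment "Coding Result" table.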