Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
In the real world this situation would never happen. Say the only self-driving car on the road is behind a truck; without even factoring in what's on the left and the right, it should have kept a following distance of 6 seconds. To prove my point I need to do a little math here. 1 mile is 5,280 feet, so at 60 MPH the car travels 88 feet per second, and the following distance should be 528 feet. Say the obstacles drop from the truck and move toward the approaching car by 50 feet, and the car takes 4 whole seconds to register them and take action. At that point the car has 2 seconds left, which equates to 176 feet; minus the 50 feet the obstacles moved, the car has 126 feet to brake from 60 to 0. Braking from 60 to 0 in 126 feet is actually easy to achieve with most cars these days. Also factor in that self-driving cars often come with superior parts, so this should be no problem, and a 4-second delay is very long anyway. I doubt the car would need that much time, but even if it did, it would still brake in time.
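The commenter's arithmetic can be verified in a few lines. This is just a sketch of the comment's back-of-the-envelope calculation; the function name and parameters are illustrative, not part of any real codebase.

```python
FT_PER_MILE = 5280

def braking_room_ft(speed_mph, follow_s, reaction_s, obstacle_advance_ft):
    """Distance left to brake once the reaction delay has elapsed.

    follow_s: following distance expressed in seconds of travel time.
    obstacle_advance_ft: how far the dropped obstacle moved toward the car.
    """
    fps = speed_mph * FT_PER_MILE / 3600      # 60 MPH -> 88 ft/s
    remaining_s = follow_s - reaction_s       # time left after reacting
    return remaining_s * fps - obstacle_advance_ft

print(braking_room_ft(60, 6, 4, 50))  # 126.0, matching the comment's figure
```

With the comment's numbers (60 MPH, 6-second gap, 4-second reaction, 50 feet of obstacle travel), the remaining braking room comes out to 126 feet, as claimed.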
youtube · AI Harm Incident · 2016-11-21T06:0… · ♥ 260
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgjGy_ree2B0EHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugj96NpyN-f2BXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghqMvbGky59jHgCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugi_k_2d8FQ3c3gCoAEC", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugg_qQYiL1e7ZngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggUGDnRAEQYy3gCoAEC", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UggfRtqOpBkxgHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggNnXWdPpcRW3gCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugjqog_GKULDRHgCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UghYlkS6IWtLL3gCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"}
]
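Raw model output is not guaranteed to be well-formed JSON; one glitch seen in practice is an array closed with ")" instead of "]". A tolerant loader for such responses might look like this (the helper name is mine, not from any particular tool):

```python
import json

def parse_codes(raw: str) -> list:
    """Parse a raw LLM coding response into a list of per-comment code dicts.

    Tolerates one common malformation: a JSON array that opens with '['
    but is closed with ')' instead of ']'.
    """
    raw = raw.strip()
    if raw.startswith("[") and raw.endswith(")"):
        raw = raw[:-1] + "]"  # repair the mismatched closing delimiter
    return json.loads(raw)

# Usage: a minimal malformed response is repaired and parsed.
codes = parse_codes('[{"id": "ytc_x", "emotion": "outrage"})')
print(codes[0]["emotion"])  # outrage
```

Anything beyond that single repair (truncated output, stray prose around the JSON) would still raise `json.JSONDecodeError`, which is usually the right behavior for an inspection tool.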