Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
@Violetnightdreams I already answered somewhere else on the this video, the basi…
ytr_Ugxrsps4V…
About half of all job creation is from companies with 50 or fewer employees. How…
ytc_UgxiS9uMX…
Tesla fatalities were caused by human error not the driver in supervised FSD T…
ytc_UgyOX8e2U…
I was working intensely with AI just until few months ago. I understand very wel…
ytc_UgzTbqyAe…
i am a teen auther and use ai for disscusin my story and telling me what i can d…
ytc_UgxkS3m1h…
The problem may not be the AI itself. It may be its voice recognition technology…
ytc_UgyrNAW8h…
@pinip_f_werty1382 yes, but that would be consensual, however, in reality ai was…
ytr_Ugz9ZztEB…
@prottentogo product photography... we were told with all the new AI image gener…
ytr_UgzYHRC6q…
Comment
I really like this channel but I think you sought to re-affirm your bias with selected facts. You left out the big elephant in the room - The driver is in control at all times (no debate). Car are killing machines in the wrong hands. You say cameras and AI is not ready - simply check out FSD 10.69.2 - it’s mind boggling what the capabilities and decision engine is doing AND zero injuries and zero deaths. Please try and make this video again and garnish facts. More than happy to help you with this in the interests of the full story.
youtube
AI Harm Incident
2022-09-16T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgzEzw4ccp8J7Arzvdp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgwBXV0RxeUpen6ZGbd4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyoK1cpLNqaZV80fHN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzRyADljOnn2vgpll14AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzEY4gO8WBaJjMQscB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzywqUx_ffVWDIcZmJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz0ohamZ0X3Wpqd5zR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugy-qZSzVScDwQXFVAl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwlUkkEdenbwVNiqg14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugy9nu5XiXK84cthkE14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
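The raw response above is a JSON array of coded records, one per comment ID. A minimal sketch of how such a response might be parsed and validated before loading it into the results table — the allowed value sets below are inferred from the values visible in this sample, not from any documented schema:

```python
import json
from collections import Counter

# Allowed values per dimension, inferred from the sample output above
# (an assumption, not a documented codebook).
ALLOWED = {
    "responsibility": {"government", "company", "user", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"regulate", "industry_self", "liability", "ban",
               "none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip records missing a comment ID
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Tiny illustrative payload (hypothetical IDs, same shape as above).
raw = '''[
 {"id":"ytc_a","responsibility":"government","reasoning":"deontological",
  "policy":"regulate","emotion":"indifference"},
 {"id":"ytc_b","responsibility":"company","reasoning":"mixed",
  "policy":"industry_self","emotion":"approval"}
]'''

records = parse_coding_response(raw)
counts = Counter(r["responsibility"] for r in records)
print(len(records), dict(counts))  # → 2 {'government': 1, 'company': 1}
```

Dropping malformed records rather than raising keeps one bad line in a batch response from discarding the whole batch.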