Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Not real! IS A REALLY POOR ATTEMPT AT editing. If you slo-mo this you can see pa…" (ytr_UgyyFWzUB…)
- "If you ever want to know WHO you are and you have been using your AI for awhile …" (ytc_Ugy5yRIEG…)
- "Ty ik this is a year later but I'm using this to explain to people in my rp chat…" (ytc_Ugz2NmCCZ…)
- "Just take AI bros to a forest. Look as they cry and scramble to find internet…" (ytr_Ugz8MQ9Ic…)
- "I use AI daily and it is most certainly NOT sentient. It's fun, it's reactive, i…" (ytc_UgwaLaMpP…)
- "I record voice overs for a lot of different YouTube channels, and the quality of…" (rdc_lz5nxg8)
- "According to this they don't jail people using the algorithm. They're just being…" (ytc_Ugwo6IB9o…)
- "The goal of this short film project is to STOP smart drones developing, or it co…" (ytc_Ugyeymu0E…)
Comment (youtube · AI Harm Incident · 2025-01-21T01:3…)

> Calling it autopilot is the worst part. I do know that this technology has probably saved more lives than it has ended, just look up videos about close calls where no normal human would be able to react to a situation while driving. However people still need to pay attention while using it as it's not real full self driving. As in said the video the guy wasn't paying attention despite being told to by the car. I still think that something needs to change with tesla vehicles.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzZ-0lUFQrxdAMS9Bp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy2UX64H1Mo4c0vgt54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxApkn0WMMqvL5pWX14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx0swZ4kh5iRug0c9N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw0DYmkwE5ehsW0JoN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyyJWaVDd20D2MIxxR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxGMAWY7O7_4eg2v_d4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz3of2MQmWnLP309o14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz7uO5xSoYu8S5aAnB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz9qbULdEK1QWoqTih4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
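Outside the inspector UI, the same lookup-by-comment-ID can be done directly on the raw model output: parse the JSON array and index the rows by their `id` field. A minimal sketch, using two rows abridged from the response above (the field names match the coding table):

```python
import json

# Raw LLM response: a JSON array of per-comment codings (two rows shown here).
raw_response = """
[
  {"id": "ytc_UgxGMAWY7O7_4eg2v_d4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugz7uO5xSoYu8S5aAnB4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

# Index the codings by comment ID so one comment's coding can be fetched directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgxGMAWY7O7_4eg2v_d4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # prints: user mixed
```

The dict index makes the "Look up by comment ID" step a constant-time operation, and the same structure works for joining codings back onto the original comment records.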