Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Sorry, but isn't, statiscally speaking, a common knowledge that the average US driver is bad at driving? And Tesla wants to use THOSE drivers to do "machine learning" for his " safer than human driver autopilot" ? Yeah... I can see how that's going to go. this lawsuit as you said is just the beginning. And this in a place where road safety often doesn't even take pedestrians, public transport and cyclist into consideration. I'm thankful to live in a place that has forbidden the sell of such PR egomaniac bs.
youtube
AI Harm Incident
2025-08-16T11:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgyssMyO0kAAt67H4DZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxrdAYFX3yS5LOvFKF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx5mm5eR-d3KaUzob14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzl51G2rS_vMSUVkKB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy-tlU-WjrTCsCy0yd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxjZCH6Hk1w9QNV2Hd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzp-eWGrr7NTFkCVKt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyhZUsJSIINe5P1Gh14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxfuNjssXt8GwLMWJp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxrU_jWhgswWvmljPV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"}]
```
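A raw batch response like the one above can be consumed by parsing the JSON and indexing the records by comment `id`, which is how a per-comment coding (e.g. the "Coding Result" table) would be looked up. A minimal sketch, assuming the field names shown in the output; the `index_codings` helper and its drop-invalid-records rule are illustrative, not part of the tool:

```python
import json

# Three records excerpted verbatim from the raw response above.
raw = '''[
{"id":"ytc_UgyssMyO0kAAt67H4DZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxrdAYFX3yS5LOvFKF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzl51G2rS_vMSUVkKB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]'''

# The four coded dimensions plus the comment ID, as seen in the output.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw_json: str) -> dict:
    """Parse a raw batch response and index codings by comment ID,
    silently dropping any record that is missing an expected key."""
    records = json.loads(raw_json)
    return {r["id"]: r for r in records if REQUIRED_KEYS <= r.keys()}

codings = index_codings(raw)
print(codings["ytc_UgxrdAYFX3yS5LOvFKF4AaABAg"]["emotion"])  # → fear
```

The looked-up record for `ytc_UgxrdAYFX3yS5LOvFKF4AaABAg` matches the Coding Result table above (company / consequentialist / regulate / fear).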