Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Oh please automate those useless CEOs and their secretaries first. Like as if th…" (ytc_UgyNm6hU0…)
- "Every tech executive sees dollar signs when they think of AI. They will force it…" (ytc_UgzSLJe6e…)
- "Ok really, let’s just stop hating on artists, and to the artists? Stop hating on…" (ytc_Ugybs9AHR…)
- "So technically the Internet and anything AI is basically channeling a Ouija Boar…" (ytc_Ugw6EFLqh…)
- "All these large companies or any company for that matter, who gets these robots …" (ytc_UgyAFv5oI…)
- "Cyber-terrorism is an overlooked issue with the driverless trucks. Imagine some…" (ytc_UgyjOZsnk…)
- "I mean, AI pulls information from the internet mostly, doesn't it? And anyone wh…" (ytc_UgyA8zqDh…)
- "if she will get smarter over time then if he will learn how to build a robot lik…" (ytc_UghriRFYh…)
Comment (youtube | AI Harm Incident | 2024-12-30T02:4…)

> basically he's saying that your life is not worth anything..... like why have brakes and indicators even.... the real life nuances cannot be recognised if you are not human... auto-pilot is like driving drunk.... and yet its supposed to aid drunk drivers home safely..... from their auto-pilot to their explosive batteries Tesla's are more dangerous than Elon tries to pretend they aren't ... and ego his size, and his lack of expertise in the field are there for all to see if only some people bothered to understand a) the science, and b) the algorithms ..... coders are not good with real life functioning... and most coders don't actually understand the tech they are trying to code for
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_Ugzlilf0kjcmOnrv5xt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxJWIDt6oOiorm5J754AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyIZVw6KZCzKCEbXMx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyFs-QA1DH-NXdQTLd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyZKKhyCmaofP2AyTB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyRDicqcNmcNWbYGtl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxQ97OxvHySPKgo3n54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugym16jXfrxj8jmWybl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxmw3WUMY0VbzNONA94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw_6xhJE7PCeHxJkLt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}]