Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- I know the argument about "pulling the plug" is mocked but I still want to ask t… (ytc_UgwTnEBpb…)
- @jeremyward1983 Thank you for commenting, Jeremy! It seems like that guy's confi… (ytr_Ugz1N0Ftg…)
- I think most of this is reasonable, but this is the first time in any of your vi… (ytc_UgwPc8BLw…)
- Sadly, this is inevitable. AI is going to replace a lot of jobs specifically in … (ytc_Ugxa7mUQz…)
- What about discourage one about learning about sky diving, or solo hiking, or ma… (ytc_UgzAOZP-1…)
- Because clanker means machine, robot, ai, anything that's robotic in nature. The… (ytr_Ugy8e6xNC…)
- The firing wasn't based on the assumption of future exponential growth in my opi… (rdc_n7oippk)
- yeah but youre not the artist, ai is the artist / give ai the proper credit bro … (ytr_Ugwfws-LO…)
Comment
As someone who works on autopilot systems in aircraft I can say that even in the most technologically advanced models there are still costly mistakes. This is even with pre-planned routes, known taxiways, runways, and mandatory radios that always report position (lat, long, alt, airspeed). So if these systems fail given the high amount of investment and oversight, I wouldn't by any means rely on the baby tech that exists in automated cars. Also, given the insane variables that automated ground systems deal with, I would be hesitant to trust a system without real ranging capabilities.
youtube · AI Harm Incident · 2022-09-03T17:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxLKj91_yUZcmtzKP54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwfuReyxukU6hInDXB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzCvmcpbSIrarILhTl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwVN0ZCKCao6_Zjh414AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxkerxGD8MNCT62-Vl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwhqePNsfSq-qGeLD54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxgKW9FglOYf1rTYZt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw3hys9fA5p-pXqASB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyUQtiA2BozN-OGUjN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwuu7-b-u-guWrgrxl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
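The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how a pipeline might parse and sanity-check such a response before storing it — note the allowed value sets below are inferred from the visible examples, not a confirmed codebook, and `validate_codes` is a hypothetical helper:

```python
import json
from collections import Counter

# Allowed values per coding dimension. ASSUMPTION: inferred from the
# sample rows shown above; the real codebook may define more categories.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose values are in the codebook."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Drop any row with a missing or out-of-codebook value in any dimension.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

if __name__ == "__main__":
    raw = ('[{"id":"ytc_example","responsibility":"company",'
           '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
    coded = validate_codes(raw)
    # Tally one dimension across the valid rows, e.g. for a dashboard summary.
    print(Counter(r["emotion"] for r in coded))
```

Rows that fail validation could instead be queued for re-prompting or manual review rather than silently dropped; the filter here just illustrates the check.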