Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Also to anyone who supports ai generative technology, do you seriously want all … (ytr_Ugw1pwL1E…)
- If you report these flavor of posts, they end up in the mod queue. If these fla… (rdc_jaaxahn)
- Some of the art of AI shows us a picture of humans being extinct in the sense th… (ytc_UgzhOl3L5…)
- WW3 can reset the Super AI race and give us some 10000 years human race lifetime… (ytc_UgytfYJfZ…)
- “Those who consider artificial intelligence are those who have artificial concep… (ytc_Ugw4dpUA5…)
- Not an artist, nor AI one, but question at 11:52 inspired me to write this comme… (ytc_Ugx5KXFdA…)
- Is this permanent? Will my brain never be the same as before using AI? This is s… (ytc_UgwjGbNeS…)
- It is not pure mimicry. It mix and rephrase, the result doesn't need to exist… (rdc_j8btnv3)
Comment
This whole fretting thing is so stupid. The only thing that matters for AI driving is, does it SAVE MORE LIVES THAN IT KILLS? No driving system will ever be perfect. It just needs to be better than the average human driver.
With a level 1 or 2 system like Tesla, it’s even better because you combine the features of both AI and Human safety systems. The AI is never distracted, and the Human is supervising. The combination only is unsafe when the human becomes distracted, which is 100% the fault of the human.
YouTube
AI Harm Incident
2025-01-03T19:4…
♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgzaEYvJE2SSYW9L2Kl4AaABAg","responsibility":"manufacturer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyzFpnNOkVcKmH_aaF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwzp58DTN1IBX_8ERB4AaABAg","responsibility":"manufacturer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxGdm19dqdWYCXBTRh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgywLlKfnvOanu3C2f54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy4HRAiHIiHg30CIKB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwHooEFPaqbXC2ePTh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzqXpcLVaK5rSykubZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgztIu2__P6nFjw0cbp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxNGo6TFyukcaQc9mR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]
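The raw response above is a JSON array with one record per comment, each carrying the four coded dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and indexed by comment ID for lookup — the record structure is taken from the sample output, while the function name `parse_codings` and the inline `RAW` string are hypothetical illustrations, not part of the tool:

```python
import json

# Hypothetical one-record batch in the same shape as the raw response above.
RAW = ('[{"id":"ytc_abc","responsibility":"manufacturer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')

# The four coded dimensions, as seen in the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(raw: str) -> dict:
    """Index codings by comment ID, rejecting incomplete records."""
    by_id = {}
    for rec in json.loads(raw):
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{rec.get('id')}: missing {missing}")
        by_id[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return by_id

codings = parse_codings(RAW)
print(codings["ytc_abc"]["emotion"])  # outrage
```

Indexing by ID mirrors the "Look up by comment ID" workflow: once parsed, any comment's coding can be fetched directly without rescanning the batch.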