Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Thank goodness there's an art trend against ai artists telling tyem about their … (ytc_UgzCUoXsL…)
- I work on a PC, usually it gives me the results I'm looking for but many times t… (ytc_UgwarbqnD…)
- In the military we had a saying "Choose your Rate Choose your Fate". Constantly … (ytc_UgxEcOe4f…)
- i think youre thinking too deep when it comes to who would hang it in their hous… (ytr_UgwIXHuRf…)
- I want to see a robot drive and get out of the truck and deliver my Amazon boxes… (ytc_UgzWFl1Ts…)
- I refuse to pay for food delivery, or for someone or a robot to shop for me!… (ytc_Ugw56pt6g…)
- @joeabad5908: Proving that a Tesla doesn't slam into a motorcycle 99% of the tim… (ytr_Ugwn1ndCU…)
- AI should be a machine first and foremost, and it should not imitate human behav… (ytc_UgjCkbW8H…)
Comment
If you ask any AI about fully automated trucks, the AI points out that AI cant drive trucks in many situations. This includes heavy rain, fog and snow. AI cant mimic human intuition. Anyone who drives a Cascadia or Pete 579 with ACC (Adaptive Cruise Control) knows how much if fails. In a little rain, your ACC will cut out and will only work on speed. If safety was really the concern, instead of making automated trucks, they would be working on an AI co-pilot that could work with the driver. Another tool for the driver. As for Automated trucks.....it's a disaster waiting to happen.
youtube · AI Jobs · 2025-05-29T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
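A coded record like the one above can be sanity-checked against the codebook before it is stored. The label sets below are only the values visible on this page (the full codebook may define more categories), so treat them as an illustrative assumption:

```python
# Hypothetical label vocabularies, inferred from the values shown on this
# page; the real codebook may include additional categories.
LABEL_SETS = {
    "responsibility": {"developer", "company", "government", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "approval", "resignation", "outrage"},
}

def validate(record: dict) -> list:
    """Return the dimension names whose value falls outside the vocabulary."""
    return [dim for dim, allowed in LABEL_SETS.items()
            if record.get(dim) not in allowed]

# The coding result from the table above.
coded = {"responsibility": "developer", "reasoning": "consequentialist",
         "policy": "regulate", "emotion": "fear"}
print(validate(coded))  # [] — every dimension holds a known label
```

An empty list means the record is well-formed; any returned names point at dimensions where the model emitted an out-of-vocabulary label.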
Raw LLM Response
```json
[
  {"id":"ytc_Ugyu3uor8MRcGBMjFsh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzFpJjgJQCSZQIVIUx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwPijGwIQf4mMEwW4F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxjaz8T4LgSsgP3y3l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx_EjQswA5Oi88vU-p4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzsYtIYSFdRibDeS6V4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyLQ4NV-hcyNdCW9xZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugyg_uNLiRKLtIdVODd4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugyz1qfjFywK4-wvrkt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz8DRJV2xUpbaAaIDh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
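The "Look up by comment ID" workflow above amounts to parsing the model's JSON array and indexing it by `id`. A minimal sketch, with the raw response truncated to two of the records shown on this page:

```python
import json

# Raw LLM response: a JSON array of per-comment label objects.
# (Truncated to two records from this page for illustration.)
raw_response = """
[
  {"id": "ytc_Ugyu3uor8MRcGBMjFsh4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzFpJjgJQCSZQIVIUx4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"}
]
"""

# Index the parsed records by comment ID so one coded comment can be
# retrieved directly without scanning the whole array.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

record = codes_by_id["ytc_Ugyu3uor8MRcGBMjFsh4AaABAg"]
print(record["policy"])  # regulate
```

In practice a model response may not be valid JSON, so production code would wrap `json.loads` in a `try`/`except json.JSONDecodeError` and flag the batch for re-coding rather than crash.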