Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "you would think if someone tels an ai that they are going to do that to themsel…" (ytc_UgygIiia2…)
- "my analogy for ai art is like this: imagine youre at a chili cook off, everyone …" (ytc_UgxdeRouE…)
- "WHATT?? No way that was a robot. That was a human with makeup right? If I'm wron…" (ytr_UgzOtY3d4…)
- "of course he says the government is not ready, why would the government pause a …" (ytc_UgyY2sboQ…)
- "I think this is really cool and i wouldnt mind having a robot friend cuz that wo…" (ytc_Ugy-7NBg6…)
- "i hate ai so much as soon as it came out i hated it. why? CAUSE IVE BEEN AN ARTI…" (ytc_Ugz2Dgmah…)
- "@tullochgorum6323 Yeah that's the fundamental issue with AI. The only good ones…" (ytr_Ugwgg8KvE…)
- "That window is closed, we will see these AI drones in the Ukraine war, and Weste…" (ytc_UgwuR2yHR…)
Comment

> He forgot to mention that most of those 5000 incidents are also attributed to the driver not paying attention because they put too much trust in the automatic systems. The law, as it stands, still requires someone to actively be in control of the vehicle. It doesn't matter if the vehicle is self driving. In the case of self driving cars the human at the wheel is the backup for most of the drive. On top of that, if you're the one who initiated the travel of the vehicle you are the one who is responsible if anything goes wrong when there's no driver to take over.

Source: youtube · Posted: 2023-07-31T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxukQjqRR6f6Xre5sF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxBiMBQ5lWOhK-4mQB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgxgySTwFUJ43HVWlvt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxJqYuRi0G1MGQMPix4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwMJz-IxeP6ueISXph4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzkSf9SF8XFo-Dn6xd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx_UNVwacIfOiICyyN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwutRFxldrAFzPqnBV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxbghNUGweGjTHowWd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxyGYXtuJJyi7iO8_l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
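The lookup described above presumably works by parsing the raw LLM response and indexing its records by comment ID. A minimal sketch in Python, assuming the JSON array format shown in the response block (the two sample records are copied from it):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment,
# in the format shown above (two records excerpted for brevity).
raw_response = '''[
  {"id": "ytc_UgxBiMBQ5lWOhK-4mQB4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgxgySTwFUJ43HVWlvt4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "ban", "emotion": "fear"}
]'''

# Index the records by comment ID so any coded comment can be
# looked up in constant time.
codings = {rec["id"]: rec for rec in json.loads(raw_response)}

rec = codings["ytc_UgxBiMBQ5lWOhK-4mQB4AaABAg"]
print(rec["responsibility"], rec["policy"])  # user liability
```

In practice the raw response would be read from the coding pipeline's output rather than embedded as a string literal; the indexing step is the same either way.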