Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- To this point at 33:53. Since everything makes its way to the internet. And beca… (ytc_UgxDwJxsv…)
- LLMs are appreciated for timesavers, though their "creativity" feels empty or at… (ytc_Ugx6-acEj…)
- dw, these "artists" are so stupid they dont even realize that THEY will be the d… (ytc_UgwrxiYfz…)
- I believe the main problem is: AI is trained on OUR DATA, DECISIONS, and how (so… (ytc_Ugywa4LF4…)
- Nice, super consumer nerd AI can't understand the complexity of human struggle b… (ytc_UgxzZdB-z…)
- "Why are they firing A.I. ethics every time they bring something up"... BECAUSE … (ytc_Ugx57aWBb…)
- If AI make "really art" then I make really sculptures in toilet. :) This the sam… (ytc_UgzFkg8ih…)
- EMP. HIGH OUTPUT BROADBAND AMPLIFIER LASER BLINDING, STICKY NETS, SMOKE, SAWED O… (ytc_UgzI_hHUz…)
Comment
Self driving cars are a version of the trolley problem. Do we want to choose fewer deaths, or the deaths of those who are not supposed to have died without our having made this choice? Self driving vehicles make it more evident that those who are behaving safely could be killed by a car which would not have killed them had the cars which killed them been controlled by a thinking human. I hate to think that I could teach my child how to be safe and have my child be killed by a vehicle that did something completely unpredictable and potentially, unavoidable.
Source: youtube · Posted: 2023-08-08T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | contractualist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxAcV3-jeGRD8Ee6zx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwnChjmSX_yHITtIzR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwfQrExD2D6_UQHyEp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugykx2wAY5dREF-SFFJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxVTuSUCocmKajIjtN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwx1y44FZI776ewm9t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwXrz0slgBdwc2zw1l4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyFCnmOfLiYdmU-wYF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwaemTH8eWUccvEhjt4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyt4Mx2dB7uiJ5TBdV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
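The raw response is a JSON array with one record per comment in the batch, keyed by comment ID. Below is a minimal sketch of how such a response could be parsed and validated before populating the coding-result table. The allowed value sets are inferred only from the values visible on this page, not from the project's actual codebook, and `parse_batch` is a hypothetical helper name:

```python
import json

# Allowed values per coding dimension (assumption: inferred from the values
# seen in this dashboard; the real codebook may define additional codes).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"none", "ban", "liability", "regulate", "unclear"},
    "emotion": {"resignation", "indifference", "fear", "outrage", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index the records by comment ID,
    rejecting any value that falls outside the known code sets."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        # Keep only the coding dimensions, dropping any extra keys.
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Example: one record matching the Coding Result table above.
raw = ('[{"id":"ytc_UgwaemTH8eWUccvEhjt4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"contractualist",'
       '"policy":"unclear","emotion":"mixed"}]')
coded = parse_batch(raw)
```

Validating against a closed code set at parse time catches the common failure mode where the model invents a label outside the codebook, so a bad record fails loudly instead of silently entering the coded dataset.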