Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Money money money... Tesla wants to make the most money possible. OK, let's grant that - if Tesla releases self-driving software that crashes a bunch, everyone stops buying it and Tesla gets inundated with lawsuits and the company either goes bankrupt or limps along, a shell of its former self. So if Tesla wants to make the most money, by FAR, what should it do? Release a self-driving car that is safer than humans. The argument they're trying to save a few bucks in hardware to try to make more money is preposterously stupid. Working, safe, reliable self-driving is the difference between being the most profitable car company ever and risking bankruptcy.
| Field | Value |
|---|---|
| Platform | youtube |
| Category | AI Harm Incident |
| Posted | 2022-09-04T06:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxCxYZo9m_LHfHEkp14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyKUOLxpSSl60xSwC54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyD6nQYTbMDdVNZHn94AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy3SrPcKbufoo9yxx14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyJBR2a-RPrCK4RpP14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugyt-Fa7mbWvHZ2gW5R4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxlZO5_AoMLXPOG5Qp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgysN3F3pzx6MrDVMVZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxu19QGgTea-JCpAHB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx51pLGEND4ettC6SR4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```