Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- `rdc_kqvk9zn`: "At some point one has to acknowledge that most people don't understand that fund…"
- `ytc_UgxGSCfrx…`: "I like that you're actually pretty fair and nuanced about the specific use cases…"
- `ytc_Ugzp3CBXw…`: "Blaming A.I on a kid killing himself is like blaming horror movies for murders. …"
- `ytc_UgyxrJ3Uz…`: "By 2030, you'll start to see AI robot in everywhere 🤣. I'll buy Men Robot lol…"
- `ytc_Ugy98BNUD…`: "Driverless trucks don’t join Dutch auctions. I recall a TV series in the Staes w…"
- `ytc_Ugzs7aVrU…`: "Idk why some of you even call it Art. I always called those AI images or videos.…"
- `ytr_Ugzmpd8lW…`: "ai could be useful as you said, it could be used as reference for the artist to …"
- `ytr_UgylsBnFt…`: "@ShayMarie.00 Good points but the angle (Parks is looking up slightly + tilting …"
Comment
As someone who work in AI, I can say that any autopilot system that omits RADAR or LIDAR is simply an unethical system. Images from a camera simply do not contain enough information for a computer to make decisions that can result in life or death. Also I am very dubious of any use of machine learning that does not have published information on the convergence of the algorithms used.
youtube · AI Harm Incident · 2022-09-03T18:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxchwfMLlIwrAJI3op4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyl8DnJWm-IYVOcjY54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy2N48LIE3brL-Jq294AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxUK0-sdyWUaZ_KXnl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugzi39MZlu6PbG-zePR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxFvXkriaARSpnDKJp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwPAJuRYVc0pj9scnV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwfJziH3uN0rzxQCrl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxS5e_Af-V7IXSS2Q94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxnaADHhErV5QLsoDV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
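The raw response above is a JSON array of per-comment codes. As a minimal sketch of how such a batch could be parsed and validated before storing, assuming the code values visible on this page make up the allowed sets (the actual codebook may define additional categories):

```python
import json

# Allowed values per dimension, inferred from the codes seen on this page;
# the real codebook may include more categories (assumption).
CODEBOOK = {
    "responsibility": {"company", "developer", "user", "government", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # drop entries missing a comment ID
        # keep a record only if every dimension carries an allowed value
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(rec)
    return valid

# Hypothetical one-record batch in the same shape as the response above.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
print(len(validate_batch(raw)))  # → 1
```

Validating against fixed value sets catches the common failure mode of batch coding, where the model invents an off-codebook label for one record in an otherwise well-formed array.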