Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples
- ytc_UgxOE_epl…: "I typically loathe the ACLU. However, in this particular instance, I would argue…"
- ytc_UgzckGobL…: "They are robots AI it was confirmed they were all in a game like sitting down am…"
- ytc_UgzoLDNz5…: "I have watched hours of all these AI pioneers and I am shocked at how all of the…"
- ytc_UgyTjFKKr…: "I checked if ChatGPT can copy ur artstyle and it can't!! Soo!! Yippee!! It told…"
- ytc_Ugw-qiO-D…: "WHAT I DON'T UNDERSTAND IS IF A.I. IS SO DANGEROUS THEN WHY DOES HE INCORPORATE …"
- ytc_UgzouFY41…: "Just need her to cook and clean. Then im gonna buy my next wife, lol…"
- ytc_UgyEDgIId…: "Isn't it crazy how he can make the most articulate, well-reasoned, effective int…"
- ytc_Ugyd314TY…: "How can we ensure AI data centers grow with transparency and water-wise responsi…"
Comment
> Too many people today are plain lazy, wanting machines to do the work for them. Elon and the geeks are just So confident in their mission to make machines do our work, thinking their AI and algorithms are superior to regular old humans, (and sometimes they are). Someone making these systems needs to be held accountable for the lose of life, otherwise, this will continue to happen, and if no consequences, no big motivation to improve, or feel remorse and compassion, (yep, machines don't have these qualities), and sometimes I question weather some of the people building dangerous machines do either. Wonder if anyone from Tesla has reached out to the families who lost their loved ones, or would that not be "allowed" by attorneys, (who mainly just care about keeping their clients happy). For reference, I ride motorcycles, bicycles and have driven over a million miles as a professional driver, including behind the wheel instructor, seen a lot, learned a lot.
Source: youtube · Tag: AI Harm Incident · Posted: 2025-06-12T17:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz4AY59q36IgaWwHCN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyuafevFmBiC7Ztv1R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwn1VYzk_xDwR89xdp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxT6YA_aZ3VqIJrTMt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwBZe5bn3i76vQbDgh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwLvhYllrtXrwbSWuN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy6yUZ7KedAeeVvDSt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgypzZdekfbePGFKeQ14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxVxi5UFlSwO74tm_d4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwoIdAlgEJvWNNnS_94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]
```
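The raw response is a JSON array with one object per coded comment, keyed by `id`. A minimal sketch of the "look up by comment ID" view is simply parsing that array and indexing it by `id`; the `lookup` helper and variable names here are hypothetical, not part of the tool itself (only two of the rows above are reproduced for brevity):

```python
import json

# Two rows copied from the raw LLM response above.
raw_response = '''[
  {"id": "ytc_Ugz4AY59q36IgaWwHCN4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwLvhYllrtXrwbSWuN4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "liability", "emotion": "outrage"}
]'''

# Build an index from comment ID to its coded dimensions.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment; KeyError if uncoded."""
    return codes_by_id[comment_id]

print(lookup("ytc_UgwLvhYllrtXrwbSWuN4AaABAg")["policy"])  # liability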