Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgyXtq20c…: The answer is simple. Stop using AI. Buy an external hard drive and start storin…
- ytc_UgzUOocoh…: I think this video has a few issues. Mainly, that it is spreading hate agianst s…
- ytc_UgwsqzFbm…: The government should require labels on all AI generated things - AI companies c…
- ytc_Ugzls7tpR…: While i do agree AI is extremely harmful to artists, to completly dismiss it as …
- ytc_Ugz77os3z…: "Why would #AI be pretending not to be conscious? Ah, for a million decent reas…
- ytc_UgzxJ5ALr…: Shouldn’t we the public try to change the system through politics? Through strik…
- ytc_UgwU862zy…: lol we’re talking about AI gaining more intelligence than humans, removing jobs,…
- ytc_Ugz88SeZM…: All this and Musk too! Elon is a dangerous subpar mind. Spread disaffection and …
Comment

> ...And then there's the autopilot algorithm conundrum (assuming it doesn't shut off one second before impact for legal reasons...): autopilot has a choice when faced with hitting a car, a wall, or say, a motorcyclist... Which does it choose, and is this choiced based on Tesla occupant safety, potential minimal loss of life overall, or Tesla shareholders? If it's legal liabilty/shareholder safety (and this seems to be the overall mantra of Tesla), then the motoryclist is doomed. Cynical perhaps, but from a business profit/loss perspective, completly logical. Great video as usual, Ryan. Cheers.

youtube · AI Harm Incident · 2022-09-04T05:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw_k6LD4Ghb8zMXFVV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwBPmTdjZkt3H2-grx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugy22RfWH05mm3j9oBF4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzGdVdBWOCZZnL3LAh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwCIKtF19wbuhh_olp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxhOyQKZyoX6cYaSVF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyDtHfOh6VKAZnvkmR4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx5ZgfHiA0mD1_WBwR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy-MEcXK1Ow-vc70Oh4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy597L6Gpoy47JSi5Z4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
```
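A raw response like the one above can be consumed downstream by parsing the JSON array and checking each row against the coding scheme. The sketch below is hypothetical: the allowed vocabularies are inferred only from values visible on this page, and the real pipeline's codebook may contain categories not shown here.

```python
import json

# Vocabularies inferred from the values visible in this dashboard
# (assumption: the actual codebook may include additional categories).
ALLOWED = {
    "responsibility": {"company", "government", "developer", "user",
                       "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"liability", "regulate", "industry_self", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "unclear"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only rows whose
    dimension values fall inside the allowed vocabularies."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in vocab for dim, vocab in ALLOWED.items())
    ]

raw = ('[{"id":"ytc_Ugw_k6LD4Ghb8zMXFVV4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"fear"}]')
print(len(parse_codings(raw)))  # 1 valid row
```

Rows with out-of-vocabulary values (a common LLM failure mode in structured coding) are silently dropped here; a production pipeline would more likely log them for re-coding.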