Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- "The idea is obviously to open a pathway to infinite lawsuits to companies that a…" (rdc_kz08s9w)
- "“predictive policing ai” right, because the american police system has historica…" (ytc_UgzQ1l0zY…)
- "I kinda hope itll be a slow take over of ai and the only thing left to people is…" (ytc_Ugx6cioYq…)
- "Steven, I commend you on working to raise awareness on the subject. It would be …" (ytc_UgziUVYtY…)
- "Yes you are harming artists because when you use AI it is still using stolen art…" (ytr_UgzkJy_9j…)
- "America has to come to terms with workplace automation. Americans define their o…" (ytc_UgwCul0jn…)
- "I agree, every AI should be nationalised it has to be it's the only way it can b…" (ytc_Ugwq6mh6e…)
- "Pardon me for not being quite so fluent in Philosophy and or science. Tell me if…" (rdc_dduo0nm)
Comment
This whole situation is what happens when companies are allowed to go hog wild with technology that laws and regulations are not capable of addressing. If either my country (England) or your country (I assume the US, but i did see a British Columbia license plate and I’m not familiar with F9) had competent legislators, we would’ve seen enforced radar redundancy, we would have seen Elon musk forced to make Autopilot not a feature but a clunky private test that forced drivers to engage instead of shut off their brain, warning them of what the car is doing at all times, ensuring that yes the AI could be used and could learn, but in a way that both country and company trust.
youtube · AI Harm Incident · 2022-09-03T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxVNm5TOdofdv8mDNl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx1jbDl6Pkhl-_xml94AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzN__OAZE_QKpx-86p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgylDebzkyk1iuYkA1t4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzg9pM6G_zOD9fbpUJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyE4YGQxhj884BLNjZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyUm5Hi7Xb2ZAbABAx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz26V4bQtlOx5x-RvN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzFbVl0SXdztvjBIBJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzomGiqt3Upa3mr2tF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
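The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such output might be parsed and validated before it lands in the coding-result table. The allowed value sets below are inferred only from the samples shown on this page; the full code book may define more categories:

```python
import json

# Value sets observed in the raw responses above (assumption: the
# actual code book may allow additional values per dimension).
ALLOWED = {
    "responsibility": {"company", "government", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "resignation", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array) into a dict keyed by comment ID.

    Raises ValueError if any dimension holds a value outside the expected set,
    so malformed model output is caught before it reaches the results table.
    """
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        comment_id = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim}={row.get(dim)!r}")
        coded[comment_id] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Usage with a hypothetical single-element response:
raw = ('[{"id":"ytc_example","responsibility":"government",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
coded = parse_coding_response(raw)
print(coded["ytc_example"]["policy"])  # regulate
```

Keying by comment ID makes the lookup-by-ID view above a single dictionary access, and the validation step means a response that drifts from the code book fails loudly instead of being silently recorded.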