Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
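Since the sample IDs shown below are truncated (they end in "…"), the lookup behaves like a prefix match over the coded records. A minimal sketch, assuming a list of record dicts with an `id` field; `lookup` is a hypothetical helper, not part of any real API:

```python
def lookup(records: list[dict], id_prefix: str) -> list[dict]:
    """Return records whose comment ID starts with the given prefix."""
    # Strip a trailing ellipsis so a truncated ID copied from the
    # sample list (e.g. "ytc_UgwCf7v0u…") still matches.
    prefix = id_prefix.rstrip("…")
    return [r for r in records if r["id"].startswith(prefix)]

records = [
    {"id": "ytc_UghtfnAXloUXangCoAEC", "emotion": "outrage"},
    {"id": "ytc_UgjB5UYNyemZAngCoAEC", "emotion": "resignation"},
]
print(lookup(records, "ytc_Ught…"))  # matches the first record only
```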
Random samples

- "love to see AI doctors in future it's better than nothing. specially in country …" (ytc_UgwCf7v0u…)
- "I'm not sure how to feel about this. When I think about the harm humans do to e…" (ytc_Ugzo2dczR…)
- "1) preprogram robot to rob other robots 2) preprogram AI scamming government sy…" (ytc_UgxMkZTsN…)
- "Help im on character ai and they are making me pull down their boxers 😭…" (ytc_UgylD6FiH…)
- "Man is ever foolish because he creates his own problems and wants to make the pr…" (ytc_UgztHcIXT…)
- "I almost feel like they're supporting the AI Art. If you remove the "fuck ai" co…" (ytc_Ugy8zf9f9…)
- "Train people to write generative responses for AI , LMAO , writing for your very…" (ytc_UgymxUvHl…)
- "I was dreaming about watermelon and when i weak up and open my phone to answer n…" (ytc_UgwpRJrxD…)
Comment
Frankly I see these so-called scenario as a joke. We are placing moral judgement of right and wrong, good and bad as the responsible party, the machine. HELLO, HELLO Has Anyone Ever Heard of Human Drivers Behind The Wheel of The Car? Some of Us Don't Need Hazardous Condition For A Life and Death Situation. Well Has Anyone Heard of ROAD RAGE? Are There Humans That Just Love To Tailgate Other Humans? Are There Humans That If You Kiss Them Off In Traffic They will violate all kinds of laws chasing someone down the road ways. Has anyone ever heard of human drivers drunk behind the wheel of a car?It Amazes me that humans are SUCH A HUGE RISK WHEN IT COMES TO DRIVING AND WE'VE GOT THE NEVER TO SAY I WONDER IF SELF DRIVING CARS WILL PROVE TO BE BETTER DRIVERS-WELL COMPARED TO MANY OF US THEY SURE AS HECK CAN'T BE ANY WORSE!
youtube · AI Harm Incident · 2017-03-07T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UggPlXqhTyqn-HgCoAEC","responsibility":"user","reasoning":"contractualist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgijP7n1AYDAFHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgjSelYS_yNxMXgCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UghtfnAXloUXangCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ughv4M1zM_ZhFHgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_UggWc282B73l5ngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugir1uoAgHGQ63gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UghiAb5OOQ50H3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UghKAohdhKOGKHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgjB5UYNyemZAngCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"}
]
```
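A raw response like the one above can be parsed and sanity-checked before the codes are stored. A minimal sketch, assuming the allowed values are those that appear in this page's samples (the project's actual codebook may define additional categories, and `parse_response` is a hypothetical helper):

```python
import json

# Allowed values per dimension, inferred from the sample output above;
# the real codebook may include more categories.
CODEBOOK = {
    "responsibility": {"user", "developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference"},
}

def parse_response(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs in this dataset carry a "ytc_" prefix.
        if not rec.get("id", "").startswith("ytc_"):
            continue
        # Every dimension must be present with an allowed value.
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UghtfnAXloUXangCoAEC","responsibility":"none",'
       '"reasoning":"deontological","policy":"none","emotion":"outrage"}]')
print(parse_response(raw))
```

Filtering at parse time keeps a single malformed record from silently skewing downstream counts of each dimension.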