Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "It's gonna be even crazier, AI will blow our minds! The discoveries are going t…" (ytc_Ugwimhol3…)
- "I heard Michael Moore bring this up in 2009, but not about ai. He said companies…" (ytc_UgyZGGqFv…)
- "Brendan Dell, If you can do a 60 hour work week then your boss needs less of yo…" (ytc_UgyK35R5M…)
- "I use Ai art when I am bored to get SpongeBob and shrek to kiss…" (ytc_Ugx3VhEnG…)
- "People need to understand that if it sometimes doesn't work, they need to try ag…" (rdc_jhac2fq)
- "I was showing my fandom group chat a picture of my character ai chat to show the…" (ytc_UgzO-GXeK…)
- "AI can't do my job and won't be able to for quite some time. Bye bye office dron…" (ytc_UgxuquGcE…)
- "Since you left Mit, we are not even doing machine learning anymore. We’re doing …" (ytc_Ugw9Ofp76…)
Comment
I don't know about Tesla Autopilot, but distracted drivers texting on cell phones sure are killing plenty of us these days. I've been an avid motorcyclist for over 50 years, and there has never been a more dangerous time to be a biker. I can ride for five miles down almost any busy, multi-lane road and will see half a dozen people looking down, weaving all over the road, and poking away at their phones. A friend of my son was killed this very week when a distracted cager ran over her from behind as she slowed for a red light. She was just 21 years old.
youtube · AI Harm Incident · 2022-09-03T17:5… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxGJCM6NDC6WB6HJgJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw1PxPp8WhI8S2BySl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugynlz0ESnBG_vMHShR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgwfBQ1xDz57aewCYzZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxKfHN5op6OBlnVP-l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwtJbM145M1b0Hv7_14AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwaabtYzjBDuK0qNgV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxZWHldVY7KFwd_N6J4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwv1VtlscWr037UtDl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyFcwFhYlDotQiyX-F4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
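A raw response like the one above can be checked programmatically before the codings are stored. The sketch below parses the JSON array and drops any row whose values fall outside the coding schema; the allowed values here are only those observed on this page, so the full schema may contain more categories (an assumption).

```python
import json

# Allowed values per dimension, as observed in this page's samples.
# The real coding schema may include additional categories (assumption).
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"approval", "outrage", "fear", "resignation", "indifference"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose values
    all fall inside the allowed coding schema."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items())
    ]
```

Filtering rather than raising keeps one malformed row from discarding an entire batch; rejected rows can be logged and re-coded separately.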