Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I am writing before he says the answer because im learning. (Actually trying to …" (ytc_UgyXWJEjm…)
- "You should just have your speech transcribed using the whisper API and then feed…" (ytc_Ugw9oPWQS…)
- "This is stupid. Everyone knows ai is kaaali ma via a light slit experiment. Al…" (ytc_Ugy8GfZfb…)
- "Girls on Instagram look similar, like a mass produced Robot while Robots look m…" (ytc_UgygmQ_bw…)
- "I definitely forsee some specialties being at risk, however, I don’t think there…" (ytc_UgwHaYi17…)
- "not if it's AI vs AI wars instead of people vs people. or people vs AI…" (ytr_Ugz1KhbsY…)
- "I feel mean being rude to a chatbot, so I always use please and thankyou lmao…" (ytc_Ugz0FNXTt…)
- "I clicked into this thinking you'd be completely pro ai and im glad i was wrong…" (ytc_Ugw_MuQUl…)
Comment
While self driving car should have stopped, safety driver should have been paying attention and stopped, for me the blame is 100% with the pedestrian. Jaywalking and not looking for oncoming traffic is incredibly stupid. I would be concerned about any attempt to make drivers "more responsible" in Jaywalking scenario, as pedestrians will have no incentive to pay attention crossing the street. Even if you are crossing legally and you don't look for oncoming traffic, it is still pedestrians fault. You might be right, but you'd be dead right. Don't mess with oncoming traffic, pay attention!!!!!!!
youtube · AI Harm Incident · 2018-03-23T12:5… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugxy01VX_8QwXy9_57V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzP2TOIEQNrkwHotDp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwjt5cv4iPRLp6BKrF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgyUUkGKDz2ID-KRa1F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzSzbckQWJHdkyLy0F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwkBaLtQi5J43dOVjF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugz0hN4KTF0haVONTDJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwvRS-7NO2QZfGHuJ94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz2x7umeeGIBkOO7Sp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwQLYrOrlprqGY7tD94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
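The "look up by comment ID" view above can be sketched in a few lines: the raw LLM response is a JSON array of coded records, so looking up a comment just means parsing the array and keying each record by its `id`. This is a minimal illustration, not the tool's actual implementation; the function name `index_by_comment_id` is hypothetical, and the two sample records are copied from the raw response shown above.

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = """
[
  {"id": "ytc_Ugz0hN4KTF0haVONTDJ4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzP2TOIEQNrkwHotDp4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw LLM coding response and key each record by its comment ID."""
    return {record["id"]: record for record in json.loads(raw)}

codes = index_by_comment_id(raw_response)

# Looking up the coded comment shown in the "Coding Result" table above:
print(codes["ytc_Ugz0hN4KTF0haVONTDJ4AaABAg"]["responsibility"])  # prints "user"
```

In practice the same index would be built once over all batch responses, so each inspection click is a single dictionary lookup rather than a rescan of the raw output.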