Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "AI, crypto, fully self-driving cars... it's all the same BS, and these CEOs are …" (ytc_UgwuuY1lh…)
- "Honestly, what I would do is draw like the AI. I'm talking about disformed limbs…" (ytc_Ugy9uq0e3…)
- "the ai takes stats, if it sees that 10 percent of the population account for hal…" (ytr_Ugyg1godq…)
- "AI is so good that we do not know how to use it and training is going to take ti…" (ytc_Ugzec4O1p…)
- "Show me how! I am a songwriter and can't sing to save my life. AI helps me to cr…" (ytc_Ugz9WYCUQ…)
- "https://preview.redd.it/1xvjq7v6pf4e1.png?width=1080&format=pjpg&auto=we…" (rdc_m012tj8)
- "I think the problem the people that would want what you're saying is going to ha…" (ytc_UgxLxJHSQ…)
- "@AImysterykey I think that humans coded the transformer architecture, but they u…" (ytr_Ugz2_pETF…)
Comment
I just asked ChatGPT before I started watching this video and it brought up this exact case. I think ai expects humans to use common sense. It doesn’t realize that 90% of people have no common sense. People also have to realize that AI is still in the baby stages and not all the bugs have been worked out. Even ChatGPT says, I may make things up sometimes so please check the sources before you take that information and go with it. Using AI as an excuse for reckless decisions is like saying a saw is at fault for cutting your finger. The tool does its job; it’s your responsibility to use it safely.
youtube · AI Harm Incident · 2025-11-25T03:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxTeny3EAhL4-69D954AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxRPRLmfZO_DJksLPl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugzmes3AZZv6n_NUYFR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw-KR-4-OIGeV-dDUR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyjZnuld8S_xXR-etB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz7cB56ptRWURJBqrx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyq7W4JXzJ4KAC7SEx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyJqws383FEnHMDK1t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw7MO9TrcWw-OEqjwh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxy0-eJOTgNe8xZrqR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
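The raw response above is a plain JSON array, one object per coded comment, with the four coding dimensions as string fields. A minimal sketch of how such output could be parsed and sanity-checked is below; the `ALLOWED` sets contain only the values observed in this page's samples (the full codebook may define more), and `validate` is a hypothetical helper, not part of the tool shown here.

```python
import json

# Excerpt of a raw LLM response: a JSON array of coded comments.
raw_response = '''
[
  {"id": "ytc_UgxTeny3EAhL4-69D954AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyJqws383FEnHMDK1t4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
'''

# Values observed in the samples on this page; assumed, not the full codebook.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "mixed", "approval", "outrage", "fear", "resignation"},
}

def validate(codings):
    """Map each comment id to its list of (dimension, value) pairs outside ALLOWED."""
    errors = {}
    for row in codings:
        bad = [(dim, row.get(dim)) for dim in ALLOWED if row.get(dim) not in ALLOWED[dim]]
        if bad:
            errors[row["id"]] = bad
    return errors

codings = json.loads(raw_response)
print(validate(codings))  # empty dict: every coded value is in the observed set
```

A check like this catches the common failure mode of LLM coders drifting off the codebook (e.g. emitting `"robot"` for responsibility), so malformed rows can be flagged before they reach the results table.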