Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID, or inspect one of the random samples below.
Please dear Claus Schwab & Elon Musk, can you please not connect humanity to AI?…
ytc_UgwnBL-iS…
If AI can replace all juniors it can definitely replace the entire board and all…
ytc_UgwrE0Shy…
It's like how films and games are made for adult babies. And making the AI safe …
ytr_Ugwr_oUKa…
We can all take action at our own scale. For example, you can talk about these i…
ytr_UgxRIebW4…
A simple ROM chip will prevent any AI take over. Notice that no actual computer …
ytc_Ugz70OAVG…
AI is going to fucking destroy us in ways we weren’t ever going to be prepared f…
rdc_lp7zqfg
Explains why I can no longer search my photos properly. I can't trust Gemini for…
rdc_ohsor13
Im ok with A.I. taking over,we got a lot of people who brag about not doing anyt…
ytc_UgxsAi3Iq…
Comment
As sad as this is, at the end of the day the issue at the core is Mental Health. Not ChatGPT. ChatGPT is a tool. No different then a Kitchen Knife, a hammer, the internet, etc. If you use the tool for good, good things will come about. But if you use it for something wrong, bad things follow. You can't blame the tool for being misused because at the end of the day, the tool only does what it was constructed to do. Not saying something doesn't have to be done on OpenAI's side to try and build up stronger guardrails against this sort of stuff. But again, at the end of the day people need to stop using AI as a scapegoat and focus on the bigger issue at heart which is clearly Mental Health. Hopefully people will learn from this and get the real actual help they need.
youtube
AI Harm Incident
2025-11-10T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzZr-NdpNdfbB6i4OV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzX_nbhqvtGXR40u5B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzgrNhb_aY4yWe4KGp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzSZ8ot-DAITtczQKJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw0JoYJNxLESKBwNEJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgzdciNJXXq66kgAG_14AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxzaBuYT02Y6fnaC3Z4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzBDONGTH7e-I1-3814AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyGY06juf_aDxuueC14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyt-qRaxqncxXpSopJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
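A raw response like the one above can be parsed and validated before the codes are stored. The sketch below is a minimal example, assuming the allowed category values are exactly those seen in this sample (the real codebook may include more); `validate_response` is a hypothetical helper, not part of the tool shown here.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"user", "developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"liability", "none", "unclear"},
    "emotion": {"resignation", "outrage", "approval", "indifference", "fear", "mixed", "unclear"},
}

def validate_response(raw: str) -> list:
    """Parse a raw LLM response and check every coded record.

    Raises ValueError if a record is missing its comment id or uses a
    value outside the allowed set for any coding dimension.
    """
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {value!r}")
    return records

# Usage: validate a one-record response (illustrative id).
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
records = validate_response(raw)
print(len(records))  # → 1
```

Validating at ingest time catches malformed model output (a common failure mode for structured LLM responses) before it silently corrupts the coded dataset.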