Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytr_UgxryaTSO…`: "@thedebtcollecter5006 the stealing part i sorta get, but everything gets stolen …"
- `ytc_UgxKW8sq2…`: "I'm not defending AI but technically AI isn't stealing it's using reference, hum…"
- `ytc_Ugyc6RGHW…`: "Sora is the first stage of an AGI since it can replicate physics without ever le…"
- `ytc_UgwTihrgP…`: "You can just call me arthur maxson, because id rather destroy the ai than use it…"
- `ytc_Ugy9ew-kF…`: "So for AI poisoning, you have to create something to make it? Like an apothecary…"
- `ytc_Ugwkh4UHN…`: "human beings have this amazing ability to adapt, best of luck to you ! maybe us…"
- `ytc_Ugyd_b5PV…`: "For some reason nightshade doesn't work on my computer, and it makes me fearful …"
- `ytc_UgxpkNBLZ…`: "Well to me it seems that AI can “learn” all dimensions and basically become a go…"
Comment

> This is an instruction to think of the most scary scenario they can think of, not the most plausible scenario - which I think even AI can't predict. It really isn't helpful that even looking up reassurance there's more fearmongering but logically people have a history of freaking the f out over new technology and in 2008 they were convinced the large hadron collider spelt the apocalypse.

| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Harm Incident |
| Posted at | 2024-12-15T04:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgycGiuqvc5c7ql1ojZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy1KmVh96QGyNV1Npt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz5GoUB1m9lepg_ryN4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw5yyoNzu5Fv1P8yLx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzSB8SzsH3B1Zpgyex4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwMCVXHp3bWQyy1OVN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx8KTFufcIdEej7C3J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzB3uaWkkykAy3f3I54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyFZ9_3LgK_0MBZ4GF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyz2ECSYJDeFaowWWF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
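The raw response above is a JSON array with one record per coded comment, each carrying the four coding dimensions shown in the result table. A minimal sketch of how such a response could be parsed and sanity-checked before accepting the codes — note that the allowed value sets and the `validate_response` helper are inferred from the sample records above, not a confirmed codebook, so the real pipeline may accept additional categories:

```python
import json

# Allowed values per coding dimension, inferred from the sample responses
# above (assumption: the actual codebook may define more categories).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "government", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "mixed", "resignation"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs in the samples start with "ytc_" or "ytr_".
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Every dimension must be present and hold a known value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# One record copied from the raw response above.
raw = ('[{"id":"ytc_Ugyz2ECSYJDeFaowWWF4AaABAg","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
print(len(validate_response(raw)))  # → 1
```

Records failing the check would be flagged for recoding rather than silently dropped in a production pipeline; the filter here only illustrates the validation step.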