Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Fucking terrified of ai of what could happen this just proof that it's horrible …" (ytc_UgwqBRa8U…)
- "@jeffreysoreff9588 Data centers require highly sophisticated technology, which …" (ytr_UgwEKDIYJ…)
- "They will steal your water rights. A constitutional condition. THEY WILL PASS TH…" (ytc_Ugx419oiv…)
- "We need to figure out a way to teach AI how to love. That’s what will make us sa…" (ytc_UgypLcJWN…)
- "California doesn't have any driverless vehicles. These Austin tests are the fir…" (ytr_UgwVh-rpg…)
- "These guys need to watch terminator, I robot, and read the manga Pluto to realis…" (ytc_UgxXWN_v4…)
- "Cursor started to call me bitch with tabs. if i type my name it prompts me to "n…" (ytc_UgyrFpnat…)
- "So why are you tightening Huawei??? Why can't they use Huawei AI chip??? Go FO !…" (ytc_Ugz4Tys2T…)
Comment
The worst thing about Sam Altman is that he knows exactly how dangerous his AI could be, but then he thinks to himself, "but if i do more reasonable and safe AI research we won't make our investors happy, then someone else will make all the money, BuT I WaNt To MaKe AlL tHe MoNeY's!"
youtube · AI Moral Status · 2025-12-11T02:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[{"id":"ytc_UgyHEL9aXmkwse6sd014AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzmoqPtnSiECpc-lAF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzChfszO4tCIHgnIpt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxOX6fRm2A7EkdaKEt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugwo0XSsUlR2C8Qgk8x4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz07uwBM8eB8-Eexdt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzyLBd0bJVGnjLJGh54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugz2-j2Dnfmii6r1WgB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwkIo45-OPvrWRp9xt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxw91ytLwB2WDdvxGt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}]
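The batch format above lends itself to a small lookup helper. A minimal sketch, assuming the model returns a JSON array of per-comment objects keyed by `id` with the four coding dimensions (all names below are illustrative, not part of the tool): an ID that is missing from the batch, or a response that fails to parse, falls back to "unclear" on every dimension, which is one possible reason a Coding Result table like the one above can show "unclear" throughout.

```python
import json

# Illustrative raw batch response in the assumed format: a JSON array of
# objects with "id" plus the four coding dimensions. IDs are made up.
RAW_RESPONSE = """[
{"id": "ytc_AAA", "responsibility": "developer", "reasoning": "virtue",
 "policy": "liability", "emotion": "outrage"},
{"id": "ytc_BBB", "responsibility": "none", "reasoning": "unclear",
 "policy": "unclear", "emotion": "mixed"}
]"""

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def codes_for(raw: str, comment_id: str) -> dict:
    """Return the coded dimensions for one comment, or all-"unclear"
    when the ID is absent from the batch or the output is malformed."""
    try:
        rows = json.loads(raw)
    except json.JSONDecodeError:
        # Malformed model output (e.g. a stray ')' where ']' belongs)
        # degrades gracefully to "unclear" on every dimension.
        return {d: "unclear" for d in DIMENSIONS}
    for row in rows:
        if row.get("id") == comment_id:
            return {d: row.get(d, "unclear") for d in DIMENSIONS}
    return {d: "unclear" for d in DIMENSIONS}

print(codes_for(RAW_RESPONSE, "ytc_AAA"))
# → {'responsibility': 'developer', 'reasoning': 'virtue', 'policy': 'liability', 'emotion': 'outrage'}
```

The fallback-to-"unclear" behavior mirrors how the coding table is displayed when a comment cannot be matched to the raw response; whether the real pipeline works this way is an assumption here.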