Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- There is a good chance of intelligence explosion scenario in near future where A… (ytc_UgzUovPNs…)
- They should make it so one of the cars sends a signal to the other that it has p… (ytc_UgxkwgWRS…)
- Are we hoping for the best or seeing the best future and not bothering to ask i… (ytc_Ugy75zseY…)
- But how will they destroy humans. Like a 22LR could kill a robot easily. So shou… (ytc_UgwViQRKK…)
- When ai threat will become big then world govt will come come in existence to sa… (ytc_UgxpcV3oS…)
- It’s our use of AI wouldn’t be able to be turned off or pulled back. That’s the … (ytc_UgwdcIRfA…)
- I’ve noticed that the only people complaining about not being able to use AI, ar… (ytc_Ugz-YIvTz…)
- I think my opinion is really based on 2 things: knowing that something is AI art… (ytc_UgxfO3oM7…)
Comment
One thing i'm really confused by is this: How the fuck is it profitable? For instance, Sora 2. It is absurdly realistic, granted you can still tell it's AI with it's ugly and soulless, uncanny valley-type look, but it's realistic nonetheless. But it's gotta be expensive to run, right? Obviously, the more detailed and accurate you want it to be, you gotta pump more resources into it, so how much is it taking? Shit's gotta be taking a whole ass tree with it each time it generates a video with how much i'd imagine would be needed to run it.
Or is it just investors and other tech companies dumping money into each other, trying to snatch the next big thing by dumping as much money into it as possible and hoping it takes off, in hopes it'll be profitable in the long term? I don't know.
youtube · AI Responsibility · 2025-10-11T00:3… · ♥ 31
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugy8a5tpa2GW2PlIKN14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz_JjQ7UtECLddJ30d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwrSnWHJN_G48R1iah4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxEDEHrFe37ykA8Mot4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxtL8QSsp2C8Ke2tJZ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxWN9umtZu66zgf4aJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxgicct-FUxyhw5Q6Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"mixed"},
{"id":"ytc_UgzQFA6eha0XtysHwJx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzTefof0WbT_0akVCJ4AaABAg","responsibility":"unclear","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyiooXALIPhlz1Nqd94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
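The raw response above is a JSON array keyed by comment ID, which is what makes a "Look up by comment ID" view possible. A minimal sketch of parsing it into an ID-indexed lookup table, using two of the records shown above (the `index_by_id` helper and `DIMENSIONS` tuple are illustrative, not part of the tool; the field names come from the JSON itself):

```python
import json

# Two records copied from the raw LLM response above.
raw = '''[
  {"id":"ytc_Ugy8a5tpa2GW2PlIKN14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz_JjQ7UtECLddJ30d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]'''

# The four coded dimensions seen in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_json: str) -> dict:
    """Parse the model's JSON array and key each record by comment ID,
    defaulting any missing dimension to "unclear"."""
    records = json.loads(raw_json)
    return {
        r["id"]: {d: r.get(d, "unclear") for d in DIMENSIONS}
        for r in records
    }

codes = index_by_id(raw)
print(codes["ytc_Ugy8a5tpa2GW2PlIKN14AaABAg"]["emotion"])  # → approval
```

If the model ever returns malformed JSON, `json.loads` raises `json.JSONDecodeError`, which is a natural place to flag a sample for manual inspection.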