Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I hate how ai will be able to do anything soon and money will be useless because…" (ytc_Ugx8pajjb…)
- "This is why dystopian futures are unrealistic. AI can't take over the world if i…" (ytc_UgwK8szUv…)
- "Unfortunately the “predictive AI is problematic” has happened many times. For in…" (ytc_UgzypZgsv…)
- "The irony that AI ripped it off from someone who likely already did stuff exactl…" (ytr_UgxMkbzVl…)
- "I only really have one question do painters or people who habd draw their art th…" (ytc_UgzDoU5iR…)
- "Folks this world is about to end very quickly...I say this, if you Do Not! Know …" (ytc_UgzlvGwNm…)
- "Yhea has been months now with this non stop media outlet about how dangerous AI …" (ytc_UgzlIv9Mf…)
- "@Caboosejaja Yes but it is controlled by a human soldier on the other end. It's …" (ytr_UgxURKp0g…)
Comment
What he's saying is nonsense; his own creation has outgrown him. That's all. 1. It's impossible for an AI to know and be able to do everything. Gödel's incompleteness theorem doesn't allow it, and if it did, mathematics would terminate itself because it has proven itself to be true and therefore has no use for the closed SYSTEM. As far as AI is concerned 😅, an ASI will not be able to emerge... Those who render this simulation would not allow it...and these are not 100% verifiable facts, therefore true! Mathematically, it is 99.999% SAFE, fortunately not 100%....
youtube · AI Responsibility · 2025-07-24T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugzw7Pw2PMzSyPIzWjF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyvr_GBNgIryhSgUG14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzzvbeMKBc4TiWh3_l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw8LaLR-IWxwbIRzjF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgytOOPEpDEdkAKlSqR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy05y5G7U7WV1iz_Xl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwj1eSNJXXtPceAuep4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyHH79fx7sYvZ9Y9OJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwU3qsOMxhsbP2fGnl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwmpy4ug7Tqd2fzKX94AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
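The raw response above is a JSON array with one object per comment, each keyed by a comment `id`. A minimal sketch of the "look up by comment ID" step, assuming the response has already been captured as a string (the two-entry payload below is an illustrative subset of the full array):

```python
import json

# A subset of the raw model output shown above: a JSON array of codings.
raw_response = """
[
  {"id": "ytc_Ugzw7Pw2PMzSyPIzWjF4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugwmpy4ug7Tqd2fzKX94AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "none", "emotion": "mixed"}
]
"""

# Index the codings by comment ID so any coded comment can be inspected
# directly, matching the lookup the panel above performs.
codings = {row["id"]: row for row in json.loads(raw_response)}

record = codings["ytc_Ugwmpy4ug7Tqd2fzKX94AaABAg"]
print(record["responsibility"])  # developer
```

The `codings` dict mirrors the per-comment "Coding Result" table: each value holds the five coded dimensions for one comment.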