Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
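For scripted access outside the viewer, a minimal lookup sketch along these lines works. Everything project-specific here is an assumption: the SQLite file and the `codings` table with its columns are hypothetical names standing in for wherever the coded batches are actually stored.

```python
import json
import sqlite3

def lookup_raw_response(db_path: str, comment_id: str) -> dict | None:
    """Return the coded dimensions for one comment ID, or None if not found.

    Assumes a hypothetical table codings(comment_id TEXT PRIMARY KEY,
    raw_response TEXT), where raw_response is the model's JSON batch
    that included this comment.
    """
    con = sqlite3.connect(db_path)
    try:
        row = con.execute(
            "SELECT raw_response FROM codings WHERE comment_id = ?",
            (comment_id,),
        ).fetchone()
    finally:
        con.close()
    if row is None:
        return None
    # The stored response covers a whole batch; pick out the entry
    # whose id matches the requested comment.
    for entry in json.loads(row[0]):
        if entry.get("id") == comment_id:
            return entry
    return None
```

For example, `lookup_raw_response("codings.db", "ytc_Ugxs2Mpoe1-hTX7VBnZ4AaABAg")` would return the `ban`/`fear` entry shown in the Coding Result below.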
Random samples — click to inspect
- Why are we still developing ai? Let it rot and focus on the fact we should save … (ytc_Ugx5Xel-0…)
- Punishment versus reward. Is it just me or could anybody else come up with a wor… (ytc_UgwEDxUtK…)
- Now BJP Politicians would get caught red handed committing crimes and they will … (ytc_UgyujMS0p…)
- Asked AI how to get good at Bedwars. Was told to "Just lock in fr". Turns out m… (ytc_UgxdQg6ZH…)
- The global south doesn't really care because the idea of grains from far away fe… (rdc_jxzfz6j)
- I don't understand why the problem even started. Photomanipulation exists for ma… (ytc_Ugwj6mjx6…)
- In order to force an AI to reason better, you must I interrogate it deeply. Find… (ytc_UgzDu9Vmt…)
- The only way I am taking a subway is with an armed guard. Thank you New York. Th… (ytc_UgwfASdYV…)
Comment

> Interesting. Using the Echo story; it's close to what will happen. The computer system will be called "The Beast"a either by name or alias. Humanity will be reduced to roughly 1 percent of the present population. But there was one thing that this AI generated story didn't think of. No power, no AI. A EMP bust will remove AI from all computers no matter what it is on. Thus it dies as well. And in generic terms; that's what is going to happen. When will it end? The fall of 2036 as it's in the prophecies foretold about 2500 years ago.

Source: youtube
Topic: AI Governance
Posted: 2023-07-07T03:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
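Each Coding Result is one entry of the raw batch response shown below, matched on the comment ID. A minimal sketch of the record shape, using the JSON field names and only the values that appear in this batch (the full codebook is assumed to allow at least these):

```python
from dataclasses import dataclass

@dataclass
class Coding:
    """One coded comment; field names mirror the raw JSON keys."""
    id: str              # comment ID, e.g. a "ytc_…" or "rdc_…" value
    responsibility: str  # ai_itself | developer | distributed | none | unclear
    reasoning: str       # consequentialist | deontological | virtue | unclear
    policy: str          # ban | industry_self | none | unclear
    emotion: str         # fear | resignation | indifference | mixed | unclear
```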
Raw LLM Response
```json
[
  {"id":"ytc_UgyfzEewqlrE84qyTxZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwXB5VVf7ROi28aUb14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxNWL5W0Em00oPyLZl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgxFz0rxmpTChL_2jlZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzqSclAT4fTVzp8Cy14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzzpWXR2yGw2lKTsiV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgxysfuBUayD0vsCjXZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxs2Mpoe1-hTX7VBnZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxMnNHJ2YjEdixTWFp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxO9MIUMZrCdhe9aD54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"}
]
```
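Model output is not guaranteed to stay on-scheme, so it is worth validating each batch before trusting it. A sketch, again assuming the value sets observed above rather than an authoritative codebook:

```python
import json

# Value sets observed in the batch above; treat the project's real
# codebook, if one exists, as the authoritative list instead.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "industry_self", "none", "unclear"},
    "emotion": {"fear", "resignation", "indifference", "mixed", "unclear"},
}

def validate_batch(raw: str) -> list[str]:
    """Return a list of problems found in one raw LLM response string."""
    try:
        entries = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"not valid JSON: {exc}"]
    problems = []
    for i, entry in enumerate(entries):
        if not isinstance(entry, dict):
            problems.append(f"entry {i} is not an object")
            continue
        for dim, allowed in ALLOWED.items():
            value = entry.get(dim)
            if value not in allowed:
                problems.append(f"entry {i} ({entry.get('id')}): bad {dim}={value!r}")
    return problems
```

An empty return value means every entry parsed and used only known codes; anything else flags the batch for manual review or re-coding.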