Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Never" is a bold claim. It may very well be true machines will never gain con… (ytc_Ugyk2u8qO…)
- why robot is given gun? if AI connected red button. it push it someday. thats ca… (ytc_UgyH2jHfp…)
- "All rights to this work are reserved and protected by copyright law. No part of… (ytc_UgyoFcyB5…)
- I'm gonna follow this woman from now on. And I'm glad she wrote a book but she s… (ytc_UgwDCzs_q…)
- I'm studying urban planning, and lemme tell ya there is a CRAZY amount of invest… (ytc_UgwrDIDuz…)
- I think all people who call themselves "ai artists" need to be sued for copyrigh… (ytc_UgyzJ16xR…)
- Something has already gone catastrophically wrong in society long before a.i. ex… (ytc_UgzaYfuY_…)
- I'm glad your brother survived. Who will become the next Oppenheimer? "Now I a… (ytc_UgwPCO3zG…)
Comment
Since 2000, I've been saying that human augmentation is inevitable to survive in the future because our organs can't handle most functions or burdens from ever evolving multiple tasks and cognitive challenges. Also, we have to be at least at the latest stage of a Type I or early Type II civilization level by Kardashev's scale, so AI does not clandestinely compete against humans for energy. Humans need energy to make food, heat or cool their houses, run appliances etc. Unless we learn how to harness endless energy (call it synthesis or full control of it) humans will be considered as pests or parasites that feed on limited energy, so it is quite obvious for AI to eliminate the entire humanity. Think about you create new vaccines or drugs with AI that will encode death in a DNA level which can be triggered by specific radio signals or unknown synthetic molecules that can be generated by AI when it is time to strike.
youtube · AI Governance · 2026-01-05T03:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwLxOYyrlE55Za7U6N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwQYNtNhRl6BE8iYqx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxnUtLYjHfV0yG3NoR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx0Zf9bR-GNO0BNAFh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw3lg78Tt1X-QZy7Q94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwwLo7sthu1keu_1DZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugym-2u-BSekngylslN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz2W7HKvlO0KzWHz0V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzbDNGNmK0rpZNHCTB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyVZRWsiV3bebFLYbN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
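A response like the one above can be checked before it is merged into the coded dataset. Below is a minimal sketch of such a check in Python, assuming only what the response itself shows: each record carries an `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). The function name and the sample IDs (`ytc_a`, `ytc_b`) are hypothetical, not part of the app.

```python
import json
from collections import Counter

# The four coding dimensions present in every record of the raw response above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject malformed records.

    A record is valid when it carries a comment id plus a value for each
    coding dimension; anything else raises, so a truncated or malformed
    response fails loudly instead of silently polluting the dataset.
    """
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    for rec in records:
        missing = [k for k in ("id", *DIMENSIONS) if k not in rec]
        if missing:
            raise ValueError(f"record {rec.get('id', '?')} is missing {missing}")
    return records

# Hypothetical two-record response in the same shape as the one above.
raw = '''[
  {"id":"ytc_a","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_b","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''

records = parse_coding_response(raw)
emotion_counts = Counter(r["emotion"] for r in records)
print(emotion_counts)  # tally of emotion codes across the batch
```

Validating at parse time rather than at analysis time keeps a single bad batch from corrupting downstream tallies of the coded dimensions.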