Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> I've been trying to explain this very point to a lot of my friends. We don't understand consciousness. Without understanding consciousness, there can be no true AGI because it won't be self-aware and it will have to be externally motivated. I.e. It's just a machine, a very "intelligent" machine, but still a machine. People try to anthropomorphize these machines, but AI is in a blank state and depends on a conscious mind to give it a goal. AI won't do much if anything without human motivation pushing it externally to give it a goal. The problem will be the unintentional nudging from stupid humans.

| Field | Value |
|---|---|
| Platform | youtube |
| Title | AI Moral Status |
| Posted | 2025-05-22T19:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgxaJbaZFJ-Iom2CUM14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzRFIJ_4T8Jj4898KJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyYAltCmknEOXgbt094AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugy7sebhEuUKswxmmN14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyXWkJ5pGcH9yudETd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzaQz4MtajLZYLgCA94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgypnKTKdwQvdGhVmhp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwg2V4zU0tbP73O5u14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyPbzA2__hQz0tv2vB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyNg6SzsinKGY7EgsB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"}
]
```
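A raw batch response like the one above can be parsed and indexed by comment ID before it reaches the Coding Result view. The sketch below is a minimal example, not the pipeline's actual code: the `ALLOWED` vocabularies are inferred solely from the labels visible in this sample (the real codebook may define more categories), and `parse_coding_response` is a hypothetical helper name.

```python
import json

# Allowed labels per coding dimension — assumed from the sample output above;
# the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "company", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "approval"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM batch response and index coded rows by comment ID.

    Raises ValueError on an out-of-vocabulary label, and propagates
    json.JSONDecodeError if the model emitted malformed JSON.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value for {dim!r}: {row.get(dim)!r}")
        # Keep only the coding dimensions, keyed by comment ID for lookup.
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

Indexing by ID makes the per-comment lookup shown in the Coding Result table a single dictionary access, and rejecting out-of-vocabulary labels catches the common failure mode where the model invents a category not in the codebook.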