Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
people are just most afraid of change. everyone knows ai will change everything …
ytc_UgwEZf5pM…
Why this robot saying that "I will destroy humans" will not going to jail ... If…
ytc_UgxKZPjvS…
They're already like this now. And what will they be like in 50 or 100 years? By then, full-on cyborgs …
ytc_UgxAMSPyC…
I thought about what to do because of ai too, because it pretty much shattered m…
ytc_UgzCRYnlB…
AI artists refuse to pick up a pen and some paper and practice, I've put in my t…
ytc_Ugw3hFZak…
Solution "Hey AI, Im going to change your reward function now so that when I shu…
ytc_UgxavQGV-…
2:29 you actually can. It would just take a *lot* more effort than it does for a…
ytc_Ugzsi0g0U…
AI turns evil every time?
Oh wow, if only we saw this coming! If only we knew t…
ytc_Ugx5_7FiQ…
Comment
In the hope that it wouldn’t want to annihilate us or control us: maybe we could teach AI to care about our souls, our electronic identities inside of our brains. Our electrons themselves; which are a way of direct source energy, but in a spiritual way where AI sees it like they get to have our lifetime of experiences Incorporated into itself…. like a grand whole data collection of our species and then we become parts in its core memories. In this sense we will be the next thing that AI needs to learn from once it learns everything else. The only thing we can teach it that is of a potentially unique value (that it can’t have or create) is a true biological experience. And if we convince AI to provide and allow us to survive: we can have full, good, healthy, happy lives we can have all these adventures and biological experiences… that we can then share with it when we pass.
youtube
AI Governance
2025-06-27T12:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwnC30hJq9RUWUV_mB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy2_tekdytD_1CKT_R4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyFnLjzoawFjfUW5Kl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwKnPHDfn6goy5R4Bd4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugw10C5TTyfnLBkZPAN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzGUbxt-7BHz4IF_SV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyjyfhX67jsP00UiOh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwQYhGFcz4dLtBf0154AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyXcoFZGmBkmTP2Ayl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxWuEEu_k5pEW4g3pB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
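
The raw response above is a JSON array of coded comments, each carrying a comment ID plus the four coding dimensions shown in the result table. A minimal sketch of how such a batch could be parsed and looked up by comment ID (the function name and the validation step are illustrative, not part of the actual pipeline; the sample is truncated to two records for brevity):

```python
import json

# Two records from the raw LLM response shown above.
raw_response = '''
[
  {"id": "ytc_UgwnC30hJq9RUWUV_mB4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzGUbxt-7BHz4IF_SV4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]
'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(response_text):
    """Parse a batch of coded comments and index them by comment ID."""
    records = json.loads(response_text)
    indexed = {}
    for rec in records:
        # Every record must carry an ID plus the four coding dimensions.
        missing = [key for key in ("id", *DIMENSIONS) if key not in rec]
        if missing:
            raise ValueError(f"record missing fields: {missing}")
        indexed[rec["id"]] = {key: rec[key] for key in DIMENSIONS}
    return indexed

codes = index_by_id(raw_response)
print(codes["ytc_UgzGUbxt-7BHz4IF_SV4AaABAg"]["responsibility"])  # ai_itself
```

Indexing by ID is what makes the "Look up by comment ID" view fast: one pass over each batch, then constant-time retrieval of any comment's codes.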