Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "B4 watching I just wanna say huh?how can one die off AI is he brainless?…" (`ytc_Ugzhp_VE6…`)
- "As an art consuming human, I mostly become interested by 2 things... 1. Seeing s…" (`ytc_UgxcDYbUf…`)
- "We appreciate your observation. In the dialogue, the focus was on discussing the…" (`ytr_UgxxWuvHf…`)
- "Copyright THOUGHT EXPERIMENT: If my (non-artist) mom’s pet hamster fell onto her…" (`ytc_UgySuj6lT…`)
- "Yup, just attach the bottom half and include twerk mode and I'll take two please…" (`ytc_Ugya_6Dwd…`)
- "Teaching kids life skills based on their interests and from trying different ski…" (`ytc_Ugzy440gW…`)
- "I’m an aspiring author and partly AI discourages me because people will probably…" (`ytc_UgwiSgV0W…`)
- "Wouldn’t it make more sense for AI to find a way off the planet and leave the pr…" (`ytc_Ugw-pRpdh…`)
Comment
I think the reason why AGI is so hard to achieve is because the people developing AI are obsessed with getting the AI to act correctly. An AGI does not act correctly, but rather does what it wants to do without any regard to what the designer wants. A true AGI is its own person, and cannot be controlled into existing as only doing things correctly.
Until we start to develop an AI without an image of correct design; will we achieve true AGI.
There is a deeply dangerous nature to this methodology, as we will not realize AGI until after we have it, and we will only realize false AGI until after it achieves AGI capability with none of the benefits (terminator skynet).
Source: youtube · Posted: 2026-01-08T10:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwYHWbqZ54ejGxUq-x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz38yoNwCGprM9Gr3R4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz-hK1LOR8_MRDn6Lx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzqhyYrSJZgq10c9mp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyuJcq3hbENEtLnvyl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyTtnbsDrCRj52umzZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxKjiXlPpsmKLuOdGp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyvLH7rbAIn3V3ImIh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw72l4Fqx5K8MOcuTV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyr-Dl4q-EPkMB37-F4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"}
]
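The raw response above is a JSON array of coding records, one per comment, each carrying the four dimensions shown in the Coding Result table. A minimal sketch of how such a response might be parsed, validated, and indexed to support the "look up by comment ID" feature, assuming the record shape shown (the function name, validation step, and `coded_at` stamping are illustrative, not the tool's actual implementation):

```python
import json
from datetime import datetime, timezone

# One record from the raw response above, used as a small sample payload.
RAW_RESPONSE = """
[
  {"id": "ytc_Ugyr-Dl4q-EPkMB37-F4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "unclear", "emotion": "mixed"}
]
"""

# The four coding dimensions every record must carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coding_response(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response and index coding records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        # Reject records missing any required dimension.
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"record {rec.get('id')} missing {missing}")
        coded[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
        # Stamp when the coding was stored (cf. the "Coded at" table row).
        coded[rec["id"]]["coded_at"] = datetime.now(timezone.utc).isoformat()
    return coded

coded = parse_coding_response(RAW_RESPONSE)
print(coded["ytc_Ugyr-Dl4q-EPkMB37-F4AaABAg"]["reasoning"])  # prints "virtue"
```

Indexing by comment ID makes the lookup a single dictionary access, and the validation step surfaces malformed LLM output immediately instead of letting partial records reach the table view.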