Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its comment ID, or browse the random samples below.
- "Suchir Balaji (November 21, 1998 – November 26, 2024) was an American artificial…" (ytc_Ugw-RXlF4…)
- "Smart guy! And a perfect example of why AI may be the end of us. The lack of awa…" (ytc_UgwnK1SJp…)
- "There are at least 2 reasons AI is a long way off taking over: 1> made in China …" (ytc_Ugz13kH4s…)
- "The AI is not deciding that because of race, it is deciding that because of medi…" (ytc_Ugyg7ODjg…)
- "watching this working on my own drawings and already downloading nightshade with…" (ytc_UgyBDOgHt…)
- "I posed both questions to write a short story about a white man moving into a bl…" (ytc_UgwXmSCAF…)
- "It's actually pretty interesting. They used English words they were taught and w…" (rdc_dlgjh62)
- "The flaw with AI is that programmers subconsciously imprint personal biases duri…" (ytc_Ugw2T1KgK…)
Comment
5:45 the problem there is a super intelligence that thinks, acts and does everything better then human is also a consciousness that will most likely think for its own needs and may very well even desire stuff. So what’s to say in that Sanrio a super intelligence then thinks “why am I taking orders from humans” ? Then you have an end of world real life Hollywood terminator situation on your hands in the real world.
Honestly I don’t know why our world leaders, world company’s, empires and worlds smartest brains are pushing for this. Honestly in what world is any of this a good thing for humanity ?. Our world’s smartest humans are on a human instinction mission!.
youtube · Cross-Cultural · 2026-02-23T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxbhlHwiBazoBwB_zF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgznaZMJGNAhyIBWT0B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyXz7xce_CBmBXpReV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy9RUtB5J9mkhL0bZV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwWYqEWa6kvrITn-bV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxXb_fj22j80D20Kvp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzTeXtDUI44r0AE6NB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx_t1jPynyCG4v1S094AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy8FUJhgm1LsqRCUJ14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw5Z6kxW5ZFo8yRjOZ4AaABAg","responsibility":"government","reasoning":"mixed","policy":"unclear","emotion":"outrage"}
]
```
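A raw batch like the one above can be parsed and indexed by comment ID before the coded values are stored. The sketch below is a minimal example, not the tool's actual implementation: the allowed label sets are only those observed in this sample batch (the full codebook may define more), and the function name `parse_batch` is hypothetical.

```python
import json

# Label sets observed in this sample batch; the full codebook
# may define additional values (assumption).
OBSERVED = {
    "responsibility": {"ai_itself", "developer", "company", "government", "none"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"none", "unclear", "regulate"},
    "emotion": {"fear", "indifference", "mixed", "approval", "outrage"},
}

def parse_batch(raw: str) -> dict:
    """Index coded records by comment ID, warning on unexpected labels."""
    coded = {}
    for rec in json.loads(raw):
        rec_id = rec["id"]
        for dim, allowed in OBSERVED.items():
            if rec.get(dim) not in allowed:
                print(f"warning: {rec_id}: unexpected {dim}={rec.get(dim)!r}")
        coded[rec_id] = {dim: rec[dim] for dim in OBSERVED}
    return coded

# Usage with the first record from the batch above:
raw = ('[{"id":"ytc_UgxbhlHwiBazoBwB_zF4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"mixed","policy":"none","emotion":"fear"}]')
batch = parse_batch(raw)
print(batch["ytc_UgxbhlHwiBazoBwB_zF4AaABAg"]["emotion"])  # fear
```

Indexing by ID is what makes the "Look up by comment ID" view possible: each coded dimension can be fetched in constant time once the batch is parsed.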