Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples — click to inspect

- "If we try to create an artificial intelligence and it is not completely alien to…" (ytc_UgxBeTJFN…)
- "Their argument is "BoTh ArE dIgItAl" when ai takes 0 effort and has no life behi…" (ytc_Ugwy652T3…)
- "LOL! You are confusing fiction for reality. Sad. And it only takes two minutes t…" (ytr_UgzEvChNe…)
- "@soukainalaoui The vanishment is pales, pales never exists in the past, the pres…" (ytr_Ugw6UraE7…)
- "Yeah how did they program that? Most automated voice prompts sound so robotic, w…" (ytr_UgwAR-x48…)
- "Yea there's tons of these kinds of people. Usually their beliefs boil down to: …" (ytc_Ugym9Owy-…)
- "This is the shit that freaks me out. :( I don't worry about AI slop or deepfakes…" (ytc_UgzU1_5ft…)
- "who cares.....they dont give a shit about me why should i give a shit about them…" (rdc_czm7z2a)
Comment (youtube · AI Governance · 2023-07-07T11:5…)

> Alex Jones explained on Rogan 7 years ago that these people producing AI are doing it because of off world signals they’re getting that are giving them the information. It’s faulty info though and is being indirectly used to create a hive mind, god computer with future, past, and present prediction powers to control free will. To usher in the new world order under this AI God. Every day he seems more and more spot on. Whatever they’re learning, whoever they’re getting it from, they do NOT have our best interests at heart.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw4Z6MGm6XltSY-7hd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz3LrSDv1jJol-oMON4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwvW6bCajxg6y3xNxB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugz1TC1hkXnRSB35mHZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxyHIRMErY1OPwDkvB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz4XaDGeT6trJLnIxR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxRfrfYd9b_Ix9Arnt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyfXpoZkBEORLDPoPN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyVAMC0X0eemsmrcYR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyGbFiwh8TAfUKfNq54AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"}
]
```
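A raw response like the one above can be parsed and indexed by comment ID with a short helper. This is a minimal sketch, assuming the response is a well-formed JSON array of coding records; the field names and the two sample records mirror the output shown here, but the helper itself (`index_by_id`) is illustrative, not part of the tool:

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment.
# Two records copied from the sample output above.
raw = """[
{"id":"ytc_Ugz3LrSDv1jJol-oMON4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwvW6bCajxg6y3xNxB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]"""

def index_by_id(response_text):
    """Parse a raw coding response and return a {comment_id: record} map."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw)
record = codings["ytc_Ugz3LrSDv1jJol-oMON4AaABAg"]
print(record["policy"], record["emotion"])  # → ban fear
```

With the response indexed this way, looking up any coded comment is a dictionary access on its ID, which is how the "look up by comment ID" view can resolve a sample to its coded dimensions.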