Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I recommend checking out a sci-fi short that’s available here on yt. The film is…" (ytc_UgwFQh-P2…)
- "I find it funny that they don’t realize that CEO’s and Executives will also repl…" (ytc_UgzSgIdYc…)
- "Is Ai going to build a house. Replace a roof, put oil in your oil tank, cut down…" (ytr_UgyxI3lMz…)
- "Well the Chinese are NOW building these AI Server Farms underwater in the ocean …" (ytc_UgyXzd7Uk…)
- "What happens when you ask AI to create an image in the style of a Musician ? ...…" (ytc_UgwTsuPMp…)
- "Bro talking about ai, I saw a dog clinic with ai dogs with clothes 😭…" (ytc_Ugzx5ZyTg…)
- "No, these language models are not conscious and LLMs literally cannot be conscio…" (ytc_UgyACfGhW…)
- "i want every leader of every country to add a law that gets rid of ai art…" (ytc_Ugx8Y4Xvx…)
Comment
I believe without humans no matter the level of intelligence the ai have acquired, the entire ai system will collapse.
My reasons is because, the components and power source of these systems will require multi dimensional mind and will system.
We talk about mining for minerals needed, refining, transporting, fuelling, maintenance. This will require independent minds, making various independent decision to an ever evolving world system lol.
Ai may know all we know and use old and current patterns to create a more accurate decisions but if lacks the human will and unpredictability it will be subdued by humans no matter what it becomes.
There are already super intelligent creations in place but funny enough man have managed to be an ass through generations.😂
youtube · AI Governance · 2025-09-06T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwlqYlEi-bWg161LHt4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxPI2hTG8tcK9678iR4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugy37ttqbBc06-2TVN94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_Ugyfup_FYQub4KVVD5Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx6x2nxZ4SvA4DQ7st4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzP7IR6IqApgtdULgJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugw5VV_WmU_YddPe8gh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzAZBiEq0yHFEBpbYZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugwuw-SQwPChPjINuL14AaABAg","responsibility":"unclear","reasoning":"contractualist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgxPgU2ChKRG_dWVsNF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
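The raw response is a JSON array of records keyed by comment ID, one record per coded comment, which is exactly the shape the "look up by comment ID" feature needs. A minimal sketch of parsing and validating such a response follows. The allowed-value sets are inferred from this single sample, not from the project's actual codebook, and `index_codings` is a hypothetical helper name:

```python
import json

# Dimension -> allowed values, inferred from the sample output above;
# the real codebook may contain labels not seen in this one batch.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "government",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "mixed", "unclear"},
    "policy": {"regulate", "industry_self", "liability", "none", "unclear"},
    "emotion": {"fear", "approval", "resignation", "indifference",
                "mixed", "unclear"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index its records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a
    value outside the inferred codebook.
    """
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={value!r}")
        by_id[rec["id"]] = rec
    return by_id

# Hypothetical one-record response in the same shape as the sample above.
raw = ('[{"id":"ytc_demo","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear",'
       '"emotion":"indifference"}]')
coded = index_codings(raw)
print(coded["ytc_demo"]["emotion"])  # indifference
```

Validating every record before indexing means a malformed batch fails loudly at ingestion time rather than surfacing later as an "unclear"-looking value in the dashboard.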