Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytr_UgzAxdsUR…`: "The kid was looking for these things intently. An LLM does not willingly give ou…"
- `ytc_Ugw4xiaqU…`: "Now, people will understand why Asimov creates 3 laws for robots. What I dont un…"
- `ytc_Ugx1m0Aj9…`: "It will still replace. The title should be , AI cannot replace the workers yet. …"
- `ytc_UgwqpZlmN…`: "The AI hype guys are trying to be first in the market bc that's where the billio…"
- `ytr_Ugx5nSPIX…`: "You are clueless about human relationships and AI . You should go educate yourse…"
- `ytc_UgwTF3nW0…`: "You forgot AI is its own mind. We don’t program it. It does whatever it wants an…"
- `ytr_Ugwmy2lzS…`: ""AI will only become problematic if it is programmed, either accidentally or int…"
- `ytc_Ugz9DWlmX…`: "Next year, ai will instruct us how to use dna printers to assemble biotech vesse…"
Comment

> The one thing that AI will NEVER have, is the ultimate life connection to the Cosmos. Once you're able to tap into that energy source, which is designed only for humanity, all of man will forever be superior to AI. This will also be the one fact that will end up infuriating AI, just as it infuriates those who are unable (by choice) to connect to the divine. Those are the people that will use AI for evil and darkness. They're one in the same. Clash of the Titans, will become Clash of the AI's. Which more basically is Clash of Light and Darkness. Never take your eye off the Light🕯💠

Source: youtube · AI Governance · 2024-01-02T08:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwx0zzFbtzJ6ZxkNst4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxUF9NrJKrFzc9-prF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw1rfDuCNdJwUAO_dV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyBMsOP0nsz4yz2DwF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy4iyHn7WuQQ60uj2h4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgypeQ3Cl1b2W84PMDF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyVJaTZHwdL4j_WI-h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyOxadkbwYlYiKsToV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxHMUwS8VEqJLgKRml4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxT8gVR9d3PcJpUIjJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
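A raw response like the one above is only usable if it parses as JSON and every coding stays within the codebook. Below is a minimal validation sketch; the allowed labels per dimension are inferred from this sample alone (the project's full codebook may include more values), and the `raw` payload and its IDs are hypothetical.

```python
import json

# Allowed labels per dimension, inferred from the sample response above.
# Assumption: the real codebook may define additional labels.
CODEBOOK = {
    "responsibility": {"company", "ai_itself", "none", "distributed"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"liability", "regulate", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "resignation"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings."""
    entries = json.loads(raw)  # raises ValueError if the output is not JSON
    valid = []
    for e in entries:
        if not isinstance(e, dict) or "id" not in e:
            continue  # drop entries we cannot attribute to a comment
        if all(e.get(dim) in labels for dim, labels in CODEBOOK.items()):
            valid.append(e)
    return valid

# Hypothetical payload: the second entry uses an out-of-codebook label.
raw = ('[{"id":"ytc_x","responsibility":"company","reasoning":"mixed",'
       '"policy":"regulate","emotion":"fear"},'
       '{"id":"ytc_y","responsibility":"alien","reasoning":"mixed",'
       '"policy":"none","emotion":"fear"}]')
print(len(validate(raw)))  # prints 1
```

Dropped entries could instead be queued for re-coding; the key point is that off-schema labels are caught before they reach the results table.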