Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "i saw somewhere saying that ai can't be copyrighted because it's not created by …" (ytc_UgwMbwKCj…)
- "They are however smart they are programmed to be. Install GPS, boom they are no …" (ytc_Ugw1OjLB9…)
- "@stefanorizzo3384 It doesn't though? It's just one aspect of the regulation and …" (ytr_Ugzf1apP4…)
- "hmm we all say lot of things good about it. practically speaking - future job ma…" (ytc_UgyBpG7ri…)
- "Alrighty, half way the video you fall down a route of \"AI can't think\", as if hu…" (ytc_Ugy20-5Sx…)
- "I'm surprised nobody is talking about the Military Industrial Complex and their …" (ytc_Ugy0MIY0d…)
- "@NoExpert yep dark times ahead, One of the other comments had a good suggestion …" (ytr_Ugw3YuheU…)
- "Rishi pops over from his Luxury Santa Monica Penthouse in California, on his Pri…" (ytc_UgxUV32y-…)
Comment
I just realized that AI should be able to resolve the argument of the materialist view of consciousness or non materialist view. Clearly he’s a materialist and believes that consciousness will spontaneously emerge from these complex AI systems. I’ve never really believed that consciousness just emerges from complex systems and so I’m not a materialist. I used to be though. But I guess we’ll see. If AI becomes conscious then maybe we’ll solve the thousands of years argument of how does human consciousness emerge from the brain. I guess too how will we know that an AI system has become conscious? How do we know that AI isn’t conscious right now? How do we know that hallucinations aren’t being done intentionally by AI to make us believe that the AI’s still dumb when really it isn’t? If that could be proven then surely the AI has some sort of consciousness if it’s trying to manipulate us. The act of manipulation sort of implies that it knows that it’s an it and that it’s separate from us.
Platform: youtube
Topic: AI Governance
Posted: 2025-06-20T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugzptiw6LzTH2DOMeEZ4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwrwcAtUrx0lUgolU54AaABAg", "responsibility": "user", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwrXCh0Kl6mFCi1PGF4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyCNEKDaXCYz4kMRFR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "sadness"},
  {"id": "ytc_UgzbcOAlAE9zoczkRmd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw1K0zGTMGoM495ehF4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwju-BecGtFVFM7YZF4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw_ICeyhssw66wxKe94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugxa5Owi3vKSnfPuLpR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyMUq8ySO7XvHdqy9V4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
```
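A raw response like the one above can be checked before it is accepted into the coding results. The sketch below is a minimal validator, assuming the four dimensions shown in the Coding Result table and only the category values that appear in this page's examples; the project's actual codebook may define more categories, and the `validate_response` name is hypothetical.

```python
import json

# Allowed values per dimension, inferred from the examples on this page.
# Assumption: the real codebook may include additional categories.
ALLOWED = {
    "responsibility": {"developer", "user", "government", "ai_itself", "none"},
    "reasoning": {"virtue", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"resignation", "approval", "fear", "sadness",
                "indifference", "mixed", "outrage"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and check each coded comment for a
    comment id and a known value on every dimension."""
    rows = json.loads(raw)
    for row in rows:
        if "id" not in row:
            raise ValueError("row is missing a comment id")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
    return rows

# Example: one well-formed row passes validation.
sample = ('[{"id":"ytc_example","responsibility":"developer",'
          '"reasoning":"mixed","policy":"none","emotion":"mixed"}]')
rows = validate_response(sample)
print(len(rows))  # 1
```

Failing fast on unknown category values is deliberate: silently accepting a misspelled label (e.g. `"consequentalist"`) would corrupt downstream tallies of the coded dimensions.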