Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated excerpts, with comment IDs):

- "A robot will be a propaganda machine for democrats you can bet that. This is hor…" (ytc_UgxT3GOuN…)
- "All the leaders are old half in the grave they're evil leaving us with nothing b…" (ytr_UgyjuYl3T…)
- "F**k the face recognition, show the warrent. Pulling on him so he can resist & t…" (ytc_Ugzzfoatt…)
- "You view on that VO youtuber is how i feel when i watched his video. As an artis…" (ytc_Ugx4vPGvY…)
- "As a disabled person, who STRUGGELS doing art because of my disabilities. AI art…" (ytc_Ugx4LMl0G…)
- "Not stating as fact (but as question)… I drive a sht load all over North & cent…" (ytc_UgwfpkRb9…)
- "Kaledrone /\ It wasn’t predetermined, it’s just that part of the A.I’s programmi…" (ytr_Ugxp2YUNz…)
- "This is an extremely narrow view to adopt to start with. What reason do we have …" (rdc_du4h2jy)
Comment
You can only believe AI will take over from humans if you believe humans are basically machines. Machines taking over from machines. But we are so much more than machines. We have consciousness. How can you program the big C in if you don't even know what it is or where it comes from. Scientists say, "Once AI gets complex enough, consciousness will arise." That, my friends, is religion: Believing a future event will occur without a shred of proof, just because your belief system predicts it will. Might as well believe in the rapture as well. Think of it. Why do some humans want to dominate other humans? Ego, urge for power, urge for freedom, arrogance, fear, sometimes even love. Think you can program any of that into a machine? If AI has none of the feelings that cause humans to seek dominance, it will never be more than our suppliant servant.
Source: youtube · AI Governance · 2025-07-10T01:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwGYtOldqooJXt2nld4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzGQ3rIxnctZP9IcnJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzJuPW5bAqBD_rQACN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxDRErnmSs7eW3oUqt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgywAQ8eQqLOwQlemzV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw8a9fNxTN9Df0H3J14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzSKFXPshAzKZ2MPwR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxOPEr2EmJXGAk5L6V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxXhqS3Ts31HidxRPV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxUoC1RNw64oEwJzFp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
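Responses in this batched format can be parsed and indexed by comment ID in a few lines. The sketch below is a hypothetical helper (the function name, `SCHEMA`, and its value sets are assumptions inferred from the samples above, not a documented codebook): it validates each row against the four coding dimensions and builds a lookup table keyed by comment ID.

```python
import json

# Allowed values per coding dimension, inferred from the sample output above
# (assumption: the real codebook may define additional values).
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "none"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"approval", "outrage", "fear", "indifference", "mixed", "none"},
}

def validate_batch(raw_response: str) -> dict:
    """Parse a raw LLM response and index valid codings by comment ID."""
    coded = {}
    for row in json.loads(raw_response):
        cid = row.get("id")
        if not cid:
            continue  # skip rows that lack a comment ID
        bad = [dim for dim in SCHEMA if row.get(dim) not in SCHEMA[dim]]
        if bad:
            print(f"{cid}: invalid value(s) for {bad}")
            continue  # drop rows that fail validation
        coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded

# Look up one coding by comment ID, mirroring the inspector view above.
raw = ('[{"id":"ytc_UgzSKFXPshAzKZ2MPwR4AaABAg","responsibility":"none",'
       '"reasoning":"deontological","policy":"none","emotion":"mixed"}]')
by_id = validate_batch(raw)
print(by_id["ytc_UgzSKFXPshAzKZ2MPwR4AaABAg"]["reasoning"])  # deontological
```

Rows with unknown dimension values are reported and skipped rather than stored, so a malformed model response degrades to partial coverage instead of corrupting the coded dataset.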