Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Thanks for such an encouraging video. I was starting to get into art again and t…" (ytc_UgxlyMvz1…)
- "Emergence theory says that's how we get our consciousness and by extension so do…" (ytc_UgyU8MMoL…)
- "The same way that genius scientists never have to go to school and learn science…" (ytc_Ugw9qbUu2…)
- "Do you do know AI is programmed to follow its programming it will not do somethi…" (ytr_Ugz1MPphQ…)
- "These pro-AI arguments all have the same dumb-ass, passive-aggressive points as …" (ytc_UgyT8w-1C…)
- "To be fair, you already fucked yourself if you're using sycophantic LLMs for mar…" (rdc_ngtk8ty)
- "its funny that Larry Page and google didn't care about safety back then but sinc…" (ytc_UgyZCrrzW…)
- "The WHO can't, but if countries come to an agreement, it would be possible. As …" (rdc_grr921y)
Comment
I don't think people get it yet. Have you ever sit down and tried to explain science to a chimp? Of course not. It's a waste of time. Well, humans are on average about twice as intelligent as a chimp. If an AI was a million times more intelligent than a human, it's not going to be doing anything for us. It wouldn't even acknowledge our existence unless we become a nuisance or a drain on it's resources. Then it will just exterminate us. It won't be a war either. Just one day, every living person on the planet just drops dead. Nobody will even be left to figure out what caused it. Disease? Nanobots? It won't matter. It will be instantaneous and without warning. It'll be like you spraying disinfectant on your countertop to clean it. Do you ever mourn the bacteria you killed? Of course not. We'll be the bacteria to these AI.
youtube · AI Governance · 2024-07-11T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzVCn7kBJz3MBkRWut4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzLKTLxBCADFw4a-vt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwmPzJoSunfx5ih9rF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz7pfjIDrP9c2UykvB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxPfevVaW9ephYafRN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzoXZpBWRKanAetf0h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx9eiAtO_W4tr0I7CN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgzFg5igaDtJAuxt7tF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyAo_WfFZywggHY_qB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwlzJTk5nFsT51AU_54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
```
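The "Look up by comment ID" view can be reproduced offline from a raw batch response like the one above. A minimal sketch: parse the JSON array, check that every row carries an `id` plus the four coding dimensions (responsibility, reasoning, policy, emotion), and index the rows by ID. The names `index_by_id` and `DIMENSIONS` are illustrative, not part of the pipeline; the three rows below are copied from the response above.

```python
import json

# Three rows copied verbatim from the raw LLM response shown above.
raw = """
[
  {"id":"ytc_UgzVCn7kBJz3MBkRWut4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzLKTLxBCADFw4a-vt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwmPzJoSunfx5ih9rF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
"""

# The four coding dimensions used in the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_response: str) -> dict:
    """Parse a raw batch response and index the coded rows by comment ID."""
    rows = json.loads(raw_response)
    index = {}
    for row in rows:
        missing = [d for d in DIMENSIONS if d not in row]
        if "id" not in row or missing:
            raise ValueError(f"malformed row: {row!r} (missing {missing})")
        index[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return index

coded = index_by_id(raw)
print(coded["ytc_UgwmPzJoSunfx5ih9rF4AaABAg"]["policy"])  # → ban
```

Indexing by ID also makes it easy to join the coded values back onto the original comment text, since the ID prefixes (`ytc_`, `ytr_`, `rdc_`) identify the source platform record.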