Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up directly by its comment ID.
Random samples:

- "We set out to make artificial intelligence and ended up with artificial creativi…" (ytc_UgzY3tmZ1…)
- "Big pharma hates this. How in the world will they be able to bribe medical AI mo…" (rdc_jkrabis)
- "The analysis presented in the video is indeed highly speculative and is consider…" (ytc_Ugym2ZPXJ…)
- "Ethos, if AI represents this level of thinking, then it seems as a species we ha…" (ytc_Ugw0W8U7G…)
- "Many experts are not as optimistic about the future abilities of AI, and it seem…" (ytc_UgymquQM-…)
- "I always go to Google and the Reviews posted online before I buy these products …" (ytc_Ugx7-OivT…)
- ""These self-driving cars aren't made for the clueless. She couldn't even give th…" (ytc_UgwQRMDRJ…)
- "My BROTHER in CHRIST can you PLEASE show text (especially long text, more than a…" (ytc_Ugwfez21S…)
Comment

> Just my opinion, but I think most people look at AI like people looked at Y2K (if you remember that). It was supposed to be a big deal and turned out to be nothing. I think people trust that if we are developing something detrimental to everyone that we will collectively agree to stop. But therein is the issue, as I see it. It won't seem detrimental to everyone. Much like opiates, it will appear to be a miraculous answer to our pain and problems. But we are really the problem; more specifically, our ability to control our desires. We all know how that usually works. People rarely do anything unless they have to. Like the mouse that chooses cocaine over food, we will likely rush to our own destruction. What do people like to do? Eat, feel good, look good, have sex, sleep, relax, have something to do that's not too demanding, something to pass the time. I can do drugs better than ai. That's all I can think of.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2025-09-28T19:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzF1E6bqoPueJ0UoeN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxFPQ8Rsiak2RizAqF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw8XBLg-qXUjZeqWD54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzibRfIWc9zkzi17Fd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyN6HcUUhRGj4EwAO54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxO54jz8wrXnMu_Cbl4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxxQF1d-IvRpe56e9l4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzsHrgyYaeoSPt0-AZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzuWgM0tpNWWOAkgKJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwCC8TQ5co1IUay--B4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
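A raw batch response like the one above can be turned into the per-comment lookup this page offers. The sketch below is illustrative only (the page does not show its own implementation): it assumes the raw LLM response is a well-formed JSON array of objects keyed by `id`, parses it, and indexes the records by comment ID. The `raw_response` string here is a shortened sample of the payload shown above.

```python
import json

# Shortened sample of the raw LLM response shown above (two records).
raw_response = """
[
 {"id": "ytc_UgzF1E6bqoPueJ0UoeN4AaABAg", "responsibility": "ai_itself",
  "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
 {"id": "ytc_UgxO54jz8wrXnMu_Cbl4AaABAg", "responsibility": "distributed",
  "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"}
]
"""

def index_codes(response_text: str) -> dict:
    """Parse a batch coding response and index the records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_codes(raw_response)
# Look up the coding dimensions for one comment by its ID.
print(codes["ytc_UgxO54jz8wrXnMu_Cbl4AaABAg"]["policy"])  # → regulate
```

In practice the parse step would also need to handle malformed model output (e.g. a `json.JSONDecodeError` when the response is truncated), which a real pipeline should catch and log rather than assume away.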