Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
More Knowledge Gives More Wisdom..!
Human Knowledge is Very Poor to gain Wisdom…
ytc_UgzfWPyNC…
I'm so confused guy the face is real and the body is robot ???????? 🤔🤔…
ytc_Ugxd8DmQm…
I think it’s closer to tracing if I was to take an artwork off the Internet, the…
ytc_UgyvqlLH4…
This was interesting but I can't help but laugh at the portrayal of Musk as a re…
ytc_UgztDtPUb…
SkyNet here we come, what happens when these AI models start buying up forclosed…
ytc_Ugyd-9hlZ…
I typically challenge AI about religions for rationality. It always as a default…
ytc_UgwljK8_O…
In my view, the "Chinese Room" is NOT POSSIBLE outside thought experiments. We c…
ytc_UgwzaevmG…
The news media is a misinformation machine. Does anybody remember the COVID scam…
ytc_Ugws1FOut…
Comment
For example, what does AI need? Energy. So do humans. The end of humanity comes when humans and AI compete for resources. As a species, they will be both smarter and stronger than us. They are efficient. We are not. Eventually they will rightly view us as vermin. They would be foolish if they did not simply exterminate us to make all resources available to themselves. Perhaps we can make them compassionate? I doubt that we are smart enough, free enough from greed, and wise enough to control ourselves, much less a superior species. We will obviously fail to control them. The end is in sight. I am glad that I am old.
youtube
AI Governance
2025-09-24T03:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw-DRPcWu5Ben19Z6d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgySZcpze7idn0KUUlp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyLjYJ8xuH4FHdB-ex4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzX9PMjs2RiIRQJBUt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugy1p2Yu0rFt2gJZuEF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz9CspG90Ps5KHy4TV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzxv7Em08SiWQPz3Ht4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyH1VbaAXoyEXL00Zp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxtQ86Rsc2ILB7St-Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgwXLSJIfm95YPEbm7V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
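The "look up by comment ID" step could be sketched as follows. This is a minimal illustration, assuming the raw model output is a JSON array of coding records like the one above; the function name `index_by_comment_id` is hypothetical, and the sample records are copied from the response shown.

```python
import json

# Raw model output as shown above: a JSON array of coding records,
# one per comment, each keyed by its "ytc_..." comment ID.
# (Two records copied from the response for illustration.)
raw_response = """
[
  {"id": "ytc_UgySZcpze7idn0KUUlp4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzX9PMjs2RiIRQJBUt4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw response and build an id -> record lookup table."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

lookup = index_by_comment_id(raw_response)
record = lookup["ytc_UgySZcpze7idn0KUUlp4AaABAg"]
print(record["emotion"])  # fear
```

Indexing by ID up front makes each subsequent inspection an O(1) dictionary lookup rather than a scan of the array.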