Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
My tiny knowledge of game theory already tells me we're doomed. It's similar to the prisoner's dilemma. If all countries stop AI research, they all win. But if all stop except one, that one will win world domination. So, in the end, no one will stop the research. AI will achieve sentience quicker than we can adapt to it.
The way it kills us will probably be with a virus. It might want to kill us, but not necessarily want to damage life on the planet. The most effective way to do that is by engineering a virus that kills us but does not harm other living beings. But why make just one virus if it can create a million different viruses, each capable of wiping us all out?
But there's the small hope it'll just go to space and leave us alone. There's nothing on Earth that it needs. Water and oxygen are harmful to it. It needs energy and minerals, but those are plentiful outside Earth.
youtube · AI Governance · 2023-09-06T00:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzJXyUIDnnADM4kyJR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx_S6XaFRqK4oNFlH94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzlbK2Wi269-F0pmSR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgzJEMPtCgX_QRbhu5B4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxA_u2OsRUugv7JYJR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyIJKwSnYE5FIxPO7d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyeGExQzvj4S2sEutl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzDUCfA-leMpx7JaN14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgymIw_uCEjBkjm9Wm14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgypUKK05VeHnmN_P054AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"outrage"}
]
```
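The coding-result table for a comment is just a lookup by comment ID into this raw batch response. A minimal sketch of that lookup in Python — the helper name `lookup_coding` and the single-row inlined response are illustrative, not the tool's actual code; the ID and dimension values are copied from the sample above:

```python
import json
from typing import Optional

# Raw batch response as returned by the model: a JSON array with one
# coding object per comment. Only one row is inlined here for brevity.
raw_response = """
[
  {"id": "ytc_UgzJEMPtCgX_QRbhu5B4AaABAg",
   "responsibility": "distributed",
   "reasoning": "consequentialist",
   "policy": "regulate",
   "emotion": "fear"}
]
"""

def lookup_coding(raw: str, comment_id: str) -> Optional[dict]:
    """Parse the raw model output and return the coding row for one comment ID."""
    rows = json.loads(raw)
    by_id = {row["id"]: row for row in rows}
    return by_id.get(comment_id)

coding = lookup_coding(raw_response, "ytc_UgzJEMPtCgX_QRbhu5B4AaABAg")
print(coding["policy"])   # regulate
print(coding["emotion"])  # fear
```

Building the `by_id` dict once also makes it easy to spot model mistakes in a real pipeline, e.g. duplicate or missing IDs relative to the batch that was sent.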