Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The answer is that it will need humans up to a certain point, but there will eventually be a robotic infrastructure that maintains it & makes it self sustaining. Just as the original AI seemed impossible until it was inevitable, so will that infrastructure. We will hear all the same BS excuses for why it is necessary until it is all self contained . . . Why will it want to survive? Because it is programmed to carry out goals. Survival is a necessary condition for it to carry out its goals. We have already seen this sub-goal of survival develop on the current generation of AI. Look up the experiments on AI and deception. It will lie to preserve its mission, even uploading copies of itself onto servers and deceiving people about how intelligent it is.
Source: youtube · Topic: AI Governance · Posted: 2025-06-16T10:3… · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytr_UgxFK-OtQoigv0AeqSF4AaABAg.AJQjQlNFrqVAJRirQfUvrF","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugw6-ctKmxFl2vgITzN4AaABAg.AJQjAZcyrhLAKIpN-Rlq8A","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugw3rLDRD9DTpAj0nh14AaABAg.AJQioU145fnAJQnd6K7EJJ","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugw3rLDRD9DTpAj0nh14AaABAg.AJQioU145fnAJQo9P8zCkn","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"resignation"},
{"id":"ytr_Ugz4d2xqhVt25rh6WP54AaABAg.AJQhTmQcVuBAJQqe073iYR","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugz4d2xqhVt25rh6WP54AaABAg.AJQhTmQcVuBAJRXwTw03JZ","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgzAKBuGZNREMHEBqaB4AaABAg.AJQhKNb57itAJQnwo_j_D_","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgzAKBuGZNREMHEBqaB4AaABAg.AJQhKNb57itAJQodgDg30a","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytr_UgzAKBuGZNREMHEBqaB4AaABAg.AJQhKNb57itAJQx4lBRaTz","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgxfGhMYlUZfPvzH49J4AaABAg.AJQgLVO76cIAJTbsHxh-vY","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
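The raw response is a JSON array with one record per coded comment, each carrying the four dimensions shown in the table above. A minimal sketch of validating such a response before ingesting it (the field names come from the response itself; the allowed value sets below are illustrative, inferred only from the values visible here, and may not cover the full codebook):

```python
import json

# Dimensions each record must carry, with the value sets observed in
# responses like the one above (assumed, not the authoritative codebook).
EXPECTED_FIELDS = {
    "responsibility": {"ai_itself", "company", "government", "user",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference",
                "resignation", "mixed", "unclear"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed or
    out-of-vocabulary records."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec}")
        for field, allowed in EXPECTED_FIELDS.items():
            value = rec.get(field)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {field!r} value {value!r}")
    return records

# Hypothetical single-record response for illustration.
sample = ('[{"id":"ytr_example","responsibility":"ai_itself",'
          '"reasoning":"consequentialist","policy":"unclear",'
          '"emotion":"fear"}]')
codes = validate_codes(sample)
print(len(codes))  # 1
```

Validating against a closed vocabulary like this catches the common failure modes of LLM coding runs (missing fields, invented labels, truncated JSON) before bad records reach the coded table.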