# Raw LLM Responses

Inspect the exact model output for any coded comment, or look up a response by comment ID.

## Random samples
- `ytr_UgwSNE47Y…` — "@oooodaxteroooo so basically all ai bros are insane, let's put it lke that. I do…"
- `ytc_UgxZL9FZ2…` — "Mankind is motivated by food, sex and power. So its behavior is somewhat predict…"
- `ytc_UgxwcnXE0…` — "But Elon, how much of your hand is in our future with AI? Please respond…"
- `ytc_UgzALobPq…` — "In any case, AI + the Trump family is about the most deadly combination I can im…"
- `ytr_UgyDYAQ5K…` — ""it all looks the same" That sounds like a you problem. Shitty copy pasted art …"
- `ytc_UgzNrRev4…` — "Ai is like a person. Its literally a living brain! You don't know if he's good …"
- `ytc_UgxAraccN…` — "I did a lot of afrobeats and you’re just using it wrong… if you give it a good s…"
- `ytc_Ugxns5X9I…` — "You guys have the wrong analogy about AI killing off humanity. The analogy is th…"
## Comment

> The problem.With robots, they're going to be like electric cars. Can you find the parts? Did they make enough now if we can just find somebody knows how to work on it. PLUS SMART PEOPLE WILL wait until they're more perfected. I expect 5 to 10 year minimum to get it perfected. Even withA I's help, it will still take time and who's gonna fix them. They all must be trained. There's more years there that's why? There'll be exceptions, some billionaire will throw $10 million at a robot and get a better one. Nobody else is going to be able to afford it.

Source: youtube · AI Moral Status · 2025-12-21T19:2…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response

```json
[
  {"id":"ytc_UgwTwHXCqGR-5Rg1XRF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwrE1H8UBrIxqdeV6J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgydACn5XaUuEtZBhPB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwebSOICrBJHFzTuX54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxwxYEoQsOVdYehqxh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz5xHmfXyAwbZXlMG14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyXk3qQIyW_ZR_wTSt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxe1GHwLb8Tq6oluiJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzYGOZiJkp35nQclDN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxw0-apx-msxIkkFnF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
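Because the raw response is a JSON array with one object per coded comment, looking up a coding record by comment ID is a small parse-and-index step. A minimal sketch in Python (the array literal is an excerpt of the response above; the `index_by_id` helper name is illustrative, not part of the tool):

```python
import json

# Excerpt of the raw LLM response shown above: a JSON array with one
# coding record per comment, keyed by the comment's "id" field.
raw = """
[
  {"id": "ytc_UgwTwHXCqGR-5Rg1XRF4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwebSOICrBJHFzTuX54AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

def index_by_id(raw_response: str) -> dict:
    """Parse the model output and index each coding record by comment ID."""
    return {record["id"]: record for record in json.loads(raw_response)}

codes = index_by_id(raw)
print(codes["ytc_UgwebSOICrBJHFzTuX54AaABAg"]["emotion"])  # resignation
```

The same index then answers "look up by comment ID" in O(1) for any of the four coded dimensions (responsibility, reasoning, policy, emotion).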