Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgxDk-TMZ… — AI is not "intelligent technology", just because "intelligence" is in the name. …
- ytc_UgxwFmDMM… — I think it is hilariously TRAGIC that ANYONE thinks there will be ANY winners in…
- ytc_UgwLdmUjF… — I don't think AI "artist" will ever understand, we're Artist we will never quit …
- ytc_UgyRIxS0S… — I have wanted a robot from the time I was old enough to read Asimov’s I Robot. …
- ytr_Ugy8EOgAO… — @russell-gt1dy You are indeed, since pretty much every other first-world nation…
- ytc_Ugw1PLeJ3… — funny thing is that we are so desperate to avoid AI coming to the conclusion tha…
- ytc_Ugwia89r2… — To me there are 3 scenarios of how AI evolving could go for humans Scenario 1: T…
- rdc_d2xa982 — Vertical farms are very energy intensive. I don't see how that is an answer to g…
Comment (youtube · AI Harm Incident · 2024-02-03T17:1…)

> That’s the kind of crap this world does not need because one day they are going to turn on mankind then what? talk about opening Pandora’s box. You scientist that build machines like that you are smart in every way imaginable, but then again, you are the dumbest of dumb because you have no clue of what you are doing. Artificial intelligence is not always the way to the future all it would take world, be dummy like you to program a sadistic Killer into their programming bank, and if they decide to start talking to one another, they can transfer that information to one another and then what. again remember given the right circumstances computers could talk to one another. The movie Terminator is one thing because it’s fictional but to put it to real life. That is not the right way to go about doing things. Essentially you will have created a monster.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy62aWC9Twmudot7Rl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzmKdA5WlDOBg7RNFd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_Ugxh2LXAydDvXTAA2iN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyNBhGmX8ddaQcfSWx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugwjxtc_l_wImBqA8814AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzQqJoHuyQmmGWE2Q54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx7lV-SLDao2NL462t4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxBksy6E9M7JWOeKTh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyfOtYjs24wFuG6set4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxfV8jtkORitcStSBx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
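A raw response like the one above is a JSON array with one object per coded comment. The following is a minimal sketch of how such a batch could be parsed into a per-comment lookup, with each dimension validated against its allowed labels. The `SCHEMA` sets are inferred only from the values visible in this page; the actual codebook may define additional categories.

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# on this page (assumption -- the full codebook may include more labels).
SCHEMA = {
    "responsibility": {"none", "developer", "government", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "none", "ban", "regulate"},
    "emotion": {"unclear", "mixed", "outrage", "approval", "fear"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of codings) into a dict
    keyed by comment ID, rejecting rows with labels outside the schema."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded

# One row from the response above, used as a lookup example.
raw = ('[{"id":"ytc_Ugxh2LXAydDvXTAA2iN4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"ban","emotion":"outrage"}]')
codes = parse_batch(raw)
print(codes["ytc_Ugxh2LXAydDvXTAA2iN4AaABAg"]["policy"])  # ban
```

Validating at parse time means a malformed or off-schema model response fails loudly instead of silently entering the coded dataset.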