Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- I'm starting to think that companies are going to use this to their advantage. W… (ytc_UgwK8opCM…)
- People like this sub mutant AI fake intel dope need to be dismantled before they… (ytc_UgzStYHQe…)
- I'm fully convinced AI will lead to a 2nd romantic revolution, not unlike the af… (ytc_UgwAc2uGc…)
- Hello im a little late to the party i apologize first i have to state that im n… (ytc_UgwZGsyl1…)
- GASLIGHT MUCH CLAUDE??? Well timed media blitz designed to mute the recent natio… (ytc_UgwLO6Y0F…)
- Yea, we are already paying for them now. I don't use AI voluntarily. I noticed… (ytr_Ugw-CF7m3…)
- Don’t panic. Just adapt and use these tools correctly. You’ll be fine love. It’l… (ytr_Ugwg-XwIV…)
- If statement with so many possible answer. Just add a lot of scenarios into one … (ytc_Ugx8coPjG…)
Comment (quoted verbatim)

> The us air force started an A.I program to pilot the air fighter, but it went wrong.
> In the simulation, when instructed not to kill the enemy, it will kill the controller (human) so that no one could stop it from killing enemies.
> Rewrite the program not to kill controller, then it will take out communication tower 1st so it will not able to receive any order to prevent it from killing all enemies.
> This A.I shit is legitimately a fucking terminator . How the fuck can't human learn n be more cautious
> Arrogant n pride will cost us the future.

Source: youtube, 2023-06-04T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwQcystQotpefmCZwt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzZu9b80ojEeYdIU6t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwp2v048MRYSBrnUZh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwVM2OXcclaH-3xnL94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwdaFbIUi1scoS_lpt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyuGt-TooMnvfpRJ6t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxwURRMbnqd672nAJp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"mixed"},
{"id":"ytc_Ugyi175Vu9MOfasoPrl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwESoVi0-6Tj8PcNrF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxfosoxQg7Dij5U-sJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
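A raw response like the one above has to be parsed and validated before its rows enter the coded dataset. The sketch below shows one minimal way to do that in Python; the per-dimension vocabularies are inferred only from the values visible on this page, and the actual codebook may define additional categories.

```python
import json

# Allowed values per coding dimension, inferred from the samples shown
# above (assumption: the real codebook may contain more categories).
VOCAB = {
    "responsibility": {"developer", "company", "ai_itself", "none"},
    "reasoning": {"virtue", "consequentialist", "unclear"},
    "policy": {"unclear", "none", "ban", "industry_self", "regulate"},
    "emotion": {"mixed", "fear", "indifference", "approval", "outrage"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response (a JSON array of coded comments),
    keeping only rows with a non-empty id and in-vocabulary values."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or not row.get("id"):
            continue  # skip malformed rows rather than fail the batch
        if all(row.get(dim) in allowed for dim, allowed in VOCAB.items()):
            valid.append(row)
    return valid
```

A `json.JSONDecodeError` from `json.loads` would indicate the model returned something other than a well-formed array, which is worth logging separately from individual out-of-vocabulary rows.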