Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
It really all changes with quantum computers. Right now AI runs on basically wh…
ytc_UgxDFYA6B…
the only way we could possibly say that AI being conscious is possible, at least…
ytc_UgyH4NAUg…
So, we’re all having to deal with AI and its impacts, because this technology ha…
ytc_Ugx52w3Xj…
OMG YOUR ARTSTYLE IS SO GOOD! Also, if you could, could you do some of mythologi…
ytc_Ugw9U-p6l…
Ai and Agi will eventually become a terrorist weapon, I'm sure these groups are …
ytc_Ugzi_JxZ-…
Its just Efficient. It costs the AI one nonliving asset, and costs the enemy al…
rdc_o7of17r
If you’ve never been or felt this kind of lonely, then don’t you dare judge his …
ytc_UgyXN5w4m…
The U.S. military already has and uses weapon systems that make their own decisi…
ytc_Ugytx0rtd…
Comment
Well congratulations! AI is already being used by Israelis to kill Gazans in an automated fashion. That's why we see so many collateral damage, because for AI everyone on this planet is a human shield. AI has a target, and civilians are just annoying noise on the image that is treated as a shield, so it decides to penetrate the shield in order to get to the target. All people are goyims. No empathy.
youtube
AI Governance
2025-07-13T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwHe7ctuH9a2XohJh14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwOZrJpWk8IHJ3Qzn54AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzJxsiCa2qhaGwSL3B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzNQChLKZ4EWAOlji94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxYGPjNvkmXjhdt_qF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw4opzYdNUOKXzQaIh4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzSDMTgjWMqavXcJ7R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy1oWoVM24iVVn6kGJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw-L7ubsFRUkkEE54x4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxCNlhm6t8T6VnkZnF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]
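The raw response above is a plain JSON array of per-comment records, so the "look up by comment ID" feature can be sketched by indexing the array on the `id` field. This is a minimal illustration, not the tool's actual implementation: the variable names (`raw_response`, `codes_by_id`) are assumptions, and the two sample records are copied from the response shown above.

```python
import json

# Hypothetical sketch: index a raw LLM coding response by comment ID.
# The records below are two entries from the response shown above;
# field names (id, responsibility, reasoning, policy, emotion) match it.
raw_response = """[
{"id":"ytc_UgwHe7ctuH9a2XohJh14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzJxsiCa2qhaGwSL3B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]"""

# Build a dict keyed by comment ID for O(1) lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

record = codes_by_id["ytc_UgzJxsiCa2qhaGwSL3B4AaABAg"]
print(record["policy"])   # -> regulate
print(record["emotion"])  # -> outrage
```

A real inspector would load every stored response the same way and merge the resulting dicts, so any coded comment can be retrieved by its ID regardless of which batch it was coded in.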