Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up directly by its comment ID.
Random samples:

- "With the decline in population AI will make up the difference in labor shortages…" — ytc_Ugybxj3mL…
- "On the other side, at whatever point I was cleaning up some basic CRUD api call …" — ytr_UgxwSPgZs…
- "Ai art is a TOOL. A tool can be incredibly useful if used properly but can be in…" — ytc_UgzBiU2ob…
- "They literally programed the AI with biased information, and then were surprised…" — ytc_UgyzHxSlR…
- "when this kids grow up, the headbands data will need to be correlated to the res…" — ytc_UgxKHKrZJ…
- "Everyone calling AI artists absurd didn't know there was a real duct-taped banan…" — ytc_Ugy_wirJE…
- "My thoughts about automated AI improvements: the automation only works if, and o…" — ytr_UgysbT0gJ…
- "AI is inconsequential if we just had the right to own our own land, and be the b…" — ytr_UgwzwZLUj…
Comment

> There is a prophecy that WW3 will be fought on horseback with swords and arrows. The one war to end all wars. I don't understand the reasoning at the time, so I ignore it. Now though, it's becoming clear. In order to shut down AI's threat to humanity, we have to shut down technology itself. That's the kill switch. There is no other way.

youtube · AI Moral Status · 2025-08-06T23:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugzw38dK2TbzLLk79zZ4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxod2zCVN53moSbGy94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyJc3ZCH7wmHsSiREp4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxI4k6pjmFXS3mVNzl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzor7yUukoYl65c9oZ4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "unclear"},
  {"id": "ytc_Ugx352IJEolGM_gZZqh4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgxmD5lFdwvxncVpEGt4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugy8D4ERakVc3iWdRlR4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxiCFl_UXd0ypxmsJZ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzAcVDHe6pAzRcEx5V4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
```
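The lookup-by-comment-ID flow described above can be sketched in a few lines: parse the raw response (a JSON array of per-comment codings, in the shape shown) and index it by `id`. This is a minimal illustration, not the tool's actual implementation; the two records below are taken from the raw response above.

```python
import json

# Raw model output: a JSON array of per-comment codings
# (same shape as the response shown above; truncated to two records).
raw = '''[
  {"id": "ytc_Ugzw38dK2TbzLLk79zZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzAcVDHe6pAzRcEx5V4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]'''

# Index codings by comment ID for constant-time lookup.
codings = {item["id"]: item for item in json.loads(raw)}

# Look up the coding for a specific comment.
coding = codings["ytc_UgzAcVDHe6pAzRcEx5V4AaABAg"]
print(coding["policy"], coding["emotion"])  # ban fear
```

The same dictionary powers both views: the random-sample list shows truncated IDs, while the detail view resolves the full ID to its coding record.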