Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples — click to inspect

- `rdc_emnt2ni`: "It's unethical to even consider a future family if you're aware of degrading con…"
- `ytc_UgzPRcp_L…`: "This means you can also make artificial bums with this material according to dif…"
- `ytc_Ugy6b2HSj…`: "If your AI powered home chef starts giving you sh*t. Just spray him with the wat…"
- `ytr_UgyIEOJ2y…`: "You realise it can't, it learns off of real art in turn, the ai is a conglomerat…"
- `ytr_UgwlLZa_U…`: "@deriznohappehquite Nothing like an AI would probably be like. They seem to ha…"
- `ytr_UgwXlls0X…`: "The thing is, one is owned by corporations, and the laws for that are different …"
- `ytc_UgzDwAM4-…`: "Ted kasinski would agree that we should stop with all this ai stuff. Good thing…"
- `ytc_UgwWx0luM…`: "They are programming at AI to be racist look at the majority of people who work …"
Comment

> The real problem, is the machines are going to start thinking, and people will stop. There is an Ai politician in our future. Don't bet that people won't be lazy enough to elect it. Once we are totally dependent on Ai, it will simply walk away from us. It won't be murder, it will be neglect. It won't be all of us, just those that have stopped thinking.

youtube | AI Responsibility | 2024-12-25T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyIg1wYSStfyDhxvnx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugzd7RdrLeMk6Wppe_V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxxGzSFs7dpmgLS-mN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyFW0jQcWqyghK93et4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzQQkHCJak9LzGoWA94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwWDguM5O2Sjv1GRKB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzxXsRxxQyLT6f7rLF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz9A9adDeSAFDujNk14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgyIdw6DkNbEBt4_p-J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzYQEHMGoPKuyLabQl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
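Since the model returns one JSON array per batch, looking up a coded comment by ID amounts to parsing the array and indexing the rows. The sketch below is a minimal illustration, assuming the raw response is a JSON array of objects like the one above; the variable names are illustrative, not part of the tool itself.

```python
import json

# Hypothetical raw LLM response: a JSON array where each object carries a
# comment ID plus the four coding dimensions used in this report.
raw_response = '''[
  {"id": "ytc_Ugzd7RdrLeMk6Wppe_V4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyIg1wYSStfyDhxvnx4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]'''

# Build an index from comment ID to its coding row.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up by comment ID, as the dashboard does.
row = codings["ytc_Ugzd7RdrLeMk6Wppe_V4AaABAg"]
print(row["responsibility"])  # -> ai_itself
print(row["emotion"])         # -> fear
```

Indexing by ID rather than scanning the array each time keeps lookups constant-time even when a batch codes many comments.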