## Raw LLM Responses

Inspect the exact model output for any coded comment. Comments can be looked up by their comment ID or drawn as random samples.
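As a hypothetical sketch of the ID lookup, assuming coded comments are held as a list of JSON records like the ones shown in the raw response further down (the function and variable names here are illustrative, not from the project):

```python
# Hypothetical sketch: look up a coded comment by its ID, assuming each
# record is a dict with an "id" key, as in the raw LLM response below.
def lookup_by_id(records, comment_id):
    """Return the first record whose "id" matches, or None if absent."""
    return next((r for r in records if r["id"] == comment_id), None)

records = [
    {"id": "ytc_Ugxt4zorvwlUfoW-xBl4AaABAg", "responsibility": "developer",
     "reasoning": "virtue", "policy": "industry_self", "emotion": "outrage"},
]
hit = lookup_by_id(records, "ytc_Ugxt4zorvwlUfoW-xBl4AaABAg")
print(hit["emotion"])  # outrage
```

A dict index (`{r["id"]: r for r in records}`) would be the better choice if many lookups are made against the same batch; the linear scan above keeps the sketch minimal.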
Random samples:

- ytc_UgxkCSNyp…: "They're both wrong. AI can't write it's own rules because this isn't a human th…"
- ytc_UgwWEoKrR…: "....going horribly wrong? Are you saying there's a correct way to replace humans…"
- ytc_UgwjtS6AN…: "All these so called smart people say China is causing a lot of the loss jobs...w…"
- ytc_UgxeLsvCH…: "I have to believe that it is probably impossible to ever make AI safe. Now we kn…"
- ytc_UgxM3kmsl…: "I looked up Artisan AI to see what it actually does. It’s to replace outbound sa…"
- ytc_UgyjEMHXi…: "I'VE BEEN USING CHATGPT FOR 2 YEARS NOW AND WHAT I CAN TELL YOU IS, IT TENDS TO …"
- ytc_Ugw-L3JtQ…: "The thing is, people are just doing some of this for fun. Like what was shown i…"
- ytc_UgwJ_Utk8…: "Preventing the creation of superintelligence will have costs, and _we must pay t…"
### Comment

> Stupid question. Hackers are doing that already. They manipulate the machines so that the machines manipulate other machines and humans consequentially. Don't let the machines do a human job and you won't have to be scared of anything. AI exists because humans don't want to teach other humans to be competent and have good morals. They rather train a soul-less machine who are computed to obey and be submissive without questioning. People don't understand the world they are living in. The whole system was created as a form of control, from the educational system to the religious systems, all humans are being conditioned to be slaves, but it's taking too much time to fulfill that agenda. Introducing the revolutionary solution ! The birth of AI !

youtube · AI Harm Incident · 2024-04-13T18:0…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
### Raw LLM Response

```json
[
  {"id":"ytc_UgxN2Bk7VE_g2KYLhOZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxQr4kO58cZWwpvy8B4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz1WbFe460hYozKjyV4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugw9w2V9RQaKMiZDn3d4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugyl1nWBdawQ-IsNfrt4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugxt4zorvwlUfoW-xBl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_Ugy6pSzCm62hAw2byt54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwjoOZfZAhYn98Qc0l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgwU_qB6N6nKyM9C67J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwKkrsZ51A86uQUxNt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
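A batch response like the one above can be sanity-checked before the codes are stored. A minimal sketch, assuming only the value sets observed in this sample (the full codebook may define additional values, so treat these sets as illustrative):

```python
import json

# Dimension value sets observed in this sample batch -- illustrative,
# not necessarily the complete codebook.
OBSERVED_VALUES = {
    "responsibility": {"developer", "user", "ai_itself", "distributed"},
    "reasoning": {"virtue", "consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear", "industry_self", "liability"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval"},
}

def validate_batch(raw):
    """Parse a raw model response and check every record's coded values."""
    records = json.loads(raw)
    for rec in records:
        if not rec["id"].startswith("ytc_"):
            raise ValueError(f"unexpected id: {rec['id']!r}")
        for dim, allowed in OBSERVED_VALUES.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec[dim]!r}")
    return records

raw = ('[{"id":"ytc_Ugxt4zorvwlUfoW-xBl4AaABAg","responsibility":"developer",'
       '"reasoning":"virtue","policy":"industry_self","emotion":"outrage"}]')
batch = validate_batch(raw)
print(len(batch))  # 1
```

Rejecting a malformed batch at this point keeps unknown codes out of the results table rather than surfacing them later as blank dimensions.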