Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- `rdc_kz0t9il`: My personal opinion it won’t. As long as it stays expensive. Cheap labor will al…
- `ytr_UgyxRHCo7…`: You're right, ai is the future but not generative ai. generative ai is the big d…
- `ytc_Ugx20rCVR…`: Act like a critical thinking partner. Your role is to avoid blindly agreeing wit…
- `ytr_Ugze7L54C…`: I don’t see ai robots in the field or cleaning restrooms yet, which you could ne…
- `ytc_UgxoLfLSL…`: Dude, AI can be useful untill we have a computer virus which they are not immune…
- `ytc_UgyeOuCd3…`: I think ai art shouldn’t be allowed to train on artwork that are not a great ho…
- `ytc_UgyTbidPE…`: Since women can take and destroy eveeything in your life as a man because of the…
- `ytc_Ugz--4glv…`: Your species human depends on electricity your so called Artificial Intelligence…
Comment

> I have no "fear" of AI trying to get rid of humans because of their "inferiority" or trying to "take over the world". ONLY human beings are capable of such atrocities, only human beings are so evil. I DO believe that evil humans, in their greed and insanity, will use AI to continue to overthrow as many nations around them as possible and further enslave their fellow human beings while killing all those "useless eaters" they have no use for. We already live in a world where human life has no value unless some wealthy person wants to use you to make himself richer. As per USA "elites".

youtube | AI Governance | 2025-08-01T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyCLBnvRRW2NkIF-eB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzg8ZN-A0jXFzfapHF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgywNEQUtuam7n9Eg_t4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwkL2crjPVckKNTi-N4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwTTE2fXPu2lDSZyFd4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzKbXejLP_Zosm4lKZ4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugy-Fp8Dg6Vx9plRWKh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugx9DWWWUV3RIxzm-Tt4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwPYaOTqr66o2fFqld4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxXExKUrAeSNQHtD_d4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"}
]
```
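The coding table above is simply one row of this batch response, selected by comment ID. A minimal sketch of that lookup, assuming the response is valid JSON in the shape shown (the `lookup` helper is illustrative, not part of the tool, and `raw` is truncated here to two entries from the batch):

```python
import json

# Two entries truncated from the batch response above (the IDs are real
# comment IDs from that batch); in practice, raw would hold the full array.
raw = """[
  {"id": "ytc_UgwkL2crjPVckKNTi-N4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgyCLBnvRRW2NkIF-eB4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]"""

def lookup(raw_response: str, comment_id: str):
    """Return the coding row for one comment ID, or None if absent."""
    for row in json.loads(raw_response):
        if row.get("id") == comment_id:
            return row
    return None

# The coded dimensions for one comment, as shown in the table above.
print(lookup(raw, "ytc_UgwkL2crjPVckKNTi-N4AaABAg"))
```

A linear scan is fine for a ten-item batch; for larger batches, building a `{id: row}` dict once would make repeated lookups O(1).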