Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- ytc_UgxIfsDGT… : "Everyone thinks Ai is so great, what will people think when the take all the jo…"
- ytc_Ugw5Wh5Qp… : "Okay, but keep this in mind: They project that the newer AI models will be bette…"
- ytc_Ugzr1dCGL… : "Not just AI models, AI news presenters - and let's hope that, unlike BBC news, t…"
- ytc_UgzS0t94o… : "1. Build The Bomb. 2. Go on talk circuits looking sad and saying wistful things…"
- ytc_UgwBg_I1E… : "We just did some training filming for one of the UK's largest companies. They h…"
- ytc_UgwXVWMbY… : "I work in the AI industry and can tell you that you are spot on. Once AI gets ou…"
- ytr_Ugy62lFX5… : "This was actually a conference on the impact of AI ... im all for conspiracies n…"
- ytr_Ugzh9VqdZ… : "Isn't it partially because Tesla and Elon's unethical marketing and releasing no…"
Comment

> Don't give Robots emotions . Emotions can play both a positive and negative impact on their thinking affecting their decisions because if you give robot emotion like love he may love one person more and second person less he may do more things for one whom he love and less things for unloving person so they are just acting exactly same like humans . Don't give them any emotions .

Source: youtube · Video: AI Moral Status · Posted: 2021-04-27T03:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[{"id":"ytc_UgzBSdO-QmX3hTsLH3Z4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwEh7ZIvAetcHvKD4d4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwdwpJBv13tHIU0Dvp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyuPM_DWuMRhuZKi414AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxnG919xDfwG7AlM5l4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxaWKQoOiXKnQpxCWp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy4DgMzv3LNXzIwZUF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxHVnpkiK1tJBAo44d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyL8-IaW13dE8l_Eit4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw-pVhQwn8HCJLYEmt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}]
```
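The raw response is a JSON array with one object per comment, each carrying the comment ID plus the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response can be parsed and indexed for lookup by comment ID; the function name `index_by_id` and the inline sample data are illustrative, not part of the actual pipeline:

```python
import json

# A raw LLM response in the format shown above: a JSON array of
# per-comment objects, each with the four coded dimensions.
raw_response = """[
 {"id": "ytc_UgwdwpJBv13tHIU0Dvp4AaABAg", "responsibility": "developer",
  "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
 {"id": "ytc_UgxHVnpkiK1tJBAo44d4AaABAg", "responsibility": "ai_itself",
  "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]"""

def index_by_id(response_text):
    """Parse a raw coding response and index the records by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_id(raw_response)
print(codes["ytc_UgwdwpJBv13tHIU0Dvp4AaABAg"]["policy"])  # regulate
```

Indexing by ID this way mirrors the "look up by comment ID" workflow on this page: once the per-batch JSON responses are parsed, any coded comment can be retrieved in constant time.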