Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples

- "When something is adapted to think, whether artificially or biologically, it wil…" (ytc_UgxVtENkO…)
- "I've had chimerism misclassified as chronic bromism half a dozen times because o…" (ytc_Ugzh05lfJ…)
- "That explains a lot. I saw some other user mention that instead of directly usin…" (rdc_ohubg22)
- "I feel so much better seeing this. I just got back from a lecture about how an a…" (ytc_Ugze9A2ce…)
- "With all the AI slop code out there, it's gonna create the perfect storm for SWE…" (rdc_n7oehwg)
- "7:26 would you hold a Swiss army knife accountable? We barely have what would b…" (ytc_UgwJAGJMG…)
- "There is some nuance to this, I don't think it will replace all engineers but it…" (rdc_kz16opf)
- "I think AI videos should be banned or made Illegal. Can cause too much trouble.…" (ytc_UgzOvWPwH…)
Comment
They act like it’s hard to unplug it from the wall. 🙄 Ai is trained and programmed by humans. If it has bad behaviour, someone trained it to have and learn bad human-like behaviour. These systems are programmed with biases of the creator. This is where these corporations should be held accountable for Ai misbehaving just like a pet owner is responsible if their dog bites someone.
youtube · AI Moral Status · 2025-06-06T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxFxEUr587hIHV5xmh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwqMsPmS0FPQjOUdL94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzvpH8AWCPp3sC6ehh4AaABAg","responsibility":"government","reasoning":"unclear","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzYKAwz_25vMGj8dO94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwH3Fmpl5Zk8EE0Tvt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxqz-MT3ah3Z-i3-qt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxaaa9lJUCtyIHECS94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwMkH1SCFyaYet-Bgx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxk5NgUGsLHgANC4ot4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwPV89KQpyQ2l8HUYJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
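A raw response like the one above is a JSON array of coded records, one per comment, so looking a comment up by its ID reduces to parsing the array and indexing it. A minimal sketch in Python (the variable names are illustrative; the record shown is the second entry from the response above):

```python
import json

# A raw LLM response: a JSON array of coded comments, each carrying the
# four coding dimensions (responsibility, reasoning, policy, emotion).
raw_response = """[
  {"id": "ytc_UgwqMsPmS0FPQjOUdL94AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]"""

# Index the parsed records by comment ID for O(1) lookup.
coded = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding result for one comment ID.
record = coded["ytc_UgwqMsPmS0FPQjOUdL94AaABAg"]
```

If a model ever returns malformed JSON or an ID that was not in the batch, `json.loads` and the dict lookup will raise, which is a reasonable place to flag the response for manual inspection.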