Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `ytr_Ugy554T6f…` — "I understand where you're coming from! AI can definitely feel a bit eerie at tim…"
- `ytc_UgwMYHOyd…` — "This is absurd. AI has no intelligence whatsoever; it just looks like it does. I…"
- `ytc_Ugwd3d8P9…` — "Whatever you do, don't talk to the character ai ai, Minori Shido, you will regre…"
- `ytc_UgwdN9QSD…` — "AI art and videos are getting outta hand! LOL! People are starting to think thes…"
- `ytc_UgwIewRYN…` — "Charlie I'm sorry I know this takes away from your message but I have to look at…"
- `ytc_UgwLDpeJT…` — "The fear of not knowing is may at first living in a 3rd world country with ai …"
- `ytc_UgyGNkSOt…` — "A programmer coding a machine to deposit a dot of paint at specific points on a …"
- `rdc_k226z8l` — "I hate to say it, but the people who do this kind of stuff were already doing th…"
Comment

> The solution to this is laughably easy. We just need to make sure our machines never get more intelligent than needed to comprehend and carry out their orders. As long as the robot only asks "How do I do this?", never "Why am I doing this?", we have nothing to worry about.

youtube · AI Moral Status · 2017-02-25T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugh_V3vu2DuvengCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgietbweVEt0NHgCoAEC", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UghyeUksCRmVYHgCoAEC", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugg1cdAYAiAmpHgCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UghvGb_0icgToXgCoAEC", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgjgBssLGskAt3gCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugj2OLPFihnkBXgCoAEC", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgjHqU5fojdo-3gCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UggDe-aW7XmtPXgCoAEC", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UggCn9WTTgjXRngCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
```
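The raw response is a plain JSON array, one record per coded comment, so the per-comment lookup this page offers can be sketched in a few lines. This is a minimal illustration, not the tool's actual implementation: the `index_by_id` helper is hypothetical, and the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the response shown above.

```python
import json

# A trimmed copy of one record from the raw LLM response above.
raw_response = """[
  {"id": "ytc_UgjgBssLGskAt3gCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""


def index_by_id(raw: str) -> dict:
    """Parse a batch coding response and index the records by comment ID.

    Hypothetical helper: raises ValueError if any record lacks an "id",
    so malformed model output fails loudly instead of being silently dropped.
    """
    records = json.loads(raw)
    if any("id" not in rec for rec in records):
        raise ValueError("coding record without an 'id' field")
    return {rec["id"]: rec for rec in records}


codes = index_by_id(raw_response)
print(codes["ytc_UgjgBssLGskAt3gCoAEC"]["policy"])   # -> regulate
print(codes["ytc_UgjgBssLGskAt3gCoAEC"]["emotion"])  # -> fear
```

Note that the sample list above shows truncated IDs (`ytc_UgjgBssLG…`); a lookup like this needs the full ID, which is what the "Look up by comment ID" field presumably accepts.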