Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "You did not just compare training an AI model to being a serial rapist 😭😭…" (ytr_UgzW4p0R7…)
- "How do you program empathy towards humans into AI and AI robots built for warfar…" (ytc_UgyMeUTRv…)
- "Deemoons/aylieons are trying to posess (steaol a body) artiyficiyaol iynteiyllig…" (ytr_Ugzz51Yc3…)
- "While I overall agree with most of what Bernie has to say here, I think we need …" (ytc_UgxpmW0M_…)
- "What did you prompt ChatGPT with to get it to help you include the ad? You would…" (ytc_Ugwt2N9Lb…)
- "No, AI is just programmed to be almost always neutral, so it wouldn’t offend any…" (ytr_UgzBtEGo6…)
- "@elkinflautero7638 thats not the "ai fault" is the programmer fault. Ai will cha…" (ytr_UgyF7TlzB…)
- "on the bright side, an AI might just be able to disobey it's creators and actual…" (ytr_UgwKYPGeW…)
Comment (youtube · AI Moral Status · 2022-04-17T17:5… · ♥ 1)

> Tbh I think you have to draw the line at the fact that a robot's simulated consciousness is just that, simulated. You can remove bits and then realise they're not conscious again. The only reason they can seem conscious is because they are replicating human behaviour
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyBo2zjNMeQ9Ck2T8h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzSXFR7BTWYMg5MWyZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy4cMygtzq_TC7mZeF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz0Arn2F-BVDe0CJAV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyuW5OasCPfXQ_6lbF4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxrNO_EKV4QDi2A9gt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxxoplc8U-xt3R5k7l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy4e8oGV9YvDSjr3xJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugx67u6m0mM8NboC7ap4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwkEtom8EoLfXurhoB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
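The look-up-by-comment-ID step above can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: it assumes only that the raw LLM response is a JSON array of records with the `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields shown above (two real records from the response are used as sample data).

```python
import json

# Sample data: two records copied from the raw LLM response above.
raw_response = '''
[
  {"id": "ytc_UgyBo2zjNMeQ9Ck2T8h4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwkEtom8EoLfXurhoB4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
'''

# Index the coded records by comment ID so each lookup is O(1).
codings = {record["id"]: record for record in json.loads(raw_response)}

# Look up a single comment's coding by its ID.
rec = codings["ytc_UgwkEtom8EoLfXurhoB4AaABAg"]
print(rec["responsibility"], rec["emotion"])  # prints: developer fear
```

The dict comprehension mirrors what "Look up by comment ID" implies in the interface: the coded dimensions for any comment are retrievable directly from its ID without rescanning the response.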