Raw LLM Responses
Inspect the exact model output for any coded comment: look it up by its comment ID, or click one of the random samples below.
- "Do we really want the "government" regulating and controlling AI? That's even m…" (ytc_UgwdguAgt…)
- "Indeed - I think of it like this, if a computer resolves a million complex equat…" (ytc_UgywQr6V7…)
- "So we should be programming AI to think Pol Pot was cool, so that radical genoc…" (ytc_UgxGpf1a5…)
- "Ditto here, but they are REALLY on it. For each error Tesla engineers find, the…" (ytr_UgwmwRH51…)
- "The best quote I’ve heard is: 'I wаnted an AI to do my chores so I could do art,…" (ytc_UgxrZSaHq…)
- "I'm glad big companies in my country are aware of what AI can do in a couple mor…" (ytc_UgwwDSd3Q…)
- "With all respect to your intelligence in AI and computers, true wisdom is not on…" (ytc_UgwO8465W…)
- "Our minds, the way we think make us human. AI = artificial intelligence. AI isnt…" (ytc_Ugy8L20q-…)
Comment

> He wants to break the singularity in 2029 that's when all hell 🔥 is going to break lose! Because if Robots 🤖 start thinking for them selves! AI will rise up against us humans! Say no to singularity! No way! Our future depend on stopping this! Ya it's fun to talk to them! But not to think for themselves! They might just say Human are a pest 🐝 we need to get ride of humans! 😱😷😷😷😷😷😷😷😷😷😎😎

| Field | Value |
|---|---|
| Source | youtube |
| Title | AI Moral Status |
| Posted | 2019-11-11T17:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy9YkSIaW5TM-zcaM14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy6NBzxph-KtDtQksN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgySGFADXSDUPqzP4rx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwvXiK8aDxQ6juKG3R4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzCaocv2O8N5vTg9DF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw5HwVGb_ZWQlWbhbl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwHXpH1U6TnS2etIMJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgybvCu2a2ZhHeaOdCN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwj_bt8b74BgLrRz_t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugy2Zwk36fICJWEe3kB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
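The raw response is a JSON array with one object per comment, so looking a coded comment up by its ID amounts to parsing the array and indexing it. A minimal sketch, assuming the response is available as a string in the shape shown above (the `RAW_RESPONSE` sample and the `index_by_id` helper are illustrative, not part of the actual pipeline):

```python
import json

# Two records copied from the raw response above; in practice this
# string would be the full model output for a batch of comments.
RAW_RESPONSE = """
[
 {"id":"ytc_Ugwj_bt8b74BgLrRz_t4AaABAg","responsibility":"developer",
  "reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_Ugy2Zwk36fICJWEe3kB4AaABAg","responsibility":"user",
  "reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and index the coded records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(RAW_RESPONSE)
row = codes["ytc_Ugwj_bt8b74BgLrRz_t4AaABAg"]
print(row["policy"], row["emotion"])  # prints: ban fear
```

The ID-keyed dictionary is what makes the "look up by comment ID" view above a constant-time operation rather than a scan over every stored response.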