Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- I worked at this place where one time a robot killed a guy by throwing a crate w… (ytc_UgxWoop0y…)
- I believe AI will do more damage to the "professional classes": people like lawy… (ytc_Ugxi6nxor…)
- So wait a moment, the mother saw her kid in the car loading a gun!!! Rest my ca… (ytc_Ugz4Y3XBY…)
- Give me ACTUAL PROOF that the AI can know when it's being tested, because I do j… (ytc_UgyyA38mf…)
- I somewhat agree but I think when you say the problem is the profit motive you s… (rdc_degevdn)
- Not to be an edgelord, but people fanatically believe into what was written in a… (rdc_munu7o2)
- More afraid of woke go broke 💔 than A.I. Man hasn't change since Adam. Evil is … (ytc_UgzbGjC9D…)
- Dude are we going to pretend to ignore the fact that these AI's are trained off … (ytc_UgwJlmwed…)
Comment

> You guys say people in silicon valley have seen a ghost. But no one else has seen what they've seen in regards to an actual artificial intelligence. It's literally all LLM and mass data driven processes. 0 actual intelligence...why should I believe that AI isn't the biggest scam ever. There's no proof that's been given that AI is sustainable, let alone sentient. I believe a grey Goo type of scenario. But to say that AI will consciously wipe out humanity is ridiculous, especially without proof of anything to the contrary besides essentially saying the boolian computing makes big data LLMs do silly things.

Source: youtube · Video: AI Moral Status · 2026-03-05T15:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[{"id":"ytc_UgziGSaLFwCqvohuy2d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxOmeS-BDGp_-8H0_V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwHtbOvWBGoJLETKPJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyNAUEWXoYtJEOK_up4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxNUCSz-gwrdhByNSF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzvfHPAzdnTI1cAdNF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw3bFuyfXqkt7d1cS14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyyA38mfBTEvKvWWTB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxG2qkuaLLe3nBL1_F4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzVb7NLGv--Go61_nt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"})
```

Note that the array as emitted closes with `)` rather than `]`, so it is not valid JSON as shown; the output is reproduced verbatim since this view shows the exact model output.
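A raw response like the one above is a JSON array of per-comment codings along the four dimensions in the result table. A minimal sketch of how such a batch might be parsed and indexed by comment ID for look-up (the function name, the two sample records, and the default-to-"unclear" behaviour are illustrative assumptions, not part of the actual pipeline):

```python
import json

# Two records mirroring the batch response format above (illustrative sample).
raw = """[
  {"id": "ytc_UgziGSaLFwCqvohuy2d4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyyA38mfBTEvKvWWTB4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]"""

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_json: str) -> dict[str, dict[str, str]]:
    """Map comment ID -> coded dimensions, defaulting missing ones to 'unclear'."""
    by_id = {}
    for item in json.loads(raw_json):
        by_id[item["id"]] = {d: item.get(d, "unclear") for d in DIMENSIONS}
    return by_id

codings = index_codings(raw)
print(codings["ytc_UgyyA38mfBTEvKvWWTB4AaABAg"]["emotion"])  # → outrage
```

Because `json.loads` raises on malformed input, a response whose array is closed with `)` instead of `]` would fail to parse entirely, which is one way every dimension for a comment can end up displayed as "unclear".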