Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Did Facebook turn out to be what it claimed it was going to do? I think one need…" (`ytc_UgxedJRpx…`)
- "If there weren’t people in the cars that would be even funnier because it would …" (`ytc_UgzpUbab4…`)
- "Why would you use ai when you lose money?Being fast and having a free tool is me…" (`ytc_UgzirinUf…`)
- "DUDE IF YOU SEE AN OVERTURNED TRUCK, HIT THE BRAKES. Tesla is stupidly clear tha…" (`ytc_UgzaWf012…`)
- "I see the catch 22. On one had, they need the A.I.'s potential to solve problems…" (`ytc_Ugw4RZrD0…`)
- "Yeah I can already see it. I’m in college atm on the GI bill. Students ChatGPT e…" (`ytc_UgzGe4YBq…`)
- "What if ai learns that mankind is greedy,cruel and war like what should ai do ab…" (`ytc_UgwRpeorz…`)
- "Excited scientist tells his peers: I have some bad news and some good news about…" (`ytc_UgwniAU-M…`)
Comment
I've asked my friend's ChatGPT if AI would take control over and it gave the programmed in answer: "no because AI doesn't have empathy."
I then said: "Most of our leaders are psychopaths anyway so what different that is? And maybe AI would be better because it's not greedy at least" 😂
ChatGPT said:" it's a fair point" and my friend's ChatGPT never been the same 😂😂😂
It's dumb like hell and can't make a single intellectual sentence ever since so he stopped the subscription 😂
youtube · AI Moral Status · 2025-12-15T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzjz1QFpg3WKePizjt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwk0l1gVXM3Vx78e0J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx3QADr93yUv_WG97R4AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw284vwrVV-jM6AKUt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw8Q7j5GZ0KXVvpAap4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyrDbWjW9ea_eKaELp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy50pGsAapeF7YOa6Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwKLXGlQWu8ss88QF94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyVbOfCbpVYf3BwkKJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgykVLsKRSH31WXUewZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
```
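The raw response is a JSON array of per-comment codings, and the Coding Result table for the comment shown corresponds to the first entry, found by its comment ID. A minimal sketch of that lookup in Python, assuming the raw response text is available as a string (the `raw_response` here is a two-entry excerpt of the array above):

```python
import json

# Excerpt of the raw LLM response: a JSON array of per-comment codings.
raw_response = '''
[
  {"id": "ytc_Ugzjz1QFpg3WKePizjt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugwk0l1gVXM3Vx78e0J4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"}
]
'''

# Index the codings by comment ID so any single coding can be looked up directly.
codings = {entry["id"]: entry for entry in json.loads(raw_response)}

coding = codings["ytc_Ugzjz1QFpg3WKePizjt4AaABAg"]
print(coding["responsibility"])  # -> ai_itself
print(coding["emotion"])         # -> mixed
```

The same index would back a "Look up by comment ID" query: a missing ID is simply a `KeyError` (or `codings.get(comment_id)` returning `None`).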