Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- `ytc_UgxD05LjQ…` — AI and electricity are not comparable inventions. Electricity is a necessity for…
- `ytc_Ugwu2-j87…` — I always write "thank you" and other things like "how are you", "please" "you're…
- `ytr_UgxX_UeCF…` — Thank you for chiming in, @isidracatillon5667! It sounds like you faced a tough …
- `ytr_UgwXNwRfD…` — Yk this is a old short? Not to mention, anyone can use ai, even artists, [for no…
- `ytc_UgxLzsRVK…` — I think about how wild the industry changed in the late 00s early 10s. It was th…
- `ytc_Ugw1futhj…` — He made AI now he‘s saying „we have to do something“. Not pretty smart if you as…
- `ytr_UgwIlHog7…` — Beacuse it's REALLY annoying. Because an artist can spend more then hours of tim…
- `rdc_n89l4d1` — "We won't solve all the world's diseases in a few years" I don't why people beli…
Comment
>My husband presented my initial symptoms of a rare disease (Anti Synthetase Syndrome) to Chat GPT in February. It took four questions (with him inputting test results from tests suggested by Chat GPT). It took 4 questions. In reality it took 6 months with the doctors being convinced the whole time that I had pneumonia (resulting in 6 rounds of unnecessary antibiotics). Finally a random test result came back positive. By then I was on 7 litres of oxygen.
>
>I'm off oxygen now because my husband spent the night after my diagnosis reading all of the medical journal articles on ASS that he could find and came in the next morning suggesting two medications. The doctors wanted to go through their standard meds for autoimmune diseases and three months later (when I wasn't expected to survive longer than another two months) they gave in. Six months later I was off oxygen.
>
>I was in the hospital in February and the doctors ignored my disease because "they hadn't heard of it." It was a dumpster fire of a hospital stay and I was discharged and am now terrified to ever be admitted again. I spent a lot of energy advocating for myself because they insisted that I just had pneumonia.
>
>Honestly, whenever I have questions now I ask Chat GPT 4 (I think of him as Gary) because I know it holds no unconscious bias and won't just default to things it normally sees every day.
>
>I can definitely see a future where doctors just need to review diagnoses given by AI. As long as there is a human reviewing things with an eye toward benefit vs. risk, I'm good with it.
I cited your Reddit post and published a similar clinical case on Medium to see whether human clinicians could come up with the diagnosis. GPT-4 could definitely make the diagnosis, but GPT-3.5 could not. I am using this case to test other chatbots to see if they can solve it. **Case**: A 52-year-old woman presented to the outpatient clinic due to progressive muscle weakness, arthralgi…
Source: reddit · AI Responsibility · Unix timestamp 1686008549 · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id":"rdc_jktck42","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"rdc_jn1yucs","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"rdc_jkoo1qd","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_jkpwk6y","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"rdc_jl4yz17","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
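A raw response like the one above can be turned into the per-comment lookup that "Look up by comment ID" implies. The following is a minimal sketch, not the tool's actual code: the `parse_codings` helper, the assumption that `ytc_`/`ytr_`/`rdc_` mark YouTube comments, YouTube replies, and Reddit comments, and the trimmed `RAW` sample are all illustrative.

```python
import json

# Two records trimmed from the raw LLM response shown above (sample data).
RAW = '''[
  {"id":"rdc_jktck42","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"rdc_jn1yucs","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]'''

# Dimensions taken from the Coding Result table above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}
# Assumed meanings: YouTube comment / YouTube reply / Reddit comment.
ID_PREFIXES = ("ytc_", "ytr_", "rdc_")

def parse_codings(raw: str) -> dict[str, dict]:
    """Parse a raw LLM coding response into a table keyed by comment ID."""
    table = {}
    for rec in json.loads(raw):
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing keys {missing}")
        if not rec["id"].startswith(ID_PREFIXES):
            raise ValueError(f"unrecognized ID prefix: {rec['id']}")
        table[rec["id"]] = {k: rec[k] for k in REQUIRED_KEYS - {"id"}}
    return table

codings = parse_codings(RAW)
print(codings["rdc_jktck42"]["emotion"])  # approval
```

Keying the table by ID makes the "Look up by comment ID" view a plain dictionary access; malformed records fail loudly at parse time rather than surfacing as blank dimensions in the results table.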