Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "No doubt about that it can help with certain jobs. The main question is whether …" (ytr_Ugx5y1xm7…)
- "*Between Nestlé Corp. and AI data centres, water wars aren't far off and definit…" (ytc_UgxPWmbXT…)
- "You want.. what an award? You poke fun of anyone with mental illness and pretend…" (ytc_UgwipzH2P…)
- "its not fair for one single person to sue an ai art company because it mishmashe…" (ytc_UgwLzgIN9…)
- "Honestly during the pandemic, when people were hopeless and without much to do a…" (ytr_Ugwl5fTac…)
- "Hey if AI wants to do all the work while making and keeping me healthy and wealt…" (ytc_Ugx9PduX2…)
- "The keyboard doesnt write for them. They have to push the keys, have the thought…" (ytr_UgydnOP9X…)
- "Because it's not AI. AI does not exist currently. All AI in the modern day is pr…" (ytc_Ugyj9MZsy…)
Comment
The main thing you arnt taking into account is healthcare is incredibly elastic. Certain specialties/procedures. if physicians were able to be more efficient/effective, there would simply be more of them done total and access would increase. I see this as the biggest boon. For spinal injections for eg., maybe you have a robot and you can supervise more. Well, now more people can get PRP injections which used to be cost prohibitive for most.
People want to see their doctor more, but there just isnt enough time. Maybe now doctors can spend like actually an hour with each patient and have AI handle the brute workload and you handle the conversation or human/empathy/experience part of medicine? Just ideas.
Platform: youtube · Topic: AI Harm Incident · Posted: 2025-07-26T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugz0-RjtEWg2BWyBps14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz4LIYwin5AXIGmpIV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxJFBOF2EMZCVajzyZ4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxFGyumZwQr7Mka8mZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzCTXn6thQKnG4F73x4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyhGvJJjiYpYTCtUwt4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugy3A8ODmeTAQakpP714AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxaREvezWl1f2gXfbp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxH_SkOZBqFQQHQZId4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgynJvqt_RvlSHlR-7l4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
```
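A per-comment lookup over a batch response like the one above can be sketched in a few lines: parse the JSON array and index each coding by its `id`, checking that every record carries all four dimensions shown in the table (responsibility, reasoning, policy, emotion). The field names come from the response shown above; the helper name and the validation step are illustrative, not part of the actual pipeline.

```python
import json

# A small excerpt of the raw batch response shown above (two records).
raw_response = """[
  {"id": "ytc_Ugz0-RjtEWg2BWyBps14AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxJFBOF2EMZCVajzyZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]"""

# The four coding dimensions plus the comment ID, per the response schema above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def index_codings(raw: str) -> dict:
    """Parse a batch response and index the codings by comment ID,
    rejecting any record that is missing a dimension."""
    by_id = {}
    for rec in json.loads(raw):
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing {missing}")
        by_id[rec["id"]] = {k: rec[k] for k in REQUIRED_KEYS - {"id"}}
    return by_id


codings = index_codings(raw_response)
print(codings["ytc_UgxJFBOF2EMZCVajzyZ4AaABAg"]["emotion"])  # outrage
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: one parse of the raw response, then constant-time access per comment.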