Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I always thought it would be interesting if there were AI models that were actually trained with past patient x-rays, I feel like AI would be able to detect issues so much faster or even things that a doctor might miss. I don’t know why it’s not being used more today.
youtube AI Harm Incident 2024-05-31T18:2…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          approval
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_Ugydut7gRuUSpcDD7Qt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxcEpRQ-CZ0fyIktXp4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxPiSiWj-O2QsuiQ_h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw2VMygv9EGzk0tgid4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwFAfYqHm4RowZey2t4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxujPAAka2q7HOYER94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyXHrPKQw5ot92xnvR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwoRMU4neec6QGWJIl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwCf7v0utqAApG2ekB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz_GALp9O-msg41hIZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
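A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a hypothetical validator: the `ALLOWED` sets are inferred only from the values visible in this response, and the real codebook may define additional categories.

```python
import json

# Allowed values per coding dimension. NOTE: these sets are inferred from
# the single raw response shown above; the actual codebook may be larger.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "company", "developer"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "indifference", "outrage", "fear", "mixed"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every coded comment's values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {value!r}")
    return records

# Example: validate a one-record response.
raw = ('[{"id":"ytc_Ugydut7gRuUSpcDD7Qt4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
records = validate_response(raw)
print(len(records))  # 1
```

Rejecting the whole batch on a single bad value keeps malformed LLM output from silently entering the coded dataset; a more lenient variant could instead flag offending records for manual review.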