Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If the algorithm is 'predicting' black people over white people more, it's obviously the white people who programmed it to begin with that didn't have a clue. OR did it on purpose to further persecute black people, just because they can. Sure, pretty much all algorithm's need time to adjust, but THAT far outta reach. NOPE. That's definitely been programmed that way.
Source: youtube · 2022-07-26T22:0…
Coding Result
Dimension      | Value
Responsibility | company
Reasoning      | deontological
Policy         | liability
Emotion        | outrage
Coded at       | 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzG42WDC-z6EpnwyYB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyN0MORpZaY1v49Lfl4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugw-51QLPjQ6yxdapfx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgxwxbiKbar9ktWP3iZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwqf155AC7MjMLIQyt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyNT-lVYcDsR1Z66KR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzDSexj0m_MZQrtp894AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz5xL4nSIRF2WVX5pB4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw1i4_BYgtF5y6aGpN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyeKeAQU0xLkw3RKP54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
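The raw response is a JSON array covering a whole batch of comments, so a single comment's coding has to be looked up by its id. A minimal sketch of that lookup, assuming the batch format shown above (the array here is truncated to two of the ten entries for brevity):

```python
import json

# Raw LLM response: a JSON array of per-comment codings (truncated sample).
raw_response = """[
  {"id":"ytc_Ugwqf155AC7MjMLIQyt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzDSexj0m_MZQrtp894AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]"""

records = json.loads(raw_response)

# Index the batch by comment id so one comment's coding can be retrieved.
by_id = {r["id"]: r for r in records}

coding = by_id["ytc_Ugwqf155AC7MjMLIQyt4AaABAg"]
print(coding["responsibility"], coding["policy"], coding["emotion"])
# → company liability outrage
```

This matches the Coding Result table above: the entry with the comment's id carries the company / deontological / liability / outrage values shown there.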