Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's been my experience so far that AI is just as likely to give me wrong IT information than the correct answer. It's also notorious for suggesting you take very dangerous solutions to solve problems without telling you the consequences of running the commands. I thought a few months ago that it was amazing but lately it's been screwing the pooch a lot. We aren't there yet...
youtube · Viral AI Reaction · 2025-11-22T21:5…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxcrIvXEk1yOdgo-U14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwv1OXSDP6H8B7TPop4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"skepticism"},
  {"id":"ytc_UgxwWC3jMRA5szL_l3t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxjENX6Rh3x7ehj2u94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz3CQg8NEmIV654p8N4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugyhq_NBSp5LvQS3DsJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw_ApDXYs7hCrwaSt14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx1Cxwn2KKBtoMXOhh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxCGy87yPcJvS24Jfd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyWORiv6Kr25Gs_2fB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
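The raw response is a JSON array with one coding record per comment, keyed by comment id. A minimal sketch of how such a response could be parsed back into the per-comment dimensions shown in the table above (the ids and labels come from the response itself; the lookup helper is illustrative, not part of the original pipeline):

```python
import json

# Two records excerpted from the raw LLM response above; in practice the
# full array string returned by the model would be parsed instead.
raw = '''[
  {"id":"ytc_UgxwWC3jMRA5szL_l3t4AaABAg","responsibility":"ai_itself",
   "reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz3CQg8NEmIV654p8N4AaABAg","responsibility":"government",
   "reasoning":"contractualist","policy":"regulate","emotion":"fear"}
]'''

# Index the records by comment id for O(1) lookup of any coded comment.
records = {r["id"]: r for r in json.loads(raw)}

coded = records["ytc_UgxwWC3jMRA5szL_l3t4AaABAg"]
print(coded["responsibility"], coded["emotion"])  # ai_itself resignation
```

Indexing by id makes it easy to cross-check any single comment's coding against the model's exact output, which is the point of inspecting the raw response.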