Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If you don't even know what F3D is when that's apparently the thing to check when citing cases, you definitely aren't qualified to be a lawyer. Dude should lose his license immediately. MAYBE allow him to go back to law school for like 5 years and retake the BAR or something, because I do believe in 2nd chances usually... but without redoing schooling and the testing, certainly shouldn't ever be allowed to practice law again. Honestly, if ChatGPT or similar were actually connected to real legal databases, they might end up better than a legal assistant or whatever (paralegal?) at citing law within about a decade. Would still probably want someone with the proper credentials double checking everything for the next 5 or so years after that... but then you'd probably only need a standard secretary to handle your schedule and stuff. And that can probably be replaced not too long after that lol. As Philip DeFranco says, "This is the worst AI will ever be. It's only going to keep improving from here." And I for one am ready for robots to replace you all, and Universal Basic Income or something. Why should we worry about robots replacing us? Instead of having to work for ~60 years to have the money to enjoy what's left of life, but no longer have time... Why not just enjoy life for the brief period we're here? Do fun stuff instead of being corporate drones and stuff. Let the robots do the work.
youtube AI Responsibility 2023-06-10T19:3…
Coding Result
Dimension        Value
Responsibility   user
Reasoning        deontological
Policy           liability
Emotion          outrage
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgxREwDvf3uYcxm_Iap4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugxes2nVwFtJGt0myQF4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx-AuIeJMwrosXjq3J4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugxd_TOz3Givk0JAju54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz41ClDWx9afQJ04AR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxD0eiVDug1g4R9lmh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy3H1mEH63AlqyEnjp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxqyhQzZxx6HpEePsh4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwnWKRad2ewHO7P28x4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgysUo1whnOK6Cn30YN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
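The coded row shown above can be recovered from the raw batch response by matching on the comment id. The sketch below is a minimal illustration of that lookup, assuming the response parses as a JSON array of records with the field names shown (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the function name `lookup_coding` is hypothetical, not part of the coding pipeline.

```python
import json
from typing import Optional

# A truncated example of a raw batch response in the format shown above
# (two of the ten records; field names taken from the source).
raw_response = """
[
  {"id": "ytc_UgxREwDvf3uYcxm_Iap4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugx-AuIeJMwrosXjq3J4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

def lookup_coding(raw: str, comment_id: str) -> Optional[dict]:
    """Parse a raw batch response and return the record for one comment id.

    Returns None when the id is not present in the batch.
    """
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

record = lookup_coding(raw_response, "ytc_Ugx-AuIeJMwrosXjq3J4AaABAg")
print(record["responsibility"], record["policy"])  # user liability
```

Matching the record whose values appear in the Coding Result table (`user`, `deontological`, `liability`, `outrage`) is what ties the table to one entry in the raw batch output.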