Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Any A.I. Bill of Rights legislation would likely take longer to pass in comparison to individual courts taking the initiative to hire digital forensics specialists who can verify if evidence admitted was altered by artificial intelligence, or not. The individuals who have spearheaded The Artificial Intelligence Movement have a framework for how it works, which means other individuals were trained to understand that framework. Digital Forensics is a field of specialty for many individuals already. Elon Musk is wrong about Artificial Intelligence erasing the need for a bunch of workers in tech. They should switch to Digital Forensics to help the judicial system verify if deepfakes and AI-altered evidence are being used in an attempt to pervert justice. 🤷‍♀ Unless they can create an A.I. Bill of Rights as quickly as Congress created legislation to force the Secret Service to protect the president and presidential candidates in September of 2024 🤷‍♀. But considering the country's longest government shutdown followed the quickest protective legislation passing 😬 They could have 500 digital forensics specialists on payroll within 3 weeks while lawmakers are still discussing an A.I. Bill of Rights. 🤦‍♀ Or, just normalize having a Digital Forensics Expert Witness take the stand, just like, always. If it's not going to be a role at the litigation support level, at the administrative level, then just normalize calling a Digital Forensics Expert Witness to take the stand.
youtube 2026-01-24T22:5…
Coding Result
Dimension      | Value
Responsibility | government
Reasoning      | consequentialist
Policy         | regulate
Emotion        | mixed
Coded at       | 2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugxsmj5kXjmKGneH_4N4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz5Y3OSfR_ChD3KOxB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx1MSw7cQ33Vr7XmOl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgypkVMstCmT7vK2s7B4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzHmgGzhTPoYyMSnjx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx1H-8DjZYSfzFXtS54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgywFvx1WdAxGM6vOb14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzCU1puhD6oLBWbgfl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyTWX3LM348Q_9R5TB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugz7mcBeTtxJ41--hI14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
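The raw LLM response above is a JSON array with one record per coded comment, each carrying an `id` plus the four coding dimensions shown in the result table. A minimal sketch of extracting the coding for one comment could look like the following; the `coding_for` helper and the two-record sample are illustrative, not part of the tool, and the value sets may be larger than what appears on this page.

```python
import json

# Two records copied from the raw response above; the real payload has one
# record per comment in the batch.
raw = '''[
  {"id":"ytc_Ugx1H-8DjZYSfzFXtS54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzCU1puhD6oLBWbgfl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]'''

# The four coding dimensions visible in the result table on this page.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def coding_for(raw_json: str, comment_id: str) -> dict:
    """Return the four-dimension coding for one comment id (hypothetical helper)."""
    for record in json.loads(raw_json):
        if record.get("id") == comment_id:
            return {d: record[d] for d in DIMENSIONS}
    raise KeyError(comment_id)

print(coding_for(raw, "ytc_Ugx1H-8DjZYSfzFXtS54AaABAg"))
# {'responsibility': 'government', 'reasoning': 'consequentialist', 'policy': 'regulate', 'emotion': 'mixed'}
```

The record returned for `ytc_Ugx1H-8DjZYSfzFXtS54AaABAg` matches the Coding Result table above (government / consequentialist / regulate / mixed), which is how the raw output can be spot-checked against the displayed coding.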