Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Hallucination is an analogy, not a literal term. But there's no evidence that consciousness is needed for intelligence. And humans are constantly and confidently wrong because human memory isn't that different from an LLM's. People think the mind is a database, but human memory is reconstructed every time based on neuronal "weights" and is very sensitive to what "prompt" you receive. It's why eyewitness testimony is so unreliable even when witnesses are convinced of what they experienced.
youtube 2026-01-25T22:0…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           unclear
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_UgzaFyUx__pwPmBai0x4AaABAg.ASKKm78Jm6NASR8cGQ_8Av", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_Ugw3akF6Z0hYhNST1mh4AaABAg.ASJn2yJApqiASQYy3JSIot", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytr_UgzmR1fXcItpfixKs0l4AaABAg.ASJYHOX7H_WASQGA0xZczY", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgyYSu47-_ZvBUHwOJp4AaABAg.ASJ1vgVG1pbASQGHkgQRi1", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgyYSu47-_ZvBUHwOJp4AaABAg.ASJ1vgVG1pbASQI0Ia7MfW", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgyTtnbsDrCRj52umzZ4AaABAg.ARlJ-u9p2gKASQIGDdm8OB", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"},
  {"id": "ytr_UgxKjiXlPpsmKLuOdGp4AaABAg.ARl5nqxmxd4ASQINw9pt4K", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgwNR0pu_6IxR3-PeXt4AaABAg.ARKXtCh_tvPAS7pdydpqJR", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgwNR0pu_6IxR3-PeXt4AaABAg.ARKXtCh_tvPAS8wjyUw2TG", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_UgyECy1JrJdYgzogrut4AaABAg.ARFbR5wAQsGAS7pnrRsvAy", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"}
]
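A raw response like the one above is only usable if every record sticks to the coding schema. The sketch below shows one way to parse and validate such a response in Python; the `SCHEMA` sets are assumptions inferred from the values visible in this response (the real codebook may define more categories), and the two inlined records are excerpted verbatim from the JSON above.

```python
import json
from collections import Counter

# Allowed values per coding dimension. NOTE: inferred from the values
# appearing in the raw response above; the actual codebook may be larger.
SCHEMA = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"unclear", "none", "industry_self", "liability"},
    "emotion": {"indifference", "approval", "outrage"},
}

def validate(raw: str) -> Counter:
    """Parse a raw LLM coding response and count out-of-schema values per dimension."""
    records = json.loads(raw)
    errors: Counter = Counter()
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                errors[dim] += 1
    return errors

# Two records excerpted from the raw response above.
raw = """[
  {"id": "ytr_UgzaFyUx__pwPmBai0x4AaABAg.ASKKm78Jm6NASR8cGQ_8Av",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgwNR0pu_6IxR3-PeXt4AaABAg.ARKXtCh_tvPAS8wjyUw2TG",
   "responsibility": "ai_itself", "reasoning": "deontological",
   "policy": "liability", "emotion": "outrage"}
]"""

print(validate(raw))  # empty Counter: both records conform to the schema
```

Running this on the full ten-record response would flag any record where the model drifted outside the label set, which is the usual failure mode when coding with an LLM.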