Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Hallucination as an intrinsic and unfixable feature of LLMs? That’s not on the spec sheet for the AI legal product we are being pitched.
youtube 2026-03-12T17:5…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          liability
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgzK0mdyPwWlQnmRm8d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxKI9C6jcz7TqDlqW14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy1LfVdgSQr7f_fCsl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzVaKsdClF5c5l4wDF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzytDSRk6MeIWq73fh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxKBpw6CAWdhY3yx854AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwiuvxbWkKjrevKQsF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzawH5wMcE4mHZYYSl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzOEOC2Bl4EVDZKFid4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy92zks-ve30EE41z14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
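The raw response is a JSON array with one object per comment, keyed by comment ID. A minimal sketch of how a downstream consumer might parse such a batch and look up one comment's coding (variable names and the two-item sample array are illustrative, not the tool's actual code):

```python
import json

# Illustrative two-item excerpt of a raw batch response in the format above.
raw_response = '''[
  {"id": "ytc_Ugy1LfVdgSQr7f_fCsl4AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzK0mdyPwWlQnmRm8d4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]'''

# Index the array by comment ID for O(1) lookup.
codings = {item["id"]: item for item in json.loads(raw_response)}

# Retrieve the coding for the comment shown above.
coding = codings["ytc_Ugy1LfVdgSQr7f_fCsl4AaABAg"]
print(coding["responsibility"], coding["policy"], coding["emotion"])
# company liability outrage
```

Indexing by `id` also makes it easy to cross-check the stored coding result against the raw model output for any given comment.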