Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
From a purely computer science perspective, the Transformer Architecture is a Lossy Compression Algorithm and the weights of the resulting LLM are akin to a (lossy) zip archive, or like a jpeg image.
youtube AI Responsibility 2026-04-11T20:2…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[{"id":"ytc_UgzHTelq0s2qpaCqHBx4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
 {"id":"ytc_UgwDwSTBUCwkOx1E2CZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
 {"id":"ytc_UgxcPlZ1GZSnR73ccMx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
 {"id":"ytc_UgzRVNaXCC3N5fTSh9F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgziyyxqyuladkQvZg94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwX38lg2ohoVfTNf3F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzQt4Or285ozy6CWvd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgwmPaSxht3cKh_MJCh4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"industry_self","emotion":"indifference"},
 {"id":"ytc_UgwGNcuvLutoh24xvTh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
 {"id":"ytc_Ugzbh6dVdW5wgoaQAA94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}]
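The raw response is a JSON array with one object per coded comment, each carrying the four coding dimensions keyed by comment id. A minimal sketch of how such a response could be parsed back into per-comment coding results (Python assumed; the excerpt below uses the first two objects from the response above):

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codes.
raw = (
    '[{"id":"ytc_UgzHTelq0s2qpaCqHBx4AaABAg","responsibility":"distributed",'
    '"reasoning":"deontological","policy":"regulate","emotion":"approval"},'
    '{"id":"ytc_UgwDwSTBUCwkOx1E2CZ4AaABAg","responsibility":"company",'
    '"reasoning":"deontological","policy":"liability","emotion":"indifference"}]'
)

codes = json.loads(raw)

# Index the codes by comment id so a single comment's result can be looked up.
by_id = {c["id"]: c for c in codes}

for comment_id, code in by_id.items():
    print(comment_id, code["responsibility"], code["reasoning"],
          code["policy"], code["emotion"])
```

Indexing by id mirrors the lookup the "Coding Result" table implies: given a comment, fetch the four dimension values the model assigned to it.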