Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
While ai context engineering, I take these hallucinations into account. Normally firing up a new 'instance' works fine.
youtube AI Responsibility 2025-10-01T12:5…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_Ugw44sACsghTTLSIcrJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgxbsGSnwKnKRIrCLQJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_Ugy_RNRZu3qIJTmL-xh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}, {"id":"ytc_UgzGv_eOyg-fP0NHK_N4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"}, {"id":"ytc_Ugwble50HI7doGtwBI54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwD8OaPPBXl4BIgJh54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgyUuyRSDgx-Oa6qIId4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"}, {"id":"ytc_UgwQcnj9sluA3qtVn9R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_UgySGRFGuTDM7SWt2C14AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"}, {"id":"ytc_UgxO61gJk_pEmwZNpkl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"})
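The coding result above reads "unclear" on every dimension, likely because the raw response is not valid JSON: the array closes with `)` instead of `]`, so parsing fails and no coded values can be recovered. A minimal sketch of how such a response could be validated before use, assuming a simple fallback-to-unclear policy (the `parse_coding` helper is hypothetical, not part of the tool shown here):

```python
import json

# Fallback record used when the raw response cannot be parsed.
UNCLEAR = {
    "responsibility": "unclear",
    "reasoning": "unclear",
    "policy": "unclear",
    "emotion": "unclear",
}

def parse_coding(raw: str) -> list[dict]:
    """Parse a raw LLM coding response.

    Returns the list of coded records, or a single all-'unclear'
    record if the response is malformed (e.g. a stray ')' where
    the closing ']' should be).
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        return [dict(UNCLEAR)]
    if not isinstance(records, list):
        return [dict(UNCLEAR)]
    return records

# A truncated version of the malformed response above: the trailing
# ')' makes json.loads raise, so the parser falls back to 'unclear'.
raw = '[{"id":"ytc_Ugw44sACsghTTLSIcrJ4AaABAg","responsibility":"none"})'
print(parse_coding(raw))
```

With a well-formed response (closing `]`), `parse_coding` returns the coded records unchanged, so only genuinely unparseable outputs collapse to "unclear".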