Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I guess there could be a possibilty that at this moment in time, whoever is behind this, wants humans to rely on it the same way we rely on our smartphones. Once we're hooked/addicted, they can then sell it to us (like some sort of ChatGpt premium with unlimited searches).
YouTube · 2024-01-06T00:0…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           liability
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzVqyB9SIVezavFaXV4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyFWWHttt6GYJiYOGp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwhUmj-yjh_8_9hvlB4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgyifBFxDWceqRkbIrV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwymHr25Fn_Bm1BZap4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxQKayT2Xl5R6wE9ip4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxLU1huybkn4UXr5gp4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyxrQyqC0j_LzOI7914AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugz9DWlmX0IDtYu9aGJ4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyIs8MgLDka7Z3zadJ4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "resignation"}
]
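The raw response is a JSON array keyed by comment ID, so mapping a model output back to a specific comment is a straightforward parse-and-index step. A minimal Python sketch of that lookup, using two entries from the response above (the parsing code itself is illustrative, not part of the pipeline shown here):

```python
import json

# Two entries copied verbatim from the raw LLM response above.
raw = """[
  {"id": "ytc_UgzVqyB9SIVezavFaXV4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyifBFxDWceqRkbIrV4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]"""

# Index the array by comment ID for O(1) lookup of any coded comment.
codes = {row["id"]: row for row in json.loads(raw)}

# Retrieve the coding result for one YouTube comment.
code = codes["ytc_UgyifBFxDWceqRkbIrV4AaABAg"]
print(code["responsibility"], code["emotion"])  # -> company fear
```

Indexing by `id` is what allows the dashboard to display a single comment's coded dimensions (as in the table above) even though the model returns a whole batch in one response.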