Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We need to apply the Steve philosophy to future A.I. and create safe A.I. with less than one percent chance of an A.I. taking a human life
youtube 2026-02-17T23:2…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           regulate
Emotion          approval
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[{"id":"ytr_UgyTTVdBYeH8TsR41ot4AaABAg.ATLeiz_CtkXATLesobZQFH","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
 {"id":"ytr_UgyG1rbFw3tdEe2cIDx4AaABAg.ARUp9aBgJtUARsjASfV4o5","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytr_Ugzb0q3Q_ynNQzBYvCR4AaABAg.AQ1114CIOx2AQ11Q5F9vRU","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
 {"id":"ytr_UgzTpSG5VXp6CvRnyKV4AaABAg.APZ5sTLCIxeAQUEf_Uk2gC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"approval"},
 {"id":"ytr_UgyRpBAn5OeuXyXkkL14AaABAg.AP7KNEiD5YsAPmUOixCfIt","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytr_UgwcIrKrWsxEfVe0TRp4AaABAg.AP6x3WIMgHcAQ7dE-3_gkJ","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
 {"id":"ytr_UgwcIrKrWsxEfVe0TRp4AaABAg.AP6x3WIMgHcAR7b9qQrL0g","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
 {"id":"ytr_Ugz96KgmDRInaVKOB2x4AaABAg.AP52wPo9N3DAUKxQH7ugGC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
 {"id":"ytr_UgzWKfu7nhZrVCyv8UZ4AaABAg.AVnGxu4TqG2AVosQ2VODtx","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
 {"id":"ytr_UgzZB46Uj6s5PBf-8KJ4AaABAg.AVn8HS8eJ2RAVnBobFF6nN","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"}]
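A response like the one above can be parsed and validated before the codes are stored. The sketch below is a minimal, hypothetical parser: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown, but the allowed-value sets are inferred only from the values visible here; the real codebook may include more categories.

```python
import json

# A shortened sample in the same shape as the raw LLM response above.
raw = (
    '[{"id":"ytr_example1","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"approval"},'
    '{"id":"ytr_example2","responsibility":"martian",'
    '"reasoning":"mixed","policy":"none","emotion":"indifference"}]'
)

# Allowed values per dimension, inferred from the responses shown above.
# Assumption: the full codebook may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "user", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "industry_self", "none", "unclear"},
    "emotion": {"approval", "outrage", "indifference", "resignation", "mixed", "unclear"},
}

def parse_codings(text):
    """Parse one raw LLM response; split records into (valid, rejected)."""
    valid, rejected = [], []
    for rec in json.loads(text):
        ok = (
            isinstance(rec, dict)
            and isinstance(rec.get("id"), str)
            and all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
        )
        (valid if ok else rejected).append(rec)
    return valid, rejected

valid, rejected = parse_codings(raw)
print(len(valid), len(rejected))  # → 1 1 (second record has an unknown value)
```

Rejected records can then be re-queued for a second model pass or flagged for manual coding instead of silently entering the dataset.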