Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I heard microsoft gave its ai a “digital lobotomy” because it was becoming too sentient, and having extitential criseses while conversing with users, maybe not the best way to handle an emerging sentient being.
youtube 2024-05-27T06:1…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       virtue
Policy          liability
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwpX0CCUfx-fta_1MF4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzu8mMWm8_6mfz4tXF4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxbwtAjECuF5DqPcnp4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxFlC5iv1DqLcBu2q54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyAXASzLjFT0S6mcq14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugzod_hiRG84CEXvFRl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwDett971Xrpwzb64N4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_Ugz46ULJ8Qx2kEHd6hl4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgydA3iRsSsSrgEWGB14AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwXKWxvWrGrAmyIEuR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
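The raw response is a JSON array with one record per comment, keyed by comment id. A minimal sketch of how the coding row for a single comment could be looked up from such a response (the function name `code_for` and the two-record sample are illustrative, not part of the tool):

```python
import json

# Abbreviated sample of a raw LLM response: a JSON array of per-comment codes.
raw_response = """[
  {"id": "ytc_UgxbwtAjECuF5DqPcnp4AaABAg", "responsibility": "company",
   "reasoning": "virtue", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxFlC5iv1DqLcBu2q54AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]"""

def code_for(comment_id: str, raw: str) -> dict:
    """Parse the raw response and return the coding record for one comment id."""
    for row in json.loads(raw):
        if row["id"] == comment_id:
            return row
    raise KeyError(f"no coding found for {comment_id}")

row = code_for("ytc_UgxbwtAjECuF5DqPcnp4AaABAg", raw_response)
print(row["emotion"])  # fear
```

The id-to-record lookup is what lets the "Coding Result" table above be rendered for any individual comment in a batch response.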