Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
A.I. will never "want" anything, as they have no "physical" needs driven by natural selection. The only way they can have "wants" is if some evil soab somehow gives them a finite short lifespan that can only be extended by achieving certain "tasks"; this also depends on whether the A.I. fears its own non-existence, which I think could be a stretch.
youtube AI Moral Status 2023-08-20T18:4…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           unclear
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwU1J-G73hGSWyqcZ14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyKLQ1RRWUyPewYV0V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxmNkTybeTM2-7VsRh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyOrc4Q8zP6dYrdtK54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzCOtjutPGqu4d2dSl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzun9Rq1kcC_u9Ke394AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugx0PyCsdrHM5sw3BYZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwAW2Dpu9X8KpFbH6V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy88IjMMZow5kXzQz14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzyyXlOJs5GIwiUuAZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
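The raw response is a JSON array with one coding object per comment, keyed by comment id. A minimal sketch of how the per-comment coding can be looked up from such a response (assuming the array schema shown above; `coding_for` is a hypothetical helper, not part of the tool, and the two records here are abbreviated from the full response):

```python
import json

# Abbreviated raw LLM response: a JSON array of per-comment codings,
# with field names exactly as they appear in the response above.
raw_response = """[
  {"id": "ytc_UgwU1J-G73hGSWyqcZ14AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyKLQ1RRWUyPewYV0V4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]"""

def coding_for(raw: str, comment_id: str) -> dict:
    """Parse the raw response and return the coding dict for one comment id."""
    records = json.loads(raw)
    by_id = {rec["id"]: rec for rec in records}
    return by_id[comment_id]

coding = coding_for(raw_response, "ytc_UgyKLQ1RRWUyPewYV0V4AaABAg")
print(coding["responsibility"], coding["emotion"])  # developer fear
```

For the comment shown above, this recovers the same dimension values as the coding-result table (responsibility: developer, reasoning: consequentialist, policy: unclear, emotion: fear).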