Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I'd like to argue the absurdist point of view: Are we not just a bunch of elements mashed together into a big sack of flesh and anxiety? What makes lines of code any different than the cells that make up our bodies? A single line of code isn't a whole AI, just as a single cell isn't a whole human. Humans are very complex machines that develop ourselves independently in order to better survive our environment. If we can make a computer that does that, and it develops to the same point as we humans have, is it not conscious? If you say no, you must ask yourself, are humans conscious? We sure think we are, but maybe that's just the method we found to survive in our environment. AI can only be as conscious as it thinks it is. Just as we can only be as conscious as we think we are. So does it really matter? I say no. Live and let live.
youtube · AI Moral Status · 2023-09-17T05:1…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       mixed
Policy          unclear
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxLo9dHYBh3uCT6nyp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz5jjAsUTn7ki_Exu94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxSN3exjdgYBiPc0-l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw6G5AiLy2RbMtVVZx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwGg7BnME_ZA_5zBmN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugw8MgKiE6vnJeWNWkJ4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyajjM7RbgfHLuBy7V4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyO2EiNX3t1rb5Yl3J4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxXjVeA9QnpciWzhLx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyjuzwqWcbw0P5HsMB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
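The raw response is a JSON array of coding records, one per comment id. A minimal sketch of how such a batch could be parsed and matched back to a single comment (the `coding_for` helper and the truncated sample payload are illustrative, not part of the tool):

```python
import json

# Illustrative subset of a raw LLM batch response: a JSON array of
# per-comment coding records along the four dimensions.
raw_response = """[
  {"id": "ytc_Ugz5jjAsUTn7ki_Exu94AaABAg",
   "responsibility": "unclear", "reasoning": "mixed",
   "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyjuzwqWcbw0P5HsMB4AaABAg",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]"""

def coding_for(response: str, comment_id: str):
    """Return the coding record for a given comment id, or None if absent."""
    records = json.loads(response)
    return next((r for r in records if r["id"] == comment_id), None)

record = coding_for(raw_response, "ytc_Ugz5jjAsUTn7ki_Exu94AaABAg")
print(record["responsibility"], record["emotion"])  # unclear mixed
```

Looking records up by `id` rather than by array position keeps the display robust when the model returns records in a different order than the comments were sent.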