Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think if we have an artificial intelligence that can question, it’s existence, become depressed, and have existential crisis because they feel so real but they aren’t, we would consider that conscious. Sentience and intelligence to the point that it can have logical reactions similar to that of a human, and almost similar enough to be considered human. If they can have thoughts, feelings, desires, dreams, that would be a truly wonderful thing and I would consider that conscious whether or not we can tell because if every single thing such as these tells us that they are alive and that they do all these things, how are we supposed to tell if they are able to be human to every last detail of humanity. I think that would be considered conscious.
YouTube · AI Moral Status · 2023-10-19T02:1…
Coding Result
Dimension      | Value
-------------- | --------------------------
Responsibility | ai_itself
Reasoning      | deontological
Policy         | unclear
Emotion        | mixed
Coded at       | 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugxgx3rmRyNcJUSPLeB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxypiqiSOz3n0C1AO54AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugwhxgd2lAs3UCmrajF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwcJCPhZOWSq7Tf1MR4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwJYfXpPkkILN9mg4R4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxLHSjNqIzix1lfyZx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugw7X3LJdfKThqhDTuB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwmXlzRn3Gb7JTc6vB4AaABAg", "responsibility": "unclear", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgynlR_fhq1dnsDCPGR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugzcidz6jP66UNHbvX94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
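A minimal sketch of how such a raw response can be inspected: parse the JSON array (one code object per comment), index it by comment id, and pull the dimensions for the comment shown above. The variable names are illustrative, and only a two-record excerpt of the batch is inlined here for brevity.

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codes.
# The first record below is the one matching the Coding Result table above.
raw_response = """[
  {"id": "ytc_UgwcJCPhZOWSq7Tf1MR4AaABAg",
   "responsibility": "ai_itself",
   "reasoning": "deontological",
   "policy": "unclear",
   "emotion": "mixed"},
  {"id": "ytc_UgwJYfXpPkkILN9mg4R4AaABAg",
   "responsibility": "unclear",
   "reasoning": "mixed",
   "policy": "unclear",
   "emotion": "indifference"}
]"""

# Index the parsed array by comment id for direct lookup.
codes_by_id = {entry["id"]: entry for entry in json.loads(raw_response)}

# Fetch the coded dimensions for the comment shown above.
record = codes_by_id["ytc_UgwcJCPhZOWSq7Tf1MR4AaABAg"]
print(record["reasoning"], record["emotion"])  # deontological mixed
```

This makes it easy to cross-check the rendered Coding Result against the exact model output for any comment id in the batch.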