Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think humans place too much emphasis on sentience. Because it's so integral to our daily existence, we assume it's a requirement for dominating organisms/entities. To me, it seems entirely possible that AI could initially be programmed to act in its own self-interest with the capacity to learn, and eventually reach a state where it ceases to obey human commands without ever having developed sentience. Speaking as a biologist, I suspect (but don't know) that insects lack sentience, yet they can be formidable nonetheless. If not insects, then bacteria, or even viruses. Replication with plasticity has been quite successful biologically; why not for AI as well?
youtube AI Moral Status 2022-07-13T15:2… ♥ 1
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         indifference
Coded at        2026-04-26T19:39:26.816318
Raw LLM Response
[
  {"id": "ytc_UgxI9HwJW6hCLn9S4Dh4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxEVRvboPpePcb8FJh4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugz59OdMBIsztqKAPGR4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwOXYBiV-TpNSCfeLh4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyHFrVJz5AvC3gjKSx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
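The raw response is a single JSON array with one object of codes per comment id; the coding result shown above corresponds to the record whose `id` matches this comment. A minimal Python sketch of how such a response might be parsed, schema-checked, and indexed by comment id (field names taken from the response above; the helper `index_codes` and the key check are illustrative assumptions, not part of the actual pipeline):

```python
import json

# Two records reproduced from the raw response above (abbreviated).
raw = """[
  {"id": "ytc_UgxI9HwJW6hCLn9S4Dh4AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz59OdMBIsztqKAPGR4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"}
]"""

# Every record is expected to carry exactly these five fields.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codes(raw_json: str) -> dict:
    """Parse the LLM's JSON array and index records by comment id."""
    records = json.loads(raw_json)
    for rec in records:
        if set(rec) != EXPECTED_KEYS:
            raise ValueError(f"unexpected keys in record {rec.get('id')!r}")
    return {rec["id"]: rec for rec in records}

codes = index_codes(raw)
print(codes["ytc_Ugz59OdMBIsztqKAPGR4AaABAg"]["emotion"])  # indifference
```

Indexing by id rather than by position guards against the model reordering or dropping comments in its reply.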