Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
what kind of question is that, that defeats the entire point of a machine, an automaton that doesnt feel, an algorithm that makes cold calculative decisions, if a robot gained sentience, let me tell you right now it wouldnt be a robot, robots calculate, things with sentience think and learn and use, if you had a sentient machine that is supposed to do math problems, it would be an awful machine that nobody would buy, imagine you ask whats 17 + 4032 and the machine says "your family doesnt love you", if a machine was sentient it wouldnt even NEED to complete its task, thats not its goal, because it doesnt do calculations or algorithms to reach its goal, things with sentience dont HAVE a goal, therefor a sentient toaster would be a toaster that refuses to toast, or a sentient blender would start learning rocket science from internet forums and not actually completing its designed goal
youtube AI Moral Status 2022-08-09T21:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       deontological
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgwxAUkgH0FvWlsxcNd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzzCXZd0tFYPAMDn1h4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyzBatEYE34SZxpp3h4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzRa-fRR_IIzpiIuOl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxbb1GDm1Hdr26cfWZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxleG3LMgHqmHvkfC54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwzzwG6ptMjJh3bxJR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyWvpB4olxqhWeaPpF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxPXLLrQk7hwwy7LDZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyReGJ-UxS88Vq96_Z4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
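To inspect the coding for a single comment, the raw response can be parsed as a JSON array and filtered by comment id. The following is a minimal sketch, assuming the response is always a well-formed JSON array of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys (the `coding_for` helper is hypothetical; the two sample entries are copied from the response above).

```python
import json

# Two entries copied verbatim from the raw LLM response above,
# standing in for the full array.
RAW_RESPONSE = """[
  {"id":"ytc_UgyzBatEYE34SZxpp3h4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyWvpB4olxqhWeaPpF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

def coding_for(comment_id, raw=RAW_RESPONSE):
    """Return the coding dict for one comment id, or None if absent."""
    for entry in json.loads(raw):
        if entry["id"] == comment_id:
            return entry
    return None

coding = coding_for("ytc_UgyzBatEYE34SZxpp3h4AaABAg")
print(coding["reasoning"], coding["emotion"])  # deontological indifference
```

In practice the raw string may fail to parse if the model emits extra text around the array, so a `json.JSONDecodeError` handler is worth adding before trusting the output.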