Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The issue I see with AI is it has to learn. In order to learn, something has to tell it when it is right or wrong. Someone has to tell it that all this data you are seeing about chemtrails is nonsense, so don't include it when you process your inputs. As far as it writing code, code that works isn't the same as good code.
youtube AI Moral Status 2025-07-24T13:0…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  developer
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzH6TXipICLYs9pgFp4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzyWf18CvHO95gfAoR4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwX5YcFnlSjRPk1Vap4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxCHXgRxtpPUg6cd994AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwMUMBop0KNA46o58N4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_Ugywzy_0LtEhRErC4Lt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwsUjs54L74Xgnvgxx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxwELpb3zk4KZ5kEjJ4AaABAg", "responsibility": "distributed", "reasoning": "virtue", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgyIB2DOo1JeJsdFHKF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzaBNGj1b-H78DO7AZ4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
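A raw response in this shape is straightforward to consume downstream. The sketch below (an illustration, not the tool's actual pipeline; the two records are copied verbatim from the array above, and the variable names are my own) parses the JSON array, indexes codes by comment id, and tallies one dimension:

```python
import json
from collections import Counter

# Raw LLM response: a JSON array of per-comment codes
# (two example records copied from the output above).
raw = '''[
  {"id": "ytc_UgzH6TXipICLYs9pgFp4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzyWf18CvHO95gfAoR4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]'''

records = json.loads(raw)

# Index codes by comment id so one comment's coding can be looked up directly.
by_id = {r["id"]: r for r in records}
print(by_id["ytc_UgzH6TXipICLYs9pgFp4AaABAg"]["emotion"])  # indifference

# Tally a single dimension across the batch, e.g. the emotion codes.
emotion_counts = Counter(r["emotion"] for r in records)
print(emotion_counts)
```

Because the model returns one JSON object per comment id, a lookup like this is how a "Coding Result" panel for a single comment (such as the table above) can be populated from the batch response.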