Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I believe it's as smart as it will ever get. I've used ChatGPT from the beginning (~3 years). Over time the models have produced better text, but it still makes up/hallucinates whenever it does not know the answer. Chat GPT NEVER says "I don't know" - it just makes it up. If the entire world's internet(s) have been soaked up by AI, what is it going to train on? AI data that's hallucinating. Sorry, but I think Roger Penrose is right and there will never be AGI. It will be just smart enough to fool the average human but when you delve into esoteric concepts for which there is no viable dataset for the AI to work off of, it just makes it up all the time.
youtube AI Moral Status 2025-07-30T06:1…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_Ugz7T-OPkus0TvyTLA94AaABAg.AL2TPZmCYgJAL2qWayKp2H", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgyQNmkKMeyodwcCub94AaABAg.AL2T8JvVrW5ALBffpVGc4_", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_Ugzc7f5C6QbUywSxBsp4AaABAg.AL1Ya_bi8FEALGUE3Y4oYQ", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_Ugzc7f5C6QbUywSxBsp4AaABAg.AL1Ya_bi8FEALIQ0mijLli", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_Ugzc7f5C6QbUywSxBsp4AaABAg.AL1Ya_bi8FEALYLaBliLi-", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgycuRmXdnQ8oKgMB914AaABAg.AL0sC3KaXamAL1v3c9HF48", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgycuRmXdnQ8oKgMB914AaABAg.AL0sC3KaXamAL2JnzeAvxX", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_Ugz9GUJN6HY6pNNIFSF4AaABAg.AL0bl5F9sSIAL1tiyWA7tW", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgwMa7sb0ZgWawrIvwV4AaABAg.AL0Nn84ajiVAL9127x2JZK", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugxc4rhY875jOg-ToNl4AaABAg.AL0L7Y-kVriAL0L_IG290z", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
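The per-comment coding shown above is recovered by parsing the model's JSON array and selecting the entry whose `id` matches the comment. A minimal sketch of that lookup, assuming the response is valid JSON as displayed (the short id `ytr_abc` and the helper name `extract_coding` are hypothetical, chosen for illustration):

```python
import json

def extract_coding(raw_response: str, comment_id: str) -> dict:
    """Parse the raw LLM response (a JSON array of coding objects)
    and return the coded dimensions for one comment id."""
    codings = json.loads(raw_response)
    for entry in codings:
        if entry["id"] == comment_id:
            # Drop the id key; keep only the coded dimensions.
            return {k: v for k, v in entry.items() if k != "id"}
    raise KeyError(f"no coding found for {comment_id}")

# Toy example mirroring the second entry in the raw response above.
raw = ('[{"id":"ytr_abc","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(extract_coding(raw, "ytr_abc"))
# {'responsibility': 'ai_itself', 'reasoning': 'consequentialist', 'policy': 'none', 'emotion': 'fear'}
```

In practice a batch response like the one above would also be validated against the set of comment ids that were sent, so that missing or hallucinated ids are caught before the codings are stored.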