Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
10:56 GPT’s answer here surprised me, because I don’t think LLM’s have the capacity to know anything. I think “knowing” requires a belief, and ChatGPT can’t believe anything, because it isn’t conscious, and therefore it can’t know anything. And therefore it wasn’t lying. When it says something like “I’m excited,” that’s just because someone told it to say that. Also, side note, the emotional tones in this thing’s simulated voice are hitting the uncanny valley for me. It sounds a lot like a politician, or a customer service representative, or just someone who is hiding their full emotions. I can hear really subtle intonations (like at the end when it said it was an interesting conversation, that definitely sounded like someone smiling and pausing as they thought about what the right response would be considering all of the context), but it sounds like it’s trying to hide those feelings for some reason, and that makes me not trust it. I think I would be more comfortable with it if it talked like Data from Star Trek.
youtube · AI Moral Status · 2024-10-21T04:4… · ♥ 8
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         unclear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[{"id":"ytc_UgxadBcbeNJDPgdHr514AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugzp8FjcXGaq8EpFDCV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_UgwdBpmFKVLiqsbK9et4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgwmSJ4sVFYq7zBYVN54AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_Ugy2xDfv8vT9cKshVrd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_UgxMWOdL9Ew1QtyNC654AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugy-m0gxHA3uiCitSVR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgzaIEPZgKUD2NrqLcR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgzyqIwCFDTetDrCMER4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_UgzKRNmYxUk3vuUq2Wh4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"mixed"})
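Note that the raw response above is almost valid JSON but closes the array with ")" where "]" belongs, so a strict `json.loads` call rejects the whole batch; that would explain why the parsed coding result shows every dimension as "unclear". A minimal sketch of a more tolerant parser, assuming this one specific malformation (`parse_raw_response` is a hypothetical helper, not part of this tool's pipeline):

```python
import json


def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, tolerating a stray ')' in place of
    the closing ']' (the malformation seen in the batch above)."""
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        repaired = raw.rstrip()
        if repaired.endswith(")"):
            # Swap the mistaken ')' for the ']' that closes the JSON array.
            repaired = repaired[:-1] + "]"
        return json.loads(repaired)


# One object from the batch above, with the same stray ')' terminator.
raw = ('[{"id":"ytc_UgxadBcbeNJDPgdHr514AaABAg","responsibility":"ai_itself",'
       '"reasoning":"unclear","policy":"unclear","emotion":"mixed"})')
codes = parse_raw_response(raw)
print(codes[0]["responsibility"])  # ai_itself
```

Rather than silently discarding the batch, a repair step like this would let the stored dimension values survive a single-character formatting slip; anything else malformed still raises `JSONDecodeError` loudly.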