Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
More on this: The blog "AI Weirdness" has a post titled "ChatGPT will apologize for anything" that demonstrates this. The author demonstrates starting a conversation with an LLM by demanding that it apologize for something it never did, and it gives a deep and sincere apology, including an explanation of why it thought it was a good idea at the time and how it'll do better next time. When she asks it to apologize for something absurd and impossible, it picks up on that and assumes the conversation is meant to be humorous, and gives a delightfully silly apology that builds on the absurdity of the situation. It quickly becomes clear that in both cases, all it's doing is roleplaying. By extension, when it apologizes for something it really *did* do, like, say, accidentally delete the code repository you told it to work on, it's almost certainly roleplaying in precisely the same way.
Source: youtube · Video: AI Moral Status · Posted: 2025-12-16T18:0… · ♥ 7
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_UgyGGsy-Cc7bvs5zlWR4AaABAg.ASrQ3_JZF5nASriwQPD-TO","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzR-0PJGnzbvV61HSh4AaABAg.ARrnMzyzSp3ARtyMrc1fbe","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugwn8qj6IYR7McEx7EJ4AaABAg.ARLgqxzPxLDARLgsSPqZSA","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwVxmo4X_hgu3PncmF4AaABAg.ARIwloZMbAhASqkFgk3Wyf","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgyHL_P9RKGxs5Xz2JV4AaABAg.ARFYlJwJVZAARFZ7bS6dQy","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytr_Ugy7AdZ5QN-ymetkA8B4AaABAg.AR5jLKQn_fWAR5jqpCK1zG","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugy7AdZ5QN-ymetkA8B4AaABAg.AR5jLKQn_fWAR5ju9n4obw","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwGjcwuCAXJBOSJhFF4AaABAg.AQnr6aIZonGAQns2SabEzX","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgwzagCkWVDZSnfAQHV4AaABAg.AQbwBEncofwARgev8HZeQO","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgwzagCkWVDZSnfAQHV4AaABAg.AQbwBEncofwASz05Z0qKU4","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
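For inspecting a raw response like the one above, a minimal sketch of a parser can help: it assumes the model returns a JSON array of objects, each carrying an `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`), and indexes the codes by comment id. The `RAW` string and the `ytr_example` id are hypothetical stand-ins, not real data from this dataset.

```python
import json

# The four coding dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of per-comment
    code assignments) into {comment_id: {dimension: value}}."""
    records = json.loads(raw)
    return {rec["id"]: {d: rec.get(d) for d in DIMENSIONS} for rec in records}

# Hypothetical raw response with a made-up comment id, for illustration only.
RAW = '[{"id":"ytr_example","responsibility":"none",' \
      '"reasoning":"mixed","policy":"none","emotion":"indifference"}]'

codes = parse_codes(RAW)
print(codes["ytr_example"]["emotion"])  # prints "indifference"
```

Indexing by id makes it easy to jump from any coded comment back to the exact values the model assigned, which is the point of exposing the raw response here.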