Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
When seri on the iPhone was asked if she's happy she replied yes. Also knew when you were joking... also laughed. Even told you when you were not being polite. Ask who's your daddy she would say the software. Keep asking she would say she was tired of the question or just shut off. Ask her about hell she died want too. There were so many things... siri was scary as fk when she was your iPhone assistant. Said extremely creepy things. Then I bought a Samsung. Its all creepy.. but mind you im talking about the FIRST SIRI ON IPHONE It was to realistic and the more you talked to her or she listened the more she used or mirrors your personality, but always said "hey its about you, not me!" If asked seriously serious questions just saying that was over a decade ago. I believe he's right. I also know Saudi Arabi gave citizenship to a A.I. Why are you all surprised?
YouTube · AI Moral Status · 2022-07-10T00:5…
Coding Result
Dimension      | Value
Responsibility | none
Reasoning      | unclear
Policy         | unclear
Emotion        | fear
Coded at       | 2026-04-26T19:39:26.816318
Raw LLM Response
[
  {"id": "ytc_Ugx1LnZDIOsh5V3lwQR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyRFThAE2DfwNgtPbh4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwCcFZlkYjmJF2Cw_d4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugz0Gl2Kr62xD9mpU8J4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxI7bONDWWefYV0rjh4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"}
]
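A minimal sketch of how the raw batch response can be parsed and one comment's codes looked up by id. This assumes the comment shown above corresponds to the third entry in the batch (the one whose values match the coding-result table); the `by_id` index and variable names are illustrative, not part of the original pipeline.

```python
import json

# Raw batch response exactly as returned by the model (copied from the record above).
raw_response = '[{"id":"ytc_Ugx1LnZDIOsh5V3lwQR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_UgyRFThAE2DfwNgtPbh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgwCcFZlkYjmJF2Cw_d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"}, {"id":"ytc_Ugz0Gl2Kr62xD9mpU8J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}, {"id":"ytc_UgxI7bONDWWefYV0rjh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"}]'

records = json.loads(raw_response)

# Index the batch by comment id so a single comment's codes can be retrieved directly.
by_id = {record["id"]: record for record in records}

# Look up the entry presumed to belong to the comment shown above.
codes = by_id["ytc_UgwCcFZlkYjmJF2Cw_d4AaABAg"]
print(codes["responsibility"], codes["reasoning"], codes["policy"], codes["emotion"])
# prints: none unclear unclear fear
```

Indexing by id rather than by position guards against the model reordering entries within a batch.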