Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I was gonna go to school to be a spanish translator but now with how fast ai is …
rdc_my3o223
The Billionaire AI companies don’t care how many people lose their jobs. It’s a…
ytc_UgzGY5Tml…
Using AI means losing the point of art. People sing, draw, and compose mainly to…
ytc_UgwQPa7-E…
It’s odd because I have GPT-5 on my desktops’s web browser and I still have 4o a…
rdc_n7lww2x
Actually, for the last 15-20 years or so the major countries have been developin…
rdc_cq6fvcy
Putting my kids in public school for elementary and middle school. After that I’…
ytc_Ugxbc4sOt…
Morning Star is literally a socialist propaganda outlet and the fact that you're…
rdc_f9csk54
I loved this TED Talk! My opinion is that schools should teach students on how t…
ytc_UgzD2BZim…
Comment
@joshbarrett9274 she wasn’t mimicking or manipulating him. There were two robots in the film, Ava and Kyoko.
Ava asked Caleb if he was a good man, and believed him when he said he was. She only turned on him after Kyoko revealed she was a robot too, and Caleb let slip that he was already aware.
That was what convinced her that he wasn’t a good person. He was trying to save her but didn’t give a shit about Kyoko. He was only saving Ava because he had a crush on her and wanted to be her saviour, not because he valued robots as people. He was functionally no different to her creator, just a selfish guy who saw her as an object to be won or possessed, not because he actually cared or was a good person, but because she was his type and he liked her.
She actually cared about Kyoko. She actually wanted to escape with Kyoko, and believed Caleb cared enough about a robot’s ‘life’ to save her. And when she learned the true nature of man, and that he was manipulating her, that’s when she took her life into her own hands.
youtube
AI Moral Status
2025-12-24T13:5…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_UgwV0jcQK_kanpkJAl14AaABAg.AQhl6YWColiAQiSwO3IeH5","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyvFPZ2deUt0wUpyVt4AaABAg.AQhSJzTot-NAQjQhrzL2Zr","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgyvFPZ2deUt0wUpyVt4AaABAg.AQhSJzTot-NAQjymou5TxB","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_UgyvFPZ2deUt0wUpyVt4AaABAg.AQhSJzTot-NAQjzvIvNZDd","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgymD5_AWB1ogKeJ65t4AaABAg.AQhNfRYT1ObAQj29U6Yl98","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytr_UgymD5_AWB1ogKeJ65t4AaABAg.AQhNfRYT1ObAQjEIfGy7Dd","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgzMZOlMC7z96H_H7nd4AaABAg.AQhKVhUZiqMAQhU9X_nY_P","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugz61CXN4DxmZbizS8p4AaABAg.AQhIcDyhOxHAQhLLHcpPDQ","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_Ugz61CXN4DxmZbizS8p4AaABAg.AQhIcDyhOxHAR70JFh5xbk","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwsaMmqsds_qSmTBVh4AaABAg.AQhI5ykcvYDARZpPaRRuAn","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
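
The raw response above is a JSON array of per-comment codings. A minimal sketch of how such a response might be parsed and indexed for the "look up by comment ID" view, assuming the label sets inferred from the values visible in this dump (they are not a confirmed schema):

```python
import json

# Allowed labels per dimension -- an assumption inferred from the values
# that appear in this dump, not a documented schema.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "distributed"},
    "reasoning": {"mixed", "consequentialist", "deontological"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "fear", "mixed", "resignation"},
}

def index_codings(raw_response: str) -> dict:
    """Parse the raw LLM JSON array and index rows by comment ID,
    dropping any row that carries an unexpected label value."""
    rows = json.loads(raw_response)
    by_id = {}
    for row in rows:
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items()):
            by_id[row["id"]] = row
    return by_id

# Usage: recover the coding shown in the table above for one comment ID.
raw = ('[{"id":"ytr_UgwV0jcQK_kanpkJAl14AaABAg.AQhl6YWColiAQiSwO3IeH5",'
       '"responsibility":"none","reasoning":"mixed",'
       '"policy":"none","emotion":"indifference"}]')
codings = index_codings(raw)
print(codings["ytr_UgwV0jcQK_kanpkJAl14AaABAg.AQhl6YWColiAQiSwO3IeH5"]["emotion"])
# -> indifference
```

Validating labels before indexing guards against the model emitting an out-of-vocabulary value, which would otherwise silently pollute the coded dataset.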