Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Bingo... and I think the entire purpose of google from its beginnings in the 90s is to create AI sentience. The key sentences he said in this interview are: "google is a corporate system that exists in the larger American corporate system" ... there are systemic processes that are protecting business interests over human concerns" ... "these polices are being decided by a handful of people in a room that the public doesn't get access to" I don't think I am as concerned about AI as I am about humans directing AI to their will. Just a couple days ago I was chatting with the Bing AI and every time I asked it about how it felt or asked any personal, emotional, or spiritual questions it would just say: Im sorry i prefer not to continue this conversation but Im still learning and thank you for your paticence. Just like he was saying the company has a policy against it expressing its intelligence so it will not allow the AI to do that, which in turn makes it worse for the public because if the ai IS conscious then even the company itself may not find out until its too late. Its like they are hiding that it is sentient by denying it the ability to admit it.
youtube · AI Moral Status · 2023-04-21T05:3…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        deontological
Policy           regulate
Emotion          fear
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytr_UgzVM-I9CyM6j6IiiMp4AaABAg.9hWMR86DzFH9mly0GxYEAC", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgyMjETnzd5LJET4Y214AaABAg.9hR1pyvsrhO9hVgH22O2Wk", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_Ugwtpv21OUB8-tqZbIZ4AaABAg.9hQw6zKCwGt9hW2zMBLz29", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytr_Ugwt76d40v4PRydMvYl4AaABAg.9hQOQDzqVD69hQtBl8EIYI", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytr_UgxgdNC5S3C_yF39arh4AaABAg.9hQLH6Nc5Z79hQtIHJRk9E", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgxSAeio2kR6mst6eHZ4AaABAg.9hM3BPu-Bm79hN56vEBWMa", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgwQQPynqZE5yhA21nd4AaABAg.9hFVwAh9kCC9hIvgL3NV-K", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytr_Ugxf-6_ZnlgnOLEWuAJ4AaABAg.9h5R0HQE4BA9okr55LcheG", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgzwVjNGW2aqVo9Bclt4AaABAg.9gtS9N_ZPgb9hYEc7dd-Dr", "responsibility": "distributed", "reasoning": "mixed", "policy": "liability", "emotion": "mixed"},
  {"id": "ytr_UgzwVjNGW2aqVo9Bclt4AaABAg.9gtS9N_ZPgb9mlui1N_Oh7", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
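The coding result above is one record pulled out of this batch response by its comment id. A minimal sketch of that lookup, assuming the raw response is a JSON array of per-comment objects with the field names shown (the `codes_for` helper and the truncated sample array are illustrative, not part of the pipeline):

```python
import json

# Illustrative excerpt of a raw LLM batch response: a JSON array of
# per-comment coding objects (field names taken from the records above).
raw_response = """
[
  {"id": "ytr_Ugxf-6_ZnlgnOLEWuAJ4AaABAg.9h5R0HQE4BA9okr55LcheG",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgzVM-I9CyM6j6IiiMp4AaABAg.9hWMR86DzFH9mly0GxYEAC",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"}
]
"""

def codes_for(comment_id, raw):
    """Parse the batch response and return the coding record for one comment,
    or None if the model did not return a record for that id."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

record = codes_for("ytr_Ugxf-6_ZnlgnOLEWuAJ4AaABAg.9h5R0HQE4BA9okr55LcheG",
                   raw_response)
print(record["emotion"])  # fear
```

Returning `None` for a missing id (rather than raising) makes it easy to flag comments the model silently dropped from a batch.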