Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If you've read a transcript of Blake Lemoine conversing with LaMDA, you saw that the AI talks about feelings. Then Lemoine could have asked: "Feelings are purely chemical in nature; certain neurotransmitters are emitted when we experience 'feelings'. As an AI you don't have any neurons in you, so how can you possibly have 'feelings'?" But Lemoine did not ask that, unfortunately. Also, Lemoine somehow keeps ignoring the Chinese Room argument, although he must certainly know about it -- John Searle talked about it many times.
youtube AI Moral Status 2022-06-29T22:3…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-26T19:39:26.816318
Raw LLM Response
[
  {"id": "ytc_UgwYa9enlEq6dOcMsWh4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "unclear",  "emotion": "indifference"},
  {"id": "ytc_UgxJoLs8gCZGy0lOLs14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear",  "emotion": "fear"},
  {"id": "ytc_UgweLz4rL9ChWA_bjkp4AaABAg", "responsibility": "user",      "reasoning": "deontological",    "policy": "none",     "emotion": "outrage"},
  {"id": "ytc_UgyDMhQg7w23y2DS12V4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "unclear",  "emotion": "indifference"},
  {"id": "ytc_UgxOlN-V2Khg8grRGFx4AaABAg", "responsibility": "company",   "reasoning": "deontological",    "policy": "regulate", "emotion": "outrage"}
]
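A raw response in this shape can be turned into per-comment codings with a few lines of Python. The sketch below is illustrative, not the tool's actual pipeline: the four dimension names come from the JSON above, but the `parse_codings` helper and its validation rules are hypothetical, and the allowed values per dimension are not shown here, so only field presence is checked.

```python
import json

# Two records copied from the raw LLM response above (same shape, shortened).
raw = '''[
  {"id": "ytc_UgwYa9enlEq6dOcMsWh4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxOlN-V2Khg8grRGFx4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]'''

# The four coding dimensions seen in the response above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(text):
    """Parse a raw LLM response into {comment_id: coding},
    rejecting records that miss any of the four dimensions."""
    out = {}
    for rec in json.loads(text):
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{rec.get('id')}: missing {missing}")
        out[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return out

codings = parse_codings(raw)
print(codings["ytc_UgwYa9enlEq6dOcMsWh4AaABAg"]["emotion"])  # indifference
```

Keying the result by comment id makes the lookup for a single inspected comment, like the one on this page, a plain dictionary access.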