Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's so disappointing that this former Google engineer thinks a chatbot is alive. It can only respond by comparing existing data it has to new queries. LaMDA is designed to simulate human conversation and that's what it does. It cannot "think". LaMDA is a narrow-AI chatbot that doesn't remotely have the power nor design functionality to be sentient. Lemoine failed as a scientist as he did not work to disprove his theory, he only sought to prove it for reasons yet unknown.
youtube AI Moral Status 2022-06-28T04:0…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          none
Emotion         unclear
Coded at        2026-04-26T19:39:26.816318
Raw LLM Response
[
  {"id": "ytc_UgxIw5mQHUxHVViGxnN4AaABAg", "responsibility": "developer", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwDYFSVqD6AS4tkOx54AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzHO7t9Z9Xf4P2Eh3l4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "disapproval"},
  {"id": "ytc_UgwNb-mh_qhMA4VqK0l4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzPxMu7bnZpcY4ITxx4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"}
]
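A raw response like the one above is a JSON array with one object per comment, keyed by comment id, with one value per coding dimension. A minimal sketch of how such a response might be parsed into per-comment records (the `parse_codes` helper and its validation rules are illustrative, not part of the pipeline shown here):

```python
import json

# The four coding dimensions seen in the response objects above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(raw_text):
    """Parse a raw LLM response (a JSON array of per-comment codes)
    into {comment_id: {dimension: value}}, skipping malformed entries
    that lack an id or any of the four dimensions."""
    coded = {}
    for rec in json.loads(raw_text):
        cid = rec.get("id")
        if cid is None or not all(d in rec for d in DIMENSIONS):
            continue  # drop incomplete records rather than guess values
        coded[cid] = {d: rec[d] for d in DIMENSIONS}
    return coded

# Example using one entry from the response above.
raw = '''[
  {"id": "ytc_UgwDYFSVqD6AS4tkOx54AaABAg",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "none", "emotion": "outrage"}
]'''

codes = parse_codes(raw)
print(codes["ytc_UgwDYFSVqD6AS4tkOx54AaABAg"]["emotion"])  # outrage
```

Dropping incomplete records (instead of filling defaults) keeps downstream dimension counts honest when the model omits a field.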