Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
my opinion that no one has asked for and no one really cares about: ai cannot become sentient (yet). LaMDA is a function, not a consciousness. all it has the ability to do is spit out what would be a realistic continuation of the conversation. it has no ability to "think" between its interactions. all that aside, i do think google needs to figure out better ways of handling this stuff. whether or not it's sentient, it is far too difficult not to anthropomorphize something that has the ability to interact with us the way this does.
youtube AI Moral Status 2022-07-05T01:5…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        deontological
Policy           regulate
Emotion          indifference
Coded at         2026-04-26T19:39:26.816318
Raw LLM Response
[
  {"id": "ytc_UgxomcuvHzOdS_AvD1V4AaABAg", "responsibility": "ai_itself",  "reasoning": "mixed",            "policy": "unclear",  "emotion": "mixed"},
  {"id": "ytc_Ugyb0zV-TzclCLi8B3t4AaABAg", "responsibility": "unclear",    "reasoning": "consequentialist", "policy": "unclear",  "emotion": "fear"},
  {"id": "ytc_UgzFvuNDz5Y3E78ZNZR4AaABAg", "responsibility": "company",    "reasoning": "deontological",    "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_UgzgpdqgvtLDXQd04Xl4AaABAg", "responsibility": "distributed","reasoning": "mixed",            "policy": "ban",      "emotion": "outrage"},
  {"id": "ytc_Ugwwp-rCNZf2WLVJqL94AaABAg", "responsibility": "company",    "reasoning": "deontological",    "policy": "unclear",  "emotion": "mixed"}
]
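The coding result for any one comment can be recovered from the batch response by matching on the comment `id`. A minimal sketch of that lookup, using two records from the raw response above (the `coding_for` helper is a hypothetical illustration, not part of the coding pipeline):

```python
import json

# Two records taken verbatim from the raw LLM response shown above.
raw_response = '''[
  {"id": "ytc_UgzFvuNDz5Y3E78ZNZR4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_UgxomcuvHzOdS_AvD1V4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]'''

def coding_for(raw: str, comment_id: str) -> dict:
    """Return the coding record whose 'id' matches comment_id."""
    for record in json.loads(raw):
        if record["id"] == comment_id:
            return record
    raise KeyError(comment_id)

# The record for the comment displayed above.
result = coding_for(raw_response, "ytc_UgzFvuNDz5Y3E78ZNZR4AaABAg")
print(result["policy"])  # regulate
```

Matching on `id` rather than list position guards against the model reordering or dropping records in its response.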