Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Nope. GPT doesn’t collect information, it just learns logics and possibilities of meanings. It might store info for NASA, but not for any one like us) but overall, AI learnt to fantasize using our language. It just knows which info MORE LIKELY is appropriate for the requested situation.
youtube AI Moral Status 2025-07-11T15:3…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       mixed
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[{"id":"ytc_Ugywhza-eJH9l9aYdnV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
 {"id":"ytc_Ugz4-fdXNpizap8TNbF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugw-QC02oCFl-mZO0z94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgxFLNyYZsHz5mEdTOp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugzzn1JPFSyFeMadEP54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzIFUl8CxHbp_RejeF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyngXMhf8c_IOFA89p4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxyVXuQGPHKrv69oRd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgyFNGsRRUPxaQlwKi54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgyBj-EhpUS9q_TTyyh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]
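The raw response is a JSON array with one record per coded comment, each carrying an `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). As a minimal sketch of how such a response could be parsed to recover the codes for one comment, assuming only the field names visible in the raw output above (the function name and lookup logic are illustrative, not part of the pipeline):

```python
import json

# One record from the raw LLM response above, kept as a single-element
# example array so the sketch stays self-contained.
raw = ('[{"id":"ytc_Ugzzn1JPFSyFeMadEP54AaABAg",'
       '"responsibility":"ai_itself","reasoning":"mixed",'
       '"policy":"none","emotion":"indifference"}]')

def codes_for(raw_response, comment_id):
    """Return the coding dimensions for one comment id, or None if absent."""
    for record in json.loads(raw_response):
        if record.get("id") == comment_id:
            # Drop the id so only the dimension/value pairs remain.
            return {k: v for k, v in record.items() if k != "id"}
    return None

print(codes_for(raw, "ytc_Ugzzn1JPFSyFeMadEP54AaABAg"))
# {'responsibility': 'ai_itself', 'reasoning': 'mixed', 'policy': 'none', 'emotion': 'indifference'}
```

Looking up the comment id this way matches the Coding Result table above, since the table is simply the record for `ytc_Ugzzn1JPFSyFeMadEP54AaABAg` rendered as Dimension/Value rows.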