Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
IMO consciousness requires memory and change which LLMs do not have after they are already trained and static. During training and fine-tuning, or intervals between, that's another question. Train a base LLM and then fine-tune it on conversations that it then recalls in a new conversation as a slightly new LLM in the new moment. Is that consciousness?
youtube AI Moral Status 2025-10-30T22:5…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxxuL0rIDRv6S4onAp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxV2YgRxgdc1F1hK-R4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxxMcFp938sqEB2x6t4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugx51tCuxt7S0BiUp614AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgwYohzxjxoYmuBkcrV4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "industry_self", "emotion": "outrage"},
  {"id": "ytc_Ugy1pg6e_fFmqKOJTHF4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxFZpLLvJEtoqFWd654AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwRwTJYJFvGhe5WBGd4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxhB5pcpXVKzCtGOUx4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugx-BFV-_V6K0ci-9zt4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "industry_self", "emotion": "mixed"}
]
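To inspect the model output for a specific coded comment, the raw response can be parsed as JSON and indexed by comment id. This is a minimal sketch (not part of the original tool); the excerpt below reproduces the first two records of the raw response above, and the lookup key is the id of the comment shown in this section.

```python
import json

# Excerpt of the raw LLM response (first two records of the array above).
raw = """[
  {"id": "ytc_UgxxuL0rIDRv6S4onAp4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxV2YgRxgdc1F1hK-R4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"}
]"""

# Index the coded records by comment id for direct lookup.
coded = {rec["id"]: rec for rec in json.loads(raw)}

# Look up the comment coded in this section and read its dimensions.
record = coded["ytc_UgxxuL0rIDRv6S4onAp4AaABAg"]
print(record["responsibility"], record["emotion"])  # none indifference
```

The printed values match the Coding Result table above, which is the point of the raw-response view: each dimension in the table should trace back to exactly one field in the model's JSON.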