Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Or maybe we could just say its "effectively conscious" if more often than not the thing is deemed to be human. Who cares if its really conscious or not. All we need is for the a.i. to do the job we want it to. If that job is to entertain a person with conversation, then thats its job.
YouTube · AI Moral Status · 2023-11-02T12:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyfW7RRRkCO5ewEo4d4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxFruh7jNQSIzZUVQt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz-C_13xaashLHZQe14AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzXjTgkSNV1-RdL48R4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxFwX2EdS93mexyh-54AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyng-ki4qosUCCNg0V4AaABAg", "responsibility": "government", "reasoning": "virtue", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgztMfTOt2vLHD6lyTd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyD_BDqXF3ienVMjlZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugya-ZxUKYW9C2lwwSl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzo3Wln96TU9jk1cAR4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
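The raw response above is a JSON array in which each object codes one comment on four dimensions (responsibility, reasoning, policy, emotion) keyed by a comment id. A minimal sketch of how such a batch might be parsed and a single comment's coding looked up, assuming the response is valid JSON as shown (only the first record is included here for brevity):

```python
import json

# Raw batch response from the model: a JSON array of per-comment codings.
# Truncated to one record for illustration; the real response has ten.
raw = (
    '[{"id":"ytc_UgyfW7RRRkCO5ewEo4d4AaABAg",'
    '"responsibility":"ai_itself",'
    '"reasoning":"consequentialist",'
    '"policy":"none",'
    '"emotion":"indifference"}]'
)

records = json.loads(raw)

# Index the batch by comment id so one comment's coding is a dict lookup.
by_id = {r["id"]: r for r in records}

coding = by_id["ytc_UgyfW7RRRkCO5ewEo4d4AaABAg"]
print(coding["reasoning"])  # consequentialist
```

In practice the batch would be validated as well (e.g. checking that every object carries all four dimensions and that each value is from the codebook's allowed set) before the codings are stored.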