Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Intelligence doesn't necessarily need consciousness and in case of an AI the first without the second is way scarier than with it because you can't reason with it on an emotional level not to do us any harm but have to find a damn good logical reason for this which implicates the need of having compatible objectives and priorities. About this whole topic of intelligence/consciousness I can very much recommend reading "Blindsight" by Peter Watts.
Source: youtube · Video: AI Moral Status · 2023-08-22T11:2… · ♥ 9
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           unclear
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgypBLSw9V0MEW0BR1Z4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyzOtB5A_mjfQnR9PN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyYM3Lg8xtfFA4iWNx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxXOJFDCHRWBaLAHcd4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyXm-R0FyVK0d81HhR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugwqav9QMXgoMeMpSsF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyLF4NnG5S4lW210cV4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyZZIZKUJxLHqms0ot4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugwm05s1bHFBiFsLr1Z4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgwMRGhCEVM_YmmWq3J4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"}
]
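As a minimal sketch of how the per-comment coding above could be recovered from the raw model output: the response is a JSON array of objects keyed by comment id, so indexing it by `id` yields the four coded dimensions for any single comment. The one-element `raw` string below is an assumption for illustration, reusing the entry for the comment shown on this page; a real pipeline would pass the full response string.

```python
import json

# Raw LLM response: a JSON array of coded comments. Shortened here to the
# single entry that matches the comment displayed above (assumed input).
raw = (
    '[{"id":"ytc_UgxXOJFDCHRWBaLAHcd4AaABAg",'
    '"responsibility":"developer","reasoning":"consequentialist",'
    '"policy":"unclear","emotion":"fear"}]'
)

# Index the array by comment id so each comment's codes can be looked up.
codes = {row["id"]: row for row in json.loads(raw)}

row = codes["ytc_UgxXOJFDCHRWBaLAHcd4AaABAg"]
print(row["responsibility"], row["emotion"])  # → developer fear
```

Building the `id → codes` dict once keeps the lookup O(1) per comment when joining the codes back onto the original comment records.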