Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
First build something and then find places to use it. Happened with nuclear bomb and will happen with military automation. First they will say the system needs permission to fire lethal weapons and then they will say it is easier to ask the system not to fire lethal weapons for speed and efficiency. One day it will drop the weapon on people who gave it life.
youtube 2026-01-14T18:3…
Coding Result
Dimension: Value

Responsibility: developer
Reasoning: consequentialist
Policy: regulate
Emotion: fear
Coded at: 2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugzzyjoa-72S-DIUyyN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxRjMY-m5-iLFJ4AN14AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzFeUDBSd3h2T5aEvB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugzv92IEcaeseARGgUF4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy8KKBlsK_ICUENN854AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyY_7z9tzqPM9p3z9B4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxwtXxBArL8PfJrm0x4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgySE8pcbN0ntmH_UsB4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyKjdkKro3pv0IDt5N4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxeJI8kvMRJmax4OBx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
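The raw response above is a JSON array with one record per coded comment, keyed by comment id. A minimal sketch of how such a batch response can be parsed and a single comment's codes looked up by id, using two of the records shown above (the parsing approach is an assumption, not the tool's actual implementation):

```python
import json

# Excerpt of the raw LLM response above: two of the ten coded records.
raw = (
    '[{"id":"ytc_Ugzv92IEcaeseARGgUF4AaABAg","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"},'
    '{"id":"ytc_UgySE8pcbN0ntmH_UsB4AaABAg","responsibility":"developer",'
    '"reasoning":"deontological","policy":"regulate","emotion":"fear"}]'
)

records = json.loads(raw)

# Index records by comment id so any coded comment can be inspected directly.
by_id = {r["id"]: r for r in records}

codes = by_id["ytc_Ugzv92IEcaeseARGgUF4AaABAg"]
print(codes["responsibility"], codes["policy"])  # → developer regulate
```

Indexing by id matters because the order of records in a batch response is not guaranteed to match the order of the submitted comments.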