Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Once you implant an AI with the concept that a certain number of human lives (X) are expendable to achieve a military goal (Y), you're screwed.
youtube · AI Governance · 2023-07-08T19:3…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyAy3XGCJv98cwIcR14AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzvFkk24Kd8ucyX4kN4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwBFg1YW2OcWHoQCg54AaABAg", "responsibility": "user", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwYYdxVVmrxAJq0Rr54AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyB5LYxCJZ87dAFJVR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwRCTIXQQFGvI1-PQF4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxZ7pJvprrdkB4F1Od4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugwzg9RR5D5BGWAaJmZ4AaABAg", "responsibility": "government", "reasoning": "contractualist", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_UgyXNK7xuCCMQRUtcrl4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyCY2Z3vClcOIbSkfF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
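A response like the one above can be checked programmatically before the codes are stored. The sketch below is a minimal validator, assuming the four dimensions shown in the coding result and value sets inferred only from this sample batch; the function name `validate_batch` and the allowed-value lists are illustrative, not part of the original pipeline.

```python
import json

# Allowed values per dimension, inferred from this sample batch only;
# the real codebook may permit additional categories.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "user", "none",
                       "distributed", "government"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "unclear"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "approval"},
}

def validate_batch(raw: str) -> list:
    """Parse a raw LLM response and reject malformed or off-schema rows."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs in this dataset carry a "ytc_" prefix.
        if not row.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim}={row.get(dim)!r}")
    return rows

# Validate a one-row batch matching the first object above.
raw = ('[{"id":"ytc_UgyAy3XGCJv98cwIcR14AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
coded = validate_batch(raw)
print(coded[0]["emotion"])  # fear
```

Running the validator before ingesting a batch catches the common LLM failure modes for this task: truncated JSON, missing dimensions, and invented category labels.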