Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
All this AI crap and reasoning by the inventors reminds me of a clash I had during my compulsory military service in the late 80's with one of the Military Scientists that gave us a lecture about NBC Warfare. I asked that guy why are they developing all these, especially Bio and Chemical Weapons in the first place. And the very predictable and very unsatisfactory answer was - "So that we can create counter measures." I told that idiot that there wouldn't be a need for counter-measures if they would refrain from playing at being GODS. That guy, a Major with no sense of humour, pulled rank on me and threw me out of the lecturing hall. He also made a complaint against me with my CO. I think the charge was "un-militarily conduct" or something like that. I told them, fine, I quit and go home. Unfortunately my CO didn't make a big fuss about it for me to be discharged. I didn't even get a real dressing down.
youtube AI Governance 2024-01-17T15:1…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        deontological
Policy           unclear
Emotion          outrage
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_Ugym0eARlUxFgZ3-JDh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},{"id":"ytc_UgxB574rBySRm4oTMQF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},{"id":"ytc_UgyLcbP8mZgDHqhW1FN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},{"id":"ytc_Ugy6V99cAK_x4JsNCP14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},{"id":"ytc_UgxtvGJxWX1RDN-3qiB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},{"id":"ytc_UgyF_7XtI79e2ZkIfNZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},{"id":"ytc_UgzHusGj6aNDNR7vO0B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},{"id":"ytc_UgxeQl1HqxwJUOSuDDB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},{"id":"ytc_Ugw52pVXKRkcw5incDB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},{"id":"ytc_UgxmXyJ1MDO9oJjD_254AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
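To work with a raw response like the one above, it can be parsed as a JSON array and indexed by comment id. This is a minimal sketch, not part of the coding pipeline itself; the expected key set is inferred from the four dimensions in the table, and the two records are copied from the response above.

```python
import json

# The four coding dimensions plus the comment id, as shown in the result table.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

# Excerpt of the raw LLM response above (first and fourth records).
raw = '''[
  {"id": "ytc_Ugym0eARlUxFgZ3-JDh4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugy6V99cAK_x4JsNCP14AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]'''

records = json.loads(raw)

# Validate that every record carries exactly the expected dimensions.
for r in records:
    missing = EXPECTED_KEYS - r.keys()
    if missing:
        raise ValueError(f"record {r.get('id')} is missing keys: {missing}")

# Index by comment id so any coded comment can be looked up directly.
by_id = {r["id"]: r for r in records}
print(by_id["ytc_Ugym0eARlUxFgZ3-JDh4AaABAg"]["emotion"])  # outrage
```

The lookup for the comment shown on this page returns the same values as the result table (responsibility: developer, reasoning: deontological, policy: unclear, emotion: outrage).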