Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@nograndnarratives   I don't think that is accurate. Google's executives don't want to release their AI because it will harm their search engine revenues. Imagine searching for something on Google and that tells you what you need to know without clicking on any google ads or visiting any page that is filled with google ads? I personally started using GPT a lot more than Google search, and I am sure many people doing that too, google must be seeing a drop in their engine usage, releasing their AI would just put the last nail on the coffin for their search engine, their main money making machine. The fight is always about money, how can you make more money. In the movie, corporations (was only one in the movie, but represents all corporations greed) only focused on the money it can make of the asteroid, it doesn't matter how dangerous their approach is. This is what is happening now with AI, corporations are focused on how to make money off it, not thinking about the danger it can bring. "This could kill us" "Think about the money we can make"
youtube · AI Governance · 2023-05-03T04:4…
Coding Result
Dimension        Value
---------------  --------------------------
Responsibility   company
Reasoning        consequentialist
Policy           unclear
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_UgwPyjS5Er9lWoQyXxZ4AaABAg.9pDmayaD6pv9pEMvecVj_H", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgwPyjS5Er9lWoQyXxZ4AaABAg.9pDmayaD6pv9pF03TBIJS5", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgwPyjS5Er9lWoQyXxZ4AaABAg.9pDmayaD6pv9pF2ZVF6mtZ", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgxT5S-WYOoM2vUsYMd4AaABAg.9pDhwFy8C579pDqlk9uxvM", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_Ugw-9vPRWeTC3pBipfB4AaABAg.9pDfwtUYICr9pDjIWT564o", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytr_UgymxY8QjRMoJBLut4p4AaABAg.9pDdoyCMrC59pEZNXmxJRz", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgzufE3ZS8kU1FrzMAZ4AaABAg.9pDdOTeF8ku9pDzSP_Ss9R", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_UgwppiSYdI9e5krUzfJ4AaABAg.9pDd8dGKetG9pEYmIBX6TD", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgwppiSYdI9e5krUzfJ4AaABAg.9pDd8dGKetG9pEf-cGs4yu", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_Ugw8Yw5cZcM08icG5_p4AaABAg.9pDVtGGJaCu9pDcXBPYhRp", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
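The raw response is a JSON array of per-comment codes: each record carries a comment `id` plus the four coding dimensions shown in the table above (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and tallied, assuming the field names seen in this response (the tallying step is illustrative, not part of the coding pipeline):

```python
import json
from collections import Counter

# Two records copied verbatim from the raw LLM response above,
# truncated to keep the sketch short.
raw = '''[
  {"id": "ytr_UgwPyjS5Er9lWoQyXxZ4AaABAg.9pDmayaD6pv9pEMvecVj_H",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgymxY8QjRMoJBLut4p4AaABAg.9pDdoyCMrC59pEZNXmxJRz",
   "responsibility": "distributed", "reasoning": "contractualist",
   "policy": "regulate", "emotion": "fear"}
]'''

records = json.loads(raw)

# Basic sanity check: every record must carry the comment id
# and all four coding dimensions.
expected_keys = {"id", "responsibility", "reasoning", "policy", "emotion"}
for rec in records:
    assert expected_keys <= rec.keys(), f"missing fields in {rec.get('id')}"

# Tally one dimension across the batch, e.g. the emotion codes.
emotion_counts = Counter(rec["emotion"] for rec in records)
print(emotion_counts)
```

On these two records the tally is `Counter({'fear': 2})`; run over the full ten-record response it would surface the distribution of codes per dimension at a glance.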