Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The Trouble is corporations own most everything their spending many billions every year on AGI. Their bottom line is trillions that their investments can make. They don’t want to think in terms of safety. Elon Musk has been saying for years to slow things down, It’s not happening. If we don’t get AGI first someone else will. So there is your answer, follow the money. AI may be built by us but it won’t ever be controlled by us. I don’t understand how anyone can think we can control something that is 100 times more powerful then are smartest human. If we have the smartest human, say for example, this person can speak ten languages and is a master in math. This same person knows next to nothing about medicine. Well, AGI, say at the present time knows twice as much as this person in math. AGI also knows just about every language plus every other subject, and can out perform most every human in two years. We’re not controlling it know and it keeps gaining more and more data every day. Nvidia is talking about more and more data storage etc etc more memory needs billions of dollars and so it goes. I agree with Roman, big money speaks and their not going to allow safety as their first objective. I don’t think most billionaires can possibly think in such terms. Yes AGI could help humanity in so many wonderful ways, but it just might be as distrustful of us humans, as we are about ourselves. It’s hard not to see the mess us humans have got ourselves in, but you must be blind if you see it.
Source: youtube, 2024-06-11T02:2…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           regulate
Emotion          fear
Coded at         2026-04-27T06:26:44.938723
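
Every dimension is categorical. As a minimal sketch (not the project's actual codebook), the check below validates one coded record against the set of values that actually appear in the raw response that follows; the ALLOWED table and the function name are assumptions for illustration.

# Hypothetical sketch: ALLOWED lists only the values observed in this
# section's raw response; the real codebook may define more categories.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none"},
    "reasoning":      {"consequentialist", "mixed"},
    "policy":         {"regulate", "none"},
    "emotion":        {"fear", "approval", "outrage", "resignation"},
}

def invalid_dimensions(record):
    """Return the dimension names whose coded value is not recognized."""
    return [dim for dim, ok in ALLOWED.items() if record.get(dim) not in ok]

A record such as the one above passes (invalid_dimensions returns an empty list); a misspelled or out-of-schema code would be flagged by name.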
Raw LLM Response
[ {"id":"ytc_UgzpLlNGFc3YJuNeRux4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgwcmfqcgiBy3UKK_Dx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"}, {"id":"ytc_UgxOpm8Brpy_RzBvFLB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugx7WK25ydv724vvlfF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_Ugze3e9pdWk1-9ARvlB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}, {"id":"ytc_UgxXEKfbMZq_SFkchH14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgzjIvlLUvmrQQGjE6d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_UgxTA72GRTokAuYcYEl4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugy0wINNYX1bofNiRsZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_Ugz5pCVmEXXuxeS96mh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"} ]