Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI ain’t taking over nothing. Most companies are feeding miss information and hype because it makes them tons of money. But when things level out and you lose money in the stock you bought because of the hype it won’t matter because they got you. Have a look at the halting problem. Meaning: “ we need machines to help us do the things that the machine itself can’t do. “ Example: we need airplanes to do the things that the airplane itself can’t do. Can the airplane build another airplane; can the airplane feed itself; can we as humans fly nonstop 2k miles without the assist of any flying machine. Example2: I, myself need a human or humans to help me do things that I myself can’t do. Not sure if anyone can do this but I can’t write code and draft up and proofread a legal business contract at the exact same time. No one can perform heart surgery on themselves and it would be extremely foolish to represent yourself in a court of law. So, AI allows us more time to do things with our brains while it handles other things as an assistant. The brain needs rest and after you have studied, worked on projects or just been active your brain needs rest. If you did not know, your brain is a muscle. It needs rest. But with AI as an assistant with many things we don’t get to be lazy, we get to be effective and in some ways move faster. However, no matter what, some things take time.
youtube AI Moral Status 2025-07-24T07:0…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           none
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgySiHo-lXOBAz1eqTp4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxVNYOuaFW9tjaSf9V4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxM1IxVtSr5-ACdcc54AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzptDQn2YRbMel4swV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzZ_oGmzt0BJu7LC0h4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugx-zJkIrwlGUGAuLzh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxrzxqtE93cthrwhCV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyiOFc_t1naNAeO0N54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzr82dArMgbpwk6G9F4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugyb5NNdwoy6y9uHDi14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
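The raw response is a JSON array keyed by comment id, so the coded dimensions for any one comment come from the entry whose `id` matches. A minimal sketch of that lookup, assuming the response parses as valid JSON (the `code_for` helper is hypothetical; the id and field values below are copied from the entry matching this comment's coding result):

```python
import json

# Excerpt of the raw LLM response: a JSON array of per-comment codings.
raw_response = """[
  {"id": "ytc_UgzptDQn2YRbMel4swV4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]"""

def code_for(comment_id, raw):
    """Return the coding dict for one comment id, or None if it is absent."""
    rows = json.loads(raw)
    return next((row for row in rows if row["id"] == comment_id), None)

coding = code_for("ytc_UgzptDQn2YRbMel4swV4AaABAg", raw_response)
print(coding["emotion"])  # fear
```

A missing id yields `None` rather than an exception, which is useful when the model drops or mangles an id in its output.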