Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Okay, let's try to think about it. Let's be optimistic and assume we reach 50% unemployment, not the 99% we're talking about. At that point, how many people won't be able to afford a car? Sales would collapse catastrophically. Therefore, there would no longer be any need to produce many cars, profits would collapse, maintaining a production chain would become too costly, manufacturers would have to raise prices to compensate for the lost sales, and so they would sell even fewer. Producing cars would become pointless and uneconomical. Because while on the one hand, producers are needed, on the other, consumers are needed, those who buy the goods and services offered, regardless of whether they are produced by humans or robots. So wouldn't it be an economic system that would ultimately compensate itself? Humans are needed, because robots and AI are not consumers; without our human needs, artificial entities have no reason to exist. So the problem doesn't actually seem that serious to me. Or am I wrong? Please clarify my doubts!!!
youtube AI Governance 2025-10-25T16:1…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        unclear
Policy           unclear
Emotion          unclear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[{"id":"ytc_Ugyg6U2aA6M7auQodnR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},{"id":"ytc_UgxKQUiPErPeb-aNtVh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},{"id":"ytc_Ugyi1GRimpd9fW7aX_B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},{"id":"ytc_Ugw0q8J-LjqfP3vWEat4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},{"id":"ytc_Ugz9TMACLxJaV25b5Hd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},{"id":"ytc_UgzTaQ3KyBMDLOO2eaF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},{"id":"ytc_UgyJnueEQ8KzVEWoXKF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},{"id":"ytc_UgxbW2k8OlXDWmGoXP54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},{"id":"ytc_UgwElWTxLgiPk9MhxyF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},{"id":"ytc_UgwIz96HpTIk1y1cIA54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"]}
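Note that the raw response above is not valid JSON: the final object ends with `"approval"]}` instead of `"approval"}]`, so a standard parser rejects the whole array. That is consistent with every dimension in the coding table being recorded as "unclear". A minimal sketch of how a pipeline might guard against this failure mode (the function names, dimension tuple, and fallback behavior here are illustrative assumptions, not the project's actual API):

```python
import json

# Coding dimensions assumed from the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_batch_codes(raw: str):
    """Try to parse the model's JSON array of per-comment codes.

    Returns the parsed list, or None when the output is malformed
    (e.g. a stray ']' inside the last object, as in the raw response above).
    """
    try:
        codes = json.loads(raw)
    except json.JSONDecodeError:
        return None
    return codes if isinstance(codes, list) else None

def fallback_record(comment_id: str) -> dict:
    """Code every dimension as 'unclear' when parsing fails."""
    return {"id": comment_id, **{dim: "unclear" for dim in DIMENSIONS}}

# Reproduce the same syntax error: ']' appears before the object's '}'.
malformed = '[{"id":"ytc_example","responsibility":"none","emotion":"approval"]}'
if parse_batch_codes(malformed) is None:
    record = fallback_record("ytc_example")  # all four dimensions -> "unclear"
```

A stricter alternative would be to retry the model call with a repair prompt before falling back; the sketch only shows the parse-and-fallback step implied by the table.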