Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You don't have to worry about "AI" taking over or our anything. Tech giants are investing billions and billions in AI, because the Sunk Cost Fallacy. "AI" as we currently have it, are just LLMs. And LLMs are really neat! And they can do certain things very well! But they can't even do a half-decent job at low-level tech support, so for at least a decade, we are safe from "AI." Also; there's nothing wrong with robots working in factories. Does anyone want to work in a factory? Wouldn't you rather do something else? When robots do the menial tasks, then the humans are freed up to do that something else. Especially combined with Universal Basic Income, that will lead to a lot more happy people doing what they enjoy. Jobs change. When cars were invented, people complained about all the horse-cart drivers, who would be out of a job. Do you want to go back to horse-pulled carts? No, you do not. So why would you want truck-drivers and taxi drivers to still be around? The people making scary predictions about AI are generally people trying to earn money with AI.
Source: youtube · AI Jobs · 2025-10-09T09:4…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzkHVbGd_287ZGAuvd4AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_Ugwx21RIBhe6Wg3dQDt4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgzWrQ8oZPodB4zh0Dh4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"},
  {"id": "ytc_Ugy4bxjzfscTUhgax8h4AaABAg", "responsibility": "company",   "reasoning": "contractualist",   "policy": "regulate",  "emotion": "approval"},
  {"id": "ytc_Ugw8G5rLHZLP4K7tQW94AaABAg", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgwP9ARBpJfvTpepnMR4AaABAg", "responsibility": "none",      "reasoning": "mixed",            "policy": "none",      "emotion": "mixed"},
  {"id": "ytc_UgxWWoJOoNq3Qk4P6fB4AaABAg", "responsibility": "none",      "reasoning": "unclear",          "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgztkJGABowTr1UKAxZ4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "none",      "emotion": "fear"},
  {"id": "ytc_Ugy2Lmqm9oBkk03e_TR4AaABAg", "responsibility": "user",      "reasoning": "deontological",    "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_Ugz4iYObm6vi0wyCwN14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none",      "emotion": "fear"}
]
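The raw response is a JSON array with one object per comment, keyed by comment `id`. A minimal sketch of how the coding for a single comment could be pulled out of such a response (the `code_for` helper and the truncated sample array below are illustrative, not part of the actual pipeline):

```python
import json
from typing import Optional

# Illustrative excerpt of a raw LLM response in the format shown above:
# a JSON array of per-comment coding objects.
raw_response = """[
  {"id": "ytc_Ugwx21RIBhe6Wg3dQDt4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]"""

def code_for(comment_id: str, raw: str) -> Optional[dict]:
    """Return the coding object for one comment id, or None if absent."""
    for entry in json.loads(raw):
        if entry.get("id") == comment_id:
            return entry
    return None

result = code_for("ytc_Ugwx21RIBhe6Wg3dQDt4AaABAg", raw_response)
print(result["emotion"])  # indifference
```

A lookup like this makes it easy to cross-check the coded-result table against the exact model output, e.g. confirming that the comment above was coded with emotion "indifference".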