Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I'm sceptical about this.

1) Rapid job displacement at a pace faster than govt redistribution ultimately means rapid economic collapse through erosion of the consumer base.

2) It means complete dependency on 3 or 4 US hyperscalers and a product which doesn't work if they decide to disconnect it. That's huge leverage and risk concentration.

3) I'm simply not convinced AI is that capable. As a productivity tool harnessed by a human it is immense. As an autonomous tool it is, in some use cases, worse than code. I don't see it emulating human consciousness. There is no sign of that. Yes, improvement has been exponential, but the architecture itself is stable and it simply doesn't allow for much more recognisably human cognition than we currently have.

5) Model collapse. If a large proportion of training data is AI generated or, I don't know, state authored misinformation, there is currently no way to mitigate this that doesn't involve humans in the loop. Yeah, lots of theories about how it could be done, but nothing tangible.

6) Model capture. Other media is routinely used by proprietors to further views they are sympathetic to. Newspapers and social media sites can all be biased at the whim of a proprietor. There is literally no reason to assume AI platforms can't be captured in the same way and won't be in the future. They reflect prevailing cultural biases. Why not owner biases?
youtube AI Jobs 2026-02-18T14:1… ♥ 8
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  distributed
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxWBVcoGRY4aU6THnp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxhJL3xJzfVpZWSgKV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugyjed2_gzkPoeFGz0t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzVd3HarDTfHby9mzZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz8ABI8skVv2G2IfSN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugzl3SKKYSZ7aRayOPp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyaiRNPPCjlwTNZmDd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugy5rvENFz0OJCEgCVx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyibltRINo8ebq_99t4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwPw46r60dTvIHy8Hl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]