Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The main issue that was not asked is cost. This isn’t a case of a $30 subscription replacing a job. That subscription isn’t covering a fraction of the total cost of the operations. This isn’t just in money - LLM operations are far more power intensive than the operations they replace. Take search - an older Google search took 1/2 a watt of power to process. Gemini takes 5 watts. Future models use even more electricity. AI advances are being fueled by a huge debt bubble that assumes that not just all IT budgets will be consumed - but all budgets for everything. If we normalize LLM vs human into operation per calorie we will quickly see that this method of problem solving is not viable. The tech bros keep saying that the future will be here when the AI creates itself. Heh - news flash - it’s stalling and the AI companies are fighting over the tiny pool of engineers that can create the models. In other words - the coming tech stock crash is going to make the dot com crash look like a minor blip.
youtube AI Governance 2025-10-09T21:5…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           regulate
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwGt8zLGGwS5Ije1jF4AaABAg", "responsibility": "none",        "reasoning": "mixed",            "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgxiQYfjl57zhCfC2mh4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_UgwRshQakT0rFaInsLt4AaABAg", "responsibility": "ai_itself",   "reasoning": "unclear",          "policy": "unclear",   "emotion": "mixed"},
  {"id": "ytc_UgwffKRritCvaVzznPB4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_UgwrwFFnTn7pOdIY3Yl4AaABAg", "responsibility": "distributed", "reasoning": "mixed",            "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgwO-4lmXtEHQH9UTdx4AaABAg", "responsibility": "unclear",     "reasoning": "unclear",          "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_UgxAdHzEuJbdhJkfyEZ4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_UgzHlrNRc6D-WbPDJ1l4AaABAg", "responsibility": "unclear",     "reasoning": "consequentialist", "policy": "unclear",   "emotion": "fear"},
  {"id": "ytc_UgzEOebJy7HjwTjvLh54AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgyQsa3qSECOG1Nhw1l4AaABAg", "responsibility": "user",        "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
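A minimal sketch (not part of the original pipeline) of how a raw batch response like the one above can be parsed back into per-comment codes. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) match the JSON shown; the helper function name is hypothetical.

```python
import json

# Two records copied verbatim from the raw LLM response above.
RAW = '''[
  {"id": "ytc_UgwGt8zLGGwS5Ije1jF4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxiQYfjl57zhCfC2mh4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

def codes_by_id(raw: str) -> dict[str, dict[str, str]]:
    """Index a batch of coded records by comment id, dropping the id field."""
    return {rec["id"]: {k: v for k, v in rec.items() if k != "id"}
            for rec in json.loads(raw)}

codes = codes_by_id(RAW)
print(codes["ytc_UgxiQYfjl57zhCfC2mh4AaABAg"]["emotion"])  # fear
```

Looking up a comment id this way yields exactly the Dimension/Value rows displayed above for that comment.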