Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The reason we're short on compute these days is real-time inference for people who want to talk in real time to an AI chatbot. People underestimate just how much math is really happening. A single short query to a modern chatbot like Claude Opus is 600 teraflops - that's 600 trillion floating point calculations. And the worst part is that, due to self-attention (which is required for modern AI to work), these calculations require *all* the data to be in one place and readily available to the GPU processors - that's why VRAM bandwidth is actually more of a limiting factor than compute in modern workloads.

Here's an aside: suppose you had a math genius like Ramanujan, with pencil and paper, doing floating point operations like those seen in AI workloads, 8 hours a day, 5 days a week, for a 50-year career. His lifetime output? 12 megaflops. One query to Claude Opus would require 50 million Ramanujans working their entire lives doing calculations by hand.

Other commenters mentioned SETI@home, but if you distribute a calculation like this across multiple processors connected over the internet, you have now replaced the bandwidth of an on-chip VRAM bus and memory controller with a CAT6 cable and possibly the open Internet. That degrades performance to essentially zero. There are elegant solutions for hypervisor-controlled data centers with fast interconnects, but they require specialized software matched to highly specialized hardware; Grandma's old iPad isn't in the picture.
reddit Viral AI Reaction 1776614378.0 ♥ 3
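The arithmetic in the comment can be sanity-checked directly. This is a minimal sketch: the 8 h/day, 5 days/week, 50-year schedule and the 600-teraflop and 12-megaflop figures are taken verbatim from the comment; the hand-calculation rate is merely what those claims imply, not a measured number.

```python
# Back-of-envelope check of the figures quoted in the comment above.
# All inputs are the commenter's claims, not measurements.

HOURS_PER_DAY = 8
DAYS_PER_WEEK = 5
WEEKS_PER_YEAR = 52
YEARS = 50

# Total working seconds in the hypothetical 50-year career.
working_seconds = YEARS * WEEKS_PER_YEAR * DAYS_PER_WEEK * HOURS_PER_DAY * 3600

lifetime_ops = 12e6      # claimed lifetime output: 12 million operations
ops_per_query = 600e12   # claimed cost of one query: 600 trillion operations

# Hand-calculation speed implied by the 12-megaflop lifetime claim.
implied_rate = lifetime_ops / working_seconds

# Careers needed to match a single query.
careers_per_query = ops_per_query / lifetime_ops

print(f"working seconds in career: {working_seconds:,}")
print(f"implied rate: {implied_rate:.3f} ops/s (~1 op every {1 / implied_rate:.0f} s)")
print(f"careers needed per query: {careers_per_query:,.0f}")
```

The implied rate works out to roughly one floating point operation every 31 seconds of pencil work, and the ratio of 600 trillion to 12 million comes out to exactly the 50 million careers the comment cites.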
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-25T08:06:44.921194
Raw LLM Response
[
  {"id":"rdc_oh3h3dn","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"rdc_oh3jnee","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"rdc_oh3ltog","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"rdc_oh3tb1s","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"indifference"},
  {"id":"rdc_oh5k7o1","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
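The raw response parses as ordinary JSON, so tallying the coded dimensions across records is straightforward. A minimal sketch, with the record list copied from the raw response above (`json` and `collections.Counter` are standard library; the field names match the response exactly):

```python
import json
from collections import Counter

# The raw LLM response, copied verbatim from the record above.
raw = '''[ {"id":"rdc_oh3h3dn","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"rdc_oh3jnee","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}, {"id":"rdc_oh3ltog","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"}, {"id":"rdc_oh3tb1s","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"indifference"}, {"id":"rdc_oh5k7o1","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"} ]'''

records = json.loads(raw)

# Count how often each value appears per coding dimension.
tallies = {
    dim: Counter(r[dim] for r in records)
    for dim in ("responsibility", "reasoning", "policy", "emotion")
}

for dim, counts in tallies.items():
    print(dim, dict(counts))
```

For this batch, every record codes responsibility as "none", while the emotion dimension splits across indifference, mixed, outrage, and approval.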