Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Can anyone explain why data center cooling requires "fresh water?" It seems more logical to use closed-loop cooling with something that's a better thermal conductor, such as propylene or ethylene glycol, then water-to-air intercoolers (aka radiators) to remove the heat. The pumps and fans for those would be electric, and that could use solar and/or wind, depending on the datacenter location. Closed-loop cooling would be more reliable, since you won't introduce contaminants like minerals, which can clog the cooling systems over time, and a glycol would also serve as a lubricant for the pump and corrosion inhibitor for the piping and radiators.
Even using water-to-water intercoolers would be better, because you can use the fresh potable water supply to essentially use geothermal to cool the COOLANT, and then let the water continue downstream to end-use sources. In that scenario, the fresh water is only run across a clean radiator, never coming in contact with the glycol.
youtube · Cross-Cultural · 2025-09-03T13:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgwCzYIaUy49g1cOf8F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyyzhWmkaXTWHcGRIx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyrGg8zTq0V4MmvhL14AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw1tR6Fm4CO0Yt7hHJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxVFnw--u3uYbPEMRl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyoy2JxbaBk1khysup4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw8bNaIxngVa9Y5gT94AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwWAbZnY7W2bzB7Pw14AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzArZ0KaIbmCtAkdp14AaABAg","responsibility":"government","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzB-yDv3OAVdEULtFF4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
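The raw response above is a JSON array in which each object carries a comment `id` plus the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and indexed for lookup by comment ID — the two sample rows are taken from the response above, and the `codes` dict is a hypothetical name, not part of the tool:

```python
import json

# Two rows copied from the raw LLM response above (truncated to a sample)
raw = '''[
  {"id": "ytc_UgwCzYIaUy49g1cOf8F4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzArZ0KaIbmCtAkdp14AaABAg", "responsibility": "government",
   "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"}
]'''

# Index the coded rows by comment ID so any coded comment
# can be inspected directly, as the page header describes.
codes = {row["id"]: row for row in json.loads(raw)}

print(codes["ytc_UgzArZ0KaIbmCtAkdp14AaABAg"]["policy"])   # regulate
print(codes["ytc_UgwCzYIaUy49g1cOf8F4AaABAg"]["emotion"])  # approval
```

A dict keyed by `id` mirrors the tool's "look up by comment ID" behaviour; validating that each row contains all four dimension keys before indexing would catch malformed model output early.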