Way back I used a rather powerful Peltier module to cool my GPU to sub-zero temperatures.
A secondary watercooling loop removed the combined heat of the GPU + peltier (in)efficiency.
The GPU eventually died because I didn't insulate it against condensation.
How's the situation nowadays? I wonder why Peltier modules aren't used more widely: it should be easier to remove 100W of combined CPU + Peltier heat at 100C than just the 65W of CPU heat alone, since the larger temperature delta to ambient makes the heatsink work that much harder.
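The arithmetic behind that claim can be sketched roughly like this (all numbers are illustrative assumptions, not measurements; a heatsink is modeled by the usual lumped relation Q = dT / R_th):

```python
# Back-of-envelope check: is rejecting 100W at a 100C hot side easier than
# rejecting 65W at a 65C CPU die? "Easier" = the heatsink may have a higher
# (i.e. cheaper/smaller) thermal resistance R_th, using Q = dT / R_th.

T_ambient = 25.0  # C, assumed room temperature

# Case 1: direct cooling, CPU die capped at ~65C
Q_cpu = 65.0                          # W of CPU heat
R_needed_direct = (65.0 - T_ambient) / Q_cpu

# Case 2: Peltier between CPU and heatsink, hot side at ~100C
Q_total = 100.0                       # W = CPU heat + Peltier input power
R_needed_peltier = (100.0 - T_ambient) / Q_total

print(f"direct:  heatsink needs R_th <= {R_needed_direct:.2f} C/W")
print(f"peltier: heatsink needs R_th <= {R_needed_peltier:.2f} C/W")
```

With these assumed numbers the Peltier case tolerates a worse heatsink (0.75 vs ~0.62 C/W), so the raw heat-rejection side of the argument holds; the catch is everything else (condensation, the Peltier's poor COP, and keeping the cold side from overwhelming the pump).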
Has anyone seen / read something recently?