This idea just came to me...
While we already know that graphics cards eat much more power than CPUs,
why are the biggest graphics card coolers still way smaller than CPU coolers?
How is such a small cooler able to cool a GTX280?

Then I realized that gfx cards can run at much higher temperatures than CPUs.
The amount of heat dissipated by a particular object is proportional to the temperature delta, or difference, between the object and the thing that it is supposed to lose heat to - air in this case.
The temperature difference required to transfer heat at a rate of 1 W is called thermal resistance, measured in degrees per watt.
That is the strange °C/W figure you see on all decent HSF review sites. It is a much more accurate measure of performance than a raw temperature reading, which depends on the particular CPU and on the ambient temperature, which
always differs.
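
To make that concrete, here is a quick Python sketch of the arithmetic (the 78 W and 10-degree figures are just the rough review numbers I am quoting, not measurements):

# Thermal resistance: how many degrees the heatsink needs to sit above
# ambient to push one watt of heat into the air. Lower is better.
def thermal_resistance(delta_t_c, power_w):
    """Temperature rise above ambient (°C) divided by heat load (W)."""
    return delta_t_c / power_w

def temp_rise(power_w, resistance_c_per_w):
    """Predicted rise above ambient for a given heat load and heatsink."""
    return power_w * resistance_c_per_w

# Rough example: a top CPU HSF holding a 78 W chip 10 degrees above case air
r_cpu_hsf = thermal_resistance(10, 78)                   # about 0.13 °C/W
print(f"CPU HSF: {r_cpu_hsf:.3f} °C/W")
print(f"Rise at 100 W: {temp_rise(100, r_cpu_hsf):.0f} °C")
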
At twice the temperature difference, an object dissipates twice the amount of heat.
So if the air inside the case is 40 degrees and a CPU runs at 50 degrees (a 10-degree delta),
a gfx card running at 80 degrees (a 40-degree delta) dissipates 4 times the heat with the same heatsink. Put another way, it can dissipate the same amount of heat as the CPU with a heatsink 1/4 as effective, or twice the heat with a half-as-effective heatsink, which is probably what is happening now.
Since the most powerful CPU HSFs can cool a 78W processor with only a 10-degree difference, a powerful gfx card drawing 150W with a half-as-effective heatsink ends up at about a 40-degree difference - around 80 degrees in a 40-degree case.
Or let's go lower - a mid-range gfx card drawing 75W with a heatsink 1/4 as effective also lands at about 80 degrees.
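
Plugging those rough numbers in (again just a sketch using the illustrative figures above, not measured data):

# Illustrative figures from above, not measurements
case_air = 40                  # °C inside the case
r_cpu = 10 / 78                # best CPU HSF: 78 W at ~10 °C above case air

r_half = 2 * r_cpu             # GPU heatsink half as effective
r_quarter = 4 * r_cpu          # GPU heatsink a quarter as effective

high_end = case_air + 150 * r_half      # 150 W card
mid_range = case_air + 75 * r_quarter   # 75 W card

print(f"150 W card, half-as-good heatsink:   {high_end:.0f} °C")   # ~78 °C
print(f"75 W card, quarter-as-good heatsink: {mid_range:.0f} °C")  # ~78 °C
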
CPUs die at 80 degrees, so people will complain. (Anything over 60, 65 or 70 is already considered bad.)
GFX cards can go over 80. Even 100.
Hence the lousier heatsinks.
It all makes sense now.