How does FurMark work?

The program FurMark is commonly used as a stress test for graphics cards. It’s specifically designed to make them run at higher temperatures than normal. What I’d like to know is how it does this.

I realize, of course, that a CPU or GPU gets hotter the more calculations it performs. However, if I'm running Folding@Home on my GPU and the Nvidia System Monitor says the GPU is at 97% utilization, the temperature reads about 65 C. If I run FurMark instead, the temperature quickly climbs to 78 C or higher.

My question is: how can the specific type of calculations FurMark performs generate substantially more heat than the calculations Folding@Home performs?
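One way to see how this can happen: an OS or driver utilization counter only reports the fraction of cycles a unit was *occupied*, while heat tracks how many transistors actually toggle each cycle. A shader that stalls on memory half the time still reads as "busy," but dissipates far less power than one that packs math and texture work into every cycle. Here is a toy model of that distinction; the per-operation energy numbers are made up purely for illustration, not measured GPU figures:

```python
# Toy model (illustrative, invented numbers -- not real GPU data):
# "utilization" counts busy cycles, but heat tracks how much silicon
# switches per cycle.
ENERGY_PER_CYCLE = {
    "fma":   3.0,   # multiply-add keeps ALUs toggling: expensive
    "tex":   2.5,   # texture fetch work: also expensive
    "stall": 0.5,   # waiting on memory: "busy" to the OS, nearly idle silicon
}

def energy(trace):
    """Total energy for an instruction trace (arbitrary units)."""
    return sum(ENERGY_PER_CYCLE[op] for op in trace)

def utilization(trace):
    """Fraction of cycles the unit was occupied -- every op counts as busy."""
    return len(trace) / len(trace)  # always 1.0: no idle cycles in the trace

# Folding-style trace: bursts of math separated by memory waits.
folding = ["fma", "stall", "stall", "fma", "stall", "stall"] * 100
# FurMark-style trace: math and texture work packed into every cycle.
furmark = ["fma", "tex", "fma", "tex", "fma", "tex"] * 100

# Both workloads show 100% utilization...
assert utilization(folding) == 1.0
assert utilization(furmark) == 1.0
# ...but the densely packed mix dissipates roughly twice the energy.
print(energy(folding), energy(furmark))  # -> 800.0 1650.0
```

The point of the sketch is just that "97% utilization" and "maximum heat output" are different quantities; a stress test is free to choose an instruction mix that maximizes the second, not the first.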

Off to General Questions.

Per this, it does not sound like Folding@Home is necessarily all that wildly stressful for a modern GPU.

Re FurMark