My huge G5 tower with a 1000W power supply sleeps at 5W - “off” is around 2W.
I never turn off my machines - my kitchen computer has been on (and not been rebooted) for almost 3 years, although it has spent most of that time asleep.
If you leave a computer on all the time, there’s one set of things trying to kill it. Anything that moves (fans, disk drives) wears out faster while the machine is running, and the integrated circuits themselves slowly degrade with use.
If you turn the computer off, or let it sleep, you end up with a different set of things trying to kill it. The biggest of these is thermal cycling: as parts heat up and cool down, the different materials expand and contract at different rates. That can lead to part failures, the most common being that the itty-bitty bond wires inside the chips lift off their pads internally and the part dies.
Figuring out which of these effects is worse isn’t trivial, and you can’t just say letting the computer sleep is better because it saves on wear and tear. It’s entirely possible that you are doing it more damage by letting it heat up and cool down.
Also, how well a computer sleeps has nothing at all to do with the OS; it depends on the hardware design of that particular computer. In general, laptops draw somewhere between about 20 and 50 watts while running, and maybe 1 to 5 watts while sleeping - pretty close to “off”.
Minor nit: Beowulff’s 1000W supply isn’t drawing 1000W while running either (Beowulff probably knows this, but as written it reads like you’re comparing 1000 to 5). Actual draw is probably half that at most. The basic idea is right, though: the computer is pretty close to “off” while sleeping.
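To put rough numbers on this, here’s a quick back-of-the-envelope script. The wattages are the ones mentioned in this thread; the $0.15/kWh electricity rate is just an assumed figure for illustration.

```python
# Back-of-the-envelope yearly energy cost comparison.
# Wattages come from the figures in this thread; the
# electricity rate is an assumed value, not a quoted one.

HOURS_PER_YEAR = 24 * 365   # 8760 hours
PRICE_PER_KWH = 0.15        # assumed rate, USD per kWh

def yearly_cost(watts: float) -> float:
    """Cost of drawing `watts` continuously for one year."""
    kwh = watts * HOURS_PER_YEAR / 1000
    return kwh * PRICE_PER_KWH

for label, watts in [
    ("G5 asleep", 5),         # tower sleeping at ~5 W
    ("G5 'off'", 2),          # soft-off still draws ~2 W
    ("laptop running", 35),   # middle of the 20-50 W range
    ("laptop asleep", 3),     # middle of the 1-5 W range
]:
    print(f"{label:16s} ~${yearly_cost(watts):6.2f}/yr")
```

At those rates, the gap between “asleep” and “off” is only a few dollars a year, while leaving a laptop running full-time costs an order of magnitude more than letting it sleep.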