I am looking at a chart depicting the heat index and notice that, for the heat index to equal the air temperature, the humidity has to get lower and lower as the temperature goes up. This seems counterintuitive to me.
For example, for it to feel like 80 °F when the air is 80, the relative humidity must be about 45%. But if it hits 100, the relative humidity has to be down around 25% for it to feel like 100. And if it hits 120, the humidity has to be way down at 12% to feel like 120.
I would have expected there to be some baseline humidity level such that if the relative humidity is x%, the heat index would be the same as the actual temperature, regardless of what that temperature is.
Also, in some very humid or very arid climates, it would be unusual to have a heat index that matches the actual air temperature. Isn’t this all relative to the baseline conditions of where you’re located? Today’s Houston forecast is for 88 degrees at 62% relative humidity, and 61% is the average relative humidity for July. So 88 degrees at 62% in Houston is what 88 degrees feels like there. But the forecast is telling people it should feel like 96.
What is the underlying logic of how the heat index works mathematically?
It’s actually ridiculously complicated (PDF, subscription may be required), but the question you’re asking has a fairly straightforward answer: the heat index equals the actual air temperature when the vapour pressure of water in the atmosphere is 1.6 kPa, the wind speed is 2.5 m/s (about 5.6 mph), the overall atmospheric pressure is 101.3 kPa (the standard pressure at sea level), and you’re in the shade. (So no, the heat index isn’t measured relative to local norms.)

The vapour pressure here is essentially the “absolute humidity” rather than the “relative humidity”: it’s a measure of the actual amount of water in the air, rather than the amount relative to the maximum possible at that temperature.
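To give a sense of what “ridiculously complicated” looks like in practice: the US National Weather Service usually evaluates the heat index with the Rothfusz regression, a polynomial fit to Steadman’s full model. Here’s a minimal sketch (temperature in °F, relative humidity in percent), omitting the extra low-humidity and low-temperature adjustment terms the NWS tacks on:

```python
def heat_index_f(T, RH):
    """Rothfusz regression for the heat index (T in deg F, RH in percent).

    A rough fit to Steadman's model; the NWS applies further adjustments
    for low humidity and for mild temperatures, omitted here for brevity.
    """
    return (-42.379
            + 2.04901523 * T
            + 10.14333127 * RH
            - 0.22475541 * T * RH
            - 6.83783e-3 * T * T
            - 5.481717e-2 * RH * RH
            + 1.22874e-3 * T * T * RH
            + 8.5282e-4 * T * RH * RH
            - 1.99e-6 * T * T * RH * RH)

print(round(heat_index_f(88, 62)))  # -> 96
```

Plugging in the Houston forecast from your question, `heat_index_f(88, 62)` comes out at about 96, which is exactly where that “feels like 96” number comes from.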
You also note that the relative humidity needed for the heat index to equal the air temperature decreases as the temperature rises. This is because the baseline is defined at a fixed “absolute humidity” (that 1.6 kPa vapour pressure). Relative humidity is the ratio of the actual vapour pressure to the saturation vapour pressure at the current temperature, and the saturation value climbs steeply as the temperature rises, so the relative humidity corresponding to this particular “absolute humidity” goes down.
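You can check this against the numbers you read off the chart. A quick sketch, using the Magnus approximation for the saturation vapour pressure (my choice of approximation; Steadman’s paper uses its own psychrometric data):

```python
import math

def saturation_vapour_pressure_kpa(temp_c):
    """Magnus approximation for saturation vapour pressure over water (kPa)."""
    return 0.6112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def baseline_rh(temp_f, vapour_pressure_kpa=1.6):
    """Relative humidity (%) at which the fixed 1.6 kPa vapour pressure
    falls, i.e. roughly where the heat index equals the air temperature."""
    temp_c = (temp_f - 32) * 5 / 9
    return 100 * vapour_pressure_kpa / saturation_vapour_pressure_kpa(temp_c)

for t in (80, 100, 120):
    print(t, round(baseline_rh(t)))  # -> ~46%, ~24%, ~14%
```

The ~46%, ~24%, and ~14% it prints line up reasonably well with the 45%, 25%, and 12% you read off the chart; the small differences come from the chart’s own rounding and the regression used to generate it.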
As for why the heat index uses the “absolute humidity” rather than the relative humidity, I would guess it’s because the rate of evaporation from the skin is determined primarily by the absolute vapour pressure of water in the air, not by the relative humidity. Since evaporation is the primary way the body cools itself, that’s the relevant quantity if you’re trying to figure out how hot it “feels”.
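To make that concrete, here’s a rough sketch of the quantity that actually drives evaporation: the vapour-pressure gap between the skin and the air. The 35 °C skin temperature is just an assumed round figure, and the Magnus helper is repeated from the sketch above so this runs on its own:

```python
import math

def saturation_vapour_pressure_kpa(temp_c):
    """Magnus approximation for saturation vapour pressure over water (kPa)."""
    return 0.6112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

skin_c = 35.0                       # assumed typical skin surface temperature
e_skin = saturation_vapour_pressure_kpa(skin_c)   # ~5.6 kPa at the skin

# The same 50% relative humidity at two air temperatures gives very different
# ambient vapour pressures, and hence different evaporation-driving gradients.
for air_c in (27, 38):              # roughly 80 F and 100 F
    e_air = 0.5 * saturation_vapour_pressure_kpa(air_c)
    print(f"{air_c} C air: skin-to-air vapour pressure deficit = {e_skin - e_air:.2f} kPa")
```

At the same 50% relative humidity, the hotter air already carries much more water, so the gradient pulling sweat off the skin shrinks from about 3.8 kPa to about 2.3 kPa. That absolute gap, not the 50% figure, is what governs how fast you can shed heat.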