It’s probably not intuitive, but think of it this way:
Say the temperature is 80F and the dew point is 60F
So if a drop of water (1g) cools from 98F (your body temperature) to 80F (the air temperature), that 18F drop is about 10C, so the drop sheds roughly 10 calories
If the same drop (1g) evaporates instead, it carries away about 540 calories!! That’s roughly 50 times the cooling of the first case
So evaporation of water from your skin cools you more than an order of magnitude faster than just conductive cooling.
This is why, when the dew point is high (and so less water evaporates from your skin), your body can’t shed heat as fast.
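For anyone who wants to check the arithmetic, here’s a rough back-of-the-envelope sketch in Python. The constants are the usual textbook approximations (1 cal/g·°C for liquid water, ~540 cal/g latent heat of vaporization, which is the value at boiling; at skin temperature it’s closer to 580), so the ratio is only approximate:

```python
# Back-of-the-envelope: cooling a 1 g drop of water vs. evaporating it.

SPECIFIC_HEAT_WATER = 1.0        # cal per gram per degree C
LATENT_HEAT_VAPORIZATION = 540.0 # cal per gram (approx.; ~580 at skin temperature)

def f_delta_to_c(delta_f: float) -> float:
    """Convert a temperature *difference* from Fahrenheit degrees to Celsius degrees."""
    return delta_f * 5.0 / 9.0

skin_f, air_f = 98.0, 80.0
mass_g = 1.0

# Heat removed by letting the drop cool from skin temperature to air temperature
conductive_cal = mass_g * SPECIFIC_HEAT_WATER * f_delta_to_c(skin_f - air_f)

# Heat removed by evaporating the same drop
evaporative_cal = mass_g * LATENT_HEAT_VAPORIZATION

print(f"Cooling the drop to air temperature removes ~{conductive_cal:.0f} cal")
print(f"Evaporating the same drop removes ~{evaporative_cal:.0f} cal")
print(f"Evaporation carries away ~{evaporative_cal / conductive_cal:.0f}x more heat")
```

Running it gives about 10 cal vs. 540 cal, i.e. evaporation removes on the order of 50 times more heat per gram.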
Yes, that can be said, clinically. But really, at room temperature and above, you don’t need to worry about how close the two numbers are to each other to use dew point as a practical yardstick for how humid it feels. That’s the simple beauty of dew point vs relative humidity: it needs no qualifiers. A dew point in the 50s is comfortable, the 60s get progressively uncomfortable, and the 70s range from oppressive to fucking unbearable.
Below room temperature, especially well below, who gives a damn? It’s cool/cold: you’ll need a jacket or coat regardless of how “humid” it is or isn’t. But at, say, 85 degrees, a dew point of 53 would mean I could wear a long-sleeve button-down shirt and feel comfortable, at least out of direct sunlight. If, at the same temperature, the dew point were 75, I’d be in a loose-fitting short-sleeve shirt, and even then, even in the shade, I’d feel clammy and likely have a sweat splotch on my back the size of a basketball.
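If you want to play with the numbers yourself, here’s a quick sketch that estimates dew point from temperature and relative humidity using the Magnus approximation, then buckets it into the comfort bands described above. The humidity values in the example are made up just to roughly reproduce the two 85-degree scenarios:

```python
import math

# Magnus approximation constants (Alduchov & Eskridge form); accurate to well under 1 C
A = 17.625
B = 243.04  # degrees C

def dew_point_f(temp_f: float, rel_humidity_pct: float) -> float:
    """Approximate dew point (F) from air temperature (F) and relative humidity (%)."""
    t_c = (temp_f - 32.0) * 5.0 / 9.0
    gamma = math.log(rel_humidity_pct / 100.0) + A * t_c / (B + t_c)
    td_c = B * gamma / (A - gamma)
    return td_c * 9.0 / 5.0 + 32.0

def comfort(dew_point: float) -> str:
    """Rough comfort bands from the comment above (dew point in F)."""
    if dew_point < 60:
        return "comfortable"
    if dew_point < 70:
        return "progressively uncomfortable"
    return "oppressive to unbearable"

# Hypothetical humidity values chosen to land near the two 85-degree examples
for temp_f, rh in [(85, 35), (85, 70)]:
    dp = dew_point_f(temp_f, rh)
    print(f"{temp_f}F at {rh}% RH -> dew point ~{dp:.0f}F ({comfort(dp)})")
```

At 85F, roughly 35% relative humidity works out to a dew point near 54F (the button-down-shirt case), while roughly 70% puts it around 74F (the sweat-splotch case).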