As I left work early yesterday morning, it was already damn hot in New York: 90 Fahrenheit (32 for my fellow civilized comrades) before 10 in the morning, and the thermometer eventually climbed to just shy of 100 in the now-tropical oasis that is Central Park.
Of course, it’s also ridiculously humid, but there’s always room to impair your body’s evaporative cooling system a little more, and on my way home I passed dozens of maintenance guys determined to do just that.
You see, about every other block I’d pass some guy busy hosing down the sidewalk. Ostensibly these people are clearing debris from in front of their workplaces because they’re far too lazy to wield a broom, but in the process they’re completely saturating the concrete footpath with untold gallons of water.
Now, I could swear that the wet walkways felt much hotter than the dry thoroughfares, but I lack proof. I do however have a theory.
The wet sidewalk is darker, with a much lower albedo than nearby dry concrete, and thus absorbs more heat. The water it contains has to go somewhere, and it ain’t running down the drain. I hypothesize that as the concrete heats up, the water evaporates, creating a microclimate of higher humidity… directly in front of the building entrances where these schmucks have spent hours getting everything wet.
In other words, these hosers are making the heat and humidity that much worse!
But that’s just my theory, and while I hope it holds water, I’m not certain.
Can any meteorologists, geologists or climatologists back me up on this? Does hosing down a sidewalk actually increase relative humidity? Or is there a hole in my bucket?
I don’t know about the higher heat absorption. Water may appear to make the concrete darker, but it’s also adding reflectivity to the surface, since water is shiny.
But you don’t need a higher temperature to feel hotter. Just raising the relative humidity in that local area by having a lot of evaporating water is going to raise the heat index.
As a sample calculation: Weather Underground says the high in Central Park yesterday was 99 F. The average humidity was 57%. This results in a heat index of 122 F.
If you raise the local humidity to 65% (say, by leaving a wet sidewalk), the heat index jumps to 132 F. It may not actually be any hotter, but it feels a full 10 degrees more miserable, just from the marginal increase in relative humidity.
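For anyone who wants to check the arithmetic, here’s a minimal sketch using the NWS Rothfusz regression, the standard approximation behind most heat index tables. The coefficients below are the published NWS ones; Weather Underground may round slightly differently, hence the one-degree gap at 57%:

```python
# Heat index via the NWS Rothfusz regression (valid roughly for T >= 80 F, RH >= 40%).
def heat_index(t, r):
    """Approximate heat index in Fahrenheit given temperature t (F) and relative humidity r (%)."""
    return (-42.379 + 2.04901523 * t + 10.14333127 * r
            - 0.22475541 * t * r
            - 6.83783e-3 * t ** 2 - 5.481717e-2 * r ** 2
            + 1.22874e-3 * t ** 2 * r + 8.5282e-4 * t * r ** 2
            - 1.99e-6 * t ** 2 * r ** 2)

for rh in (57, 65):
    print(f"99 F at {rh}% RH -> heat index {heat_index(99, rh):.0f} F")
# 99 F at 57% RH -> heat index 123 F
# 99 F at 65% RH -> heat index 132 F
```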
Reminds me of the noodniks moving to Arizona from the east for the low humidity. So many of them missed their old green lawns and planted new ones in Arizona that the humidity is going up.
Very possible. Some of the highest dewpoints (a more absolute measure of humidity) occur in the Corn Belt region. Many stations in Iowa, during the peak of summer, will have temperatures in the 90s with dewpoints in the low 80s. This is all due to the evapotranspiration of corn, one of the most efficient transpiring plants there is. In the Central Park example above, 99 degrees at 57% relative humidity gives a dewpoint of about 80, most likely due to vegetation.
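For the curious, that dewpoint figure checks out with the Magnus approximation, a standard temperature-and-humidity-to-dewpoint conversion; the 17.625 and 243.04 constants are one commonly published pair, nothing specific to this thread:

```python
import math

def dewpoint_f(temp_f, rh):
    """Dewpoint in Fahrenheit via the Magnus approximation (temp_f in F, rh in %)."""
    t_c = (temp_f - 32) * 5 / 9                                  # work in Celsius
    gamma = math.log(rh / 100) + 17.625 * t_c / (243.04 + t_c)
    return (243.04 * gamma / (17.625 - gamma)) * 9 / 5 + 32      # back to Fahrenheit

print(f"{dewpoint_f(99, 57):.0f} F")  # -> 81 F, right around the 80 quoted above
```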
Bottom line: while wetting the sidewalk will cool the surface, the insolation will go into evaporating the water instead of heating the pavement. At high temperatures this added moisture is much more noticeable (as has been mentioned). Also, wetting the sidewalks won’t noticeably cool the air (probably just the sidewalk itself) due to wind and air movement, but it will likely increase the local moisture content. So, while I can’t actually prove your theory is correct, it makes a good deal of logical/meteorological sense.
The water evaporating from the sidewalk cools the sidewalk. The heat doesn’t just disappear; it’s carried away as latent heat in the water vapor, which passes you as it rises on convection. Picture yourself in a 100-degree environment standing on a 145-degree concrete surface. Dump water on the concrete and you’re going to be standing in a cloud of hot vapor.
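To put a rough number on how much heat that vapor hauls away, here’s a back-of-the-envelope sketch; the ten-gallon figure is purely hypothetical, and the constants are just the textbook latent heat of vaporization of water near ambient temperature and the mass of a US gallon:

```python
# Back-of-the-envelope: heat carried off when hosed-down water evaporates.
LATENT_HEAT_MJ_PER_KG = 2.4   # latent heat of vaporization of water near ambient temp
KG_PER_GALLON = 3.785         # mass of one US gallon of water

gallons = 10  # hypothetical amount one storefront might hose onto the sidewalk
energy_mj = gallons * KG_PER_GALLON * LATENT_HEAT_MJ_PER_KG
print(f"Evaporating {gallons} gallons carries off about {energy_mj:.0f} MJ")
# -> about 91 MJ, moved off the pavement and into the air as humid vapor
```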