Friday, May 29, 2020

Dew Point vs. Relative Humidity

If you've been outside today, you've probably thought to yourself: Wow! It's humid out! And that's quite true, as dew points throughout the region are in the upper 60s.

Here is a 1:00 p.m. Eastern time look at dew points across the country. Notice that the entire Eastern Seaboard is experiencing dew points in the 60s or 70s, making the air feel muggy and oppressive.
1:00 p.m. EDT dew points (Mesonet)

One meteorological concept I frequently hear misunderstood is the difference between dew point and relative humidity.

The dew point is the temperature to which air must be cooled to become saturated with water vapor. In other words, once the air cools to the dew point, it can hold no more water vapor, so any excess begins to condense as dew, fog, or cloud droplets. Generally speaking, the higher the dew point, the more "humid" it feels. Essentially, the dew point is a direct measure of how much moisture is in the air.

Relative humidity, on the other hand, is the humidity percentage you see on the Apple iPhone weather app. Put simply, relative humidity is a fraction relating how much water vapor is currently in the air (the numerator) to the maximum the air could hold at its current temperature (the denominator). If the dew point is 65 °F and the temperature is also 65 °F, the relative humidity is 100%, and you will likely see dew or fog forming.
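For readers who like to see the math, here is a minimal Python sketch of that fraction. It uses the Magnus approximation for saturation vapor pressure; the constants 17.625 and 243.04 are one common empirical fit, not something from this post:

```python
import math

def relative_humidity(temp_f: float, dew_point_f: float) -> float:
    """Approximate relative humidity (%) from air temperature and dew point in °F."""
    # Convert to Celsius; the Magnus constants below assume °C.
    t = (temp_f - 32) * 5 / 9
    td = (dew_point_f - 32) * 5 / 9

    # Magnus approximation: saturation vapor pressure ~ exp(a*T / (b + T)),
    # where a = 17.625 and b = 243.04 °C are empirical constants.
    a, b = 17.625, 243.04
    actual = math.exp(a * td / (b + td))   # vapor pressure at the dew point
    maximum = math.exp(a * t / (b + t))    # saturation pressure at the air temp
    return 100 * actual / maximum

# When the temperature equals the dew point, the air is saturated: RH = 100%.
print(relative_humidity(65, 65))  # 100.0
```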

This is one of the main reasons I find relative humidity misleading. Say the dew point is 65 °F, which falls in that muggy-to-oppressive category. If the temperature is 90 °F, the relative humidity works out to around 45%, which doesn't sound especially high. Yet with that temperature and dew point combination, it will certainly feel oppressive outside. This is why I find the dew point to be the best measure of humidity, and what we should report (not relative humidity, as the Apple iPhone app does).
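Plugging the numbers from that example into the same sketch confirms the point:

```python
# 90 °F with a 65 °F dew point: it feels oppressive outside,
# yet the relative humidity reads only about 44%.
print(round(relative_humidity(90, 65)))  # 44
```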
