When you watch your First Alert Forecast, you may hear the meteorologist reference the dew point temperature when describing how humid or how comfortable the air might feel to you. Viewers have asked us a fine question: "When discussing humidity or comfort levels, why don't you give the relative humidity instead of the dew point?" The reason: dew point temperature is a much more consistent and reliable gauge of your comfort level than relative humidity. In short, dew point is a fixed target; relative humidity is a moving target. Here is a more in-depth discussion...
WHY IS THE DEW POINT TEMPERATURE THE PREFERRED GAUGE OF COMFORT?
The dew point temperature is an absolute measure of moisture in the air at a given time. Think of it as the temperature to which you'd have to cool the air to get it to saturate. The lower the dew point temperature, the drier the air. The higher the dew point temperature, the more humid the air.
In southeastern North Carolina, dew point temperatures can dip well below 30 degrees in the wintertime, while summertime dew point temperatures usually soar to well over 70 degrees. In the attached graph, you'll notice that in situations when the dew point temperature exceeds 70, you almost certainly would say "it feels very humid" outside. But in situations when the dew point temperature dips below 55, you'd likely say "it feels crisp" outside. Of course, every person's tolerance to humidity is slightly different, but this scale of dew point with respect to comfort is almost universally applicable for southeastern North Carolina.
SO WHY IS RELATIVE HUMIDITY A LESS-RELIABLE GAUGE OF COMFORT?
Relative humidity is a much more variable, much less reliable gauge of comfort because it is, well, relative to temperature. When the air's temperature matches its dew point temperature, relative humidity is 100%. But, the higher the air's temperature rises above its dew point, the lower the relative humidity gets.
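This relationship can be sketched in a few lines of code. The function below is an illustrative approximation, not the exact method a forecast office uses: it estimates relative humidity from temperature and dew point with the Magnus formula for saturation vapor pressure (the constants 6.112, 17.625, and 243.04 are one commonly cited set; sources vary slightly).

```python
import math

def relative_humidity(temp_f, dew_point_f):
    """Approximate relative humidity (%) from air temperature and
    dew point in degrees Fahrenheit, using the Magnus formula for
    saturation vapor pressure (an approximation)."""
    # Convert Fahrenheit to Celsius
    t = (temp_f - 32) * 5 / 9
    td = (dew_point_f - 32) * 5 / 9

    # Magnus approximation: saturation vapor pressure in hPa
    def sat_vapor_pressure(c):
        return 6.112 * math.exp(17.625 * c / (c + 243.04))

    # RH is the ratio of actual vapor pressure (set by the dew point)
    # to saturation vapor pressure (set by the temperature)
    return 100 * sat_vapor_pressure(td) / sat_vapor_pressure(t)

# When temperature equals dew point, the air is saturated: 100%
print(round(relative_humidity(75, 75)))
# Same moisture, but the air is 22 degrees warmer: RH falls to about half
print(round(relative_humidity(97, 75)))
```

Notice that the dew point appears only in the numerator: raise the temperature while holding the dew point fixed and the relative humidity drops, even though the actual moisture in the air has not changed.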
Take a standard sultry August day in southeastern North Carolina as an example. Let's say there is a lot of moisture in the air on this day, and the air's dew point temperature is a stifling 75 degrees. At the dawn of this day, the air temperature is a balmy 75 degrees too. "75 over 75" produces a relative humidity of 100%. Now, in the afternoon, there is still a ton of moisture in the air, and the dew point temperature remains a sweaty 75 degrees. But, by that time, the actual temperature of the air has heated up with the daytime sun to a sizzling 97 degrees. "97 over 75" produces a relative humidity of only 50%. While the actual amount of moisture in the air has not changed from morning to afternoon, the relative humidity has dropped by half! Would you be any less inclined to break a sweat, need an extra glass of water, or seek an air-conditioned space in the afternoon versus the morning? No way! To you, the air on this summer day is consistently oppressively humid. The dew point temperature told that story consistently. The relative humidity did not.
Let's take another example. Picture a cool, crisp October day in southeastern North Carolina. In the morning, there are ribbons of fog on the horizon and dew on your grass. The temperature and dew point temperature match at 52 degrees. "52 over 52" produces a relative humidity of 100%. In the afternoon, the sun warms the temperature to a mild 72, but the dew point stays locked at a crisp 52. "72 over 52" produces a relative humidity of a much lower 50%! Notice that the relative humidity values in this October example exactly match those of the August example. But your comfort level is better determined by the other numbers: temperature (how warm it feels) and dew point temperature (how humid it feels).
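Both worked examples can be checked numerically. This is a rough sketch using the Magnus approximation for saturation vapor pressure (the constants are one common published set, not the only ones in use), and it reproduces the roughly 50% afternoon readings quoted above for both days.

```python
import math

def relative_humidity(temp_f, dew_point_f):
    """Approximate relative humidity (%) from temperature and dew point
    in degrees Fahrenheit, via the Magnus saturation-vapor-pressure formula."""
    def to_c(f):
        return (f - 32) * 5 / 9
    def es(c):
        # Saturation vapor pressure in hPa (Magnus approximation)
        return 6.112 * math.exp(17.625 * c / (c + 243.04))
    return 100 * es(to_c(dew_point_f)) / es(to_c(temp_f))

# August day: the dew point holds at a sweaty 75 while the temperature climbs
print(f"August morning:    {relative_humidity(75, 75):.0f}%")  # saturated air
print(f"August afternoon:  {relative_humidity(97, 75):.0f}%")  # roughly half

# October day: the dew point holds at a crisp 52
print(f"October morning:   {relative_humidity(52, 52):.0f}%")  # saturated air
print(f"October afternoon: {relative_humidity(72, 52):.0f}%")  # roughly half
```

Both afternoons land near 50% relative humidity even though the August air carries far more moisture, which is exactly why the dew point, not the relative humidity, tells you how sticky the day will feel.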