Why do meteorologists use dew points and not humidity in their daily reporting? Can you explain?
Meteorologists use both terms. While relative humidity and dew point both describe the amount of water vapor in the air, there are differences. Dew point is the temperature at which the air becomes saturated (100 percent relative humidity). It depends only on the amount of moisture in the air. As temperature rises, the air's capacity to hold water vapor increases roughly exponentially, which explains why you don't feel like a wet rag when air is saturated at 10 degrees, but you do at 75. Relative humidity is the percent of saturation at a given temperature. If air is at 100 percent relative humidity at 60 degrees but is heated to 93 degrees, its relative humidity falls to about 33 percent, though its dew point remains at 60 degrees.
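The relationship described above can be sketched numerically. A common way to do this (one assumption here, not stated in the column) is the Magnus approximation for saturation vapor pressure: relative humidity is then the ratio of the saturation pressure at the dew point to that at the air temperature. The function names below are illustrative.

```python
import math

def sat_vapor_pressure(temp_c: float) -> float:
    """Saturation vapor pressure in hPa via the Magnus approximation."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def relative_humidity(temp_f: float, dew_point_f: float) -> float:
    """Relative humidity (%) from air temperature and dew point, in Fahrenheit."""
    t_c = (temp_f - 32) * 5 / 9    # convert to Celsius
    td_c = (dew_point_f - 32) * 5 / 9
    return 100 * sat_vapor_pressure(td_c) / sat_vapor_pressure(t_c)

# The column's example: air saturated at 60 degrees, then heated to 93 degrees.
# The dew point stays at 60; the relative humidity drops to roughly a third.
print(round(relative_humidity(93, 60)))  # about 33
```

This reproduces the roughly 33 percent figure in the answer: heating the air raises its capacity to hold vapor while the actual moisture content, and hence the dew point, stays fixed.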