First on the agenda is why the dew point is a better indicator of how sticky it feels outside than relative humidity is.
Relative humidity is the ratio of the water vapor actually in the air to the amount of water vapor the air would need for complete saturation.
Dew point is a better "absolute" measure of the air's moisture content because it's the temperature to which we need to cool the air for saturation to occur.
If you've ever had an icy cold drink with moisture on the outside, that happens because the ice has cooled the glass down to or below the dew point, so moisture in the air condenses on the glass.
To better understand, let's look at two different air masses: one at 40 degrees and one at 90 degrees. So, one a cool day and the other a hot day. Forty-degree air cannot hold nearly as much moisture as 90-degree air can. Now let's say the relative humidity outside is 50 percent. Not too bad: that means the air is holding about half of the total amount of moisture it can. But just knowing the humidity percentage doesn't tell the whole story, because it's relative to the actual air temperature. Warmer air can hold more moisture than cool air, so "half full" means far more actual water vapor on the hot day than on the cool one.
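To put a number on "warmer air can hold more moisture," here's a short Python sketch using the Magnus approximation for saturation vapor pressure. The constants and the helper name `sat_vapor_pressure_hpa` are my own choices for illustration, not from the segment, and different references use slightly different Magnus constants:

```python
import math

def sat_vapor_pressure_hpa(temp_f):
    """Saturation vapor pressure (hPa) at a given temperature (deg F),
    using the Magnus approximation (constants 17.625 and 243.04 deg C)."""
    t_c = (temp_f - 32) * 5 / 9  # convert to Celsius
    return 6.1094 * math.exp(17.625 * t_c / (243.04 + t_c))

# Warmer air can hold far more moisture before saturating:
print(round(sat_vapor_pressure_hpa(40), 1))  # ~ 8.4 hPa on the cool day
print(round(sat_vapor_pressure_hpa(90), 1))  # ~ 48.1 hPa on the hot day
```

By this estimate, 90-degree air can hold roughly five to six times as much water vapor as 40-degree air, which is why the same 50 percent reading means very different amounts of moisture.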
So 50 percent humidity at 90 degrees is going to feel a lot worse than it would on a 40-degree day. The dew point on the 40-degree day would be about 23 degrees, while on the 90-degree day the dew point would be about 69 degrees.
The higher the dew point, the more uncomfortable it feels. So the dew point is a much more accurate measurement of the moisture that's in the air than relative humidity.
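The dew points in the comparison above can be estimated from temperature and relative humidity with the same Magnus approximation. This is a minimal sketch; the function name `dew_point_f` and the constants are my own, and published formulas vary slightly:

```python
import math

def dew_point_f(temp_f, rh_percent):
    """Approximate dew point (deg F) from air temperature (deg F) and
    relative humidity (%), via the Magnus formula."""
    t_c = (temp_f - 32) * 5 / 9  # convert to Celsius
    gamma = math.log(rh_percent / 100) + (17.625 * t_c) / (243.04 + t_c)
    td_c = 243.04 * gamma / (17.625 - gamma)
    return td_c * 9 / 5 + 32  # back to Fahrenheit

# Same 50 percent relative humidity, very different dew points:
print(round(dew_point_f(40, 50)))  # ~ 23 deg F, crisp and dry
print(round(dew_point_f(90, 50)))  # ~ 69 deg F, sticky
```

The hot-day result lands near 69 degrees, matching the segment; the cool-day result comes out around the low 20s, confirming that the same humidity percentage hides a big difference in actual moisture.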