Water boils at 100 degrees C… or does it?


Recently we decided to play around with measuring the boiling point of water. Of course, that should be easy: put some water in a jar, heat it until it boils, measure the temperature, and naturally you would expect to see 100 degrees C. Yet even allowing for the minor error in the most accurate probe, and even if you know that error and compensate for it, you might be surprised to find that 100 degrees C is impossible to achieve.


There are several reasons for this, but the most obvious is the purity of the water. Tap water is far from pure: many naturally occurring impurities are left in during treatment so that your tap water actually has some taste, not to mention the chemicals dissolved into it to keep it clean all the way to your tap. Any impurity will alter the boiling point, but by how much? To some degree that depends on where you live; water in different areas of the country (not to mention the world) has different levels of impurities, and in some cases different minerals dissolved in it, so the actual boiling point of tap water varies from region to region.
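For a rough sense of scale, the classic boiling-point elevation formula ΔT = i × Kb × m (with Kb = 0.512 K·kg/mol for water) gives an estimate. The sketch below is illustrative only: the solute concentrations are assumptions, not measurements of any particular supply.

```python
KB_WATER = 0.512  # ebullioscopic constant of water, K.kg/mol

def boiling_point_elevation(molality_mol_per_kg, van_t_hoff_factor=1.0):
    """Rise in boiling point (K): dT = i * Kb * m, valid for dilute solutions."""
    return van_t_hoff_factor * KB_WATER * molality_mol_per_kg

# Hard tap water: assume ~0.003 mol/kg of dissolved mineral salts, i ~ 2
print(f"hard tap water: +{boiling_point_elevation(0.003, 2):.4f} C")
# Seawater, for contrast: ~0.6 mol/kg NaCl, i ~ 2
print(f"seawater:       +{boiling_point_elevation(0.6, 2):.2f} C")
```

Even quite hard tap water shifts the boiling point by only a few thousandths of a degree, so while it does vary region to region, the effect is small next to the pressure variations discussed below.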


So let’s remove that variable from the equation. Let’s try water of very high purity, 99.99% pure in this instance. That should give us the magical 100 degrees C, but no, it doesn’t. Even a schoolchild can tell us that water boils at different temperatures depending on height: on top of Mount Everest, for example, water boils at about 70 degrees C. So your altitude above sea level will have an effect, and that’s a fairly easy calculation to do, as the sketch below shows. Yet there is still a good chance that after compensating for the probe error, using super-pure water and making an adjustment for altitude, you still won’t get the magical 100 degrees C. Why?
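Here is a minimal sketch of that calculation, assuming the ICAO standard-atmosphere model for pressure and an Antoine vapour-pressure fit for water. Real summit pressure on Everest runs a little above the standard-atmosphere value, so the true figure there is nearer 71 degrees C.

```python
import math

# A minimal sketch of the altitude correction, assuming the ICAO
# standard-atmosphere model for pressure and the Antoine equation
# (valid roughly 1-100 C) for water's vapour pressure.

def pressure_hpa(altitude_m):
    """Ambient pressure (hPa) at a given altitude, ICAO standard atmosphere."""
    return 1013.25 * (1 - 2.25577e-5 * altitude_m) ** 5.25588

def boiling_point_c(pressure_hpa_value):
    """Boiling point of pure water (C) by inverting the Antoine equation."""
    p_mmhg = pressure_hpa_value * 0.750062  # hPa -> mmHg
    return 1730.63 / (8.07131 - math.log10(p_mmhg)) - 233.426

for altitude in (0, 1000, 3000, 8848):  # sea level up to Everest's summit
    p = pressure_hpa(altitude)
    print(f"{altitude:>5} m: {p:7.1f} hPa -> boils at {boiling_point_c(p):5.1f} C")
```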


Well, we need to go back to Everest: why does the water boil at a lower temperature the higher you go? It’s not really anything to do with the height as such, but with the vapour pressure of the water. Water boils at the temperature where its vapour pressure equals the pressure of the air pushing down on it, so the lower the air pressure, the lower the boiling point, and of course the opposite is true. At 30,000 feet the amount of air above you is less than at, say, sea level (0 feet), so there is less air pushing down on you, less pressure, and the water boils at a much lower temperature. Absolute pressure then becomes another variable that we need to account for.
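To make that criterion concrete, here is a small sketch, again using the Antoine vapour-pressure fit for water and an illustrative ambient pressure of 313 hPa, close to the standard-atmosphere value near Everest’s summit:

```python
# A small sketch of the boiling criterion itself: water boils at the
# temperature where its saturation vapour pressure reaches the ambient
# pressure. Antoine coefficients for water, valid roughly 1-100 C.

def vapour_pressure_hpa(t_c):
    """Saturation vapour pressure of water (hPa) from the Antoine equation."""
    p_mmhg = 10 ** (8.07131 - 1730.63 / (233.426 + t_c))
    return p_mmhg / 0.750062  # mmHg -> hPa

AMBIENT_HPA = 313.0  # illustrative: near the standard-atmosphere value at Everest's summit

for t in (20, 40, 60, 70, 75, 100):
    p = vapour_pressure_hpa(t)
    verdict = "boils" if p >= AMBIENT_HPA else "does not boil"
    print(f"{t:>3} C: vapour pressure {p:7.1f} hPa -> {verdict} at {AMBIENT_HPA:.0f} hPa")
```

The crossing point sits between 70 and 75 degrees C at that pressure, which is why altitude appears to set the boiling point: it is really the pressure doing the work.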


Absolute pressure at any given point on the planet is not fixed, unlike gravity, which remains fairly constant. We all watch the weather and see areas of high and low pressure moving across the map; these highs and lows are pressure variations that move around the planet, and they are another variable in the equation. Surely the weather can’t affect the temperature of boiling water? Well, the ambient temperature doesn’t, assuming your test equipment is good enough, but changes in absolute air pressure do. With an accurate pressure gauge we can measure absolute pressure changes and calculate the difference they cause in the boiling point of water. At 998 millibars pure water boils at 99.591 degrees C, but at 1028 millibars it boils at 100.400 degrees C: almost a full degree of difference, and these pressure changes are not extreme, at least not in the UK. 998 millibars can become 1028 millibars within the course of a day quite easily.
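We can sanity-check those figures with the same inverted Antoine fit as above; a full steam-table formulation such as IAPWS would land closer to the quoted values, so treat the third decimal place with caution.

```python
import math

# Sanity check of the quoted figures: boiling point of pure water at the
# two absolute pressures mentioned, by inverting the Antoine equation.

def boiling_point_c(pressure_hpa):
    """Boiling point of pure water (C) at a given absolute pressure (hPa = mbar)."""
    p_mmhg = pressure_hpa * 0.750062  # hPa -> mmHg
    return 1730.63 / (8.07131 - math.log10(p_mmhg)) - 233.426

for p in (998.0, 1013.25, 1028.0):
    print(f"{p:7.2f} mbar: boils at {boiling_point_c(p):7.3f} C")
# -> about 99.58 C at 998 mbar and 100.40 C at 1028 mbar, within a few
#    hundredths of a degree of the figures quoted above.
```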


So with pure water, a known and compensated-for probe, an accurate absolute pressure reading, and a decent, calibrated and accurate temperature measurement system, you can get water to boil at 100 degrees C… but only on paper, which is why using boiling water to “check” a 100 degree C calibration is probably not a very good idea!
