On a recent trip down a long and lonely highway, I found myself being issued a warning from a courteous highway patrol officer for traveling 77 mph in a 75 mph zone.
But this got me thinking. With his radar gun (regularly calibrated and checked), he was able to determine my speed to probably quite a good precision - perhaps fractions of a mile per hour. So he knew how fast I was going. However, that doesn't mean that I knew how fast I was going.
There is a well-known rule in measurement: the precision of any measurement can't be better than half of the smallest increment on the measuring device. Picture a ruler. Maybe it's a good one, with increments marked down to every 16th of an inch. If you measured the length of something with this ruler, you couldn't claim to know that length to better than half of 1/16th of an inch. The same is true for your liquid measuring cup: if the divisions on the cup are ounces, then your volume is only known to half an ounce. Even if you can estimate the measurement more finely than that, it is invariably (and incurably) subject to that imprecision. Consider a meter stick, which has 1 mm divisions. If you measure the length of a piece of metal with it to be 179.3 mm, you would still have to report the length as 179.3 +/- 0.5 mm, because the precision of the meter stick is only half its smallest division (half of 1 mm).
The speedometer in my car has 5 mph increments. So even if, in practice, I can estimate from the position of the needle that I was going 77, I must report that number as 77 +/- 2.5 mph. Which means it's entirely possible that I was, in fact, going the speed limit, or even slightly under it (77 - 2.5 = 74.5). Because of the inherent imprecision of my speedometer, I simply can't know my speed to better than 2.5 mph.
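The half-increment rule is simple enough to put in a few lines of code. Here's a tiny sketch using the numbers from this story (the function name is my own):

```python
def reading_interval(reading, increment):
    """Return the (low, high) interval implied by an instrument whose
    smallest division is `increment`: the precision is half a division."""
    half = increment / 2.0
    return (reading - half, reading + half)

# My speedometer: a needle near 77 on 5-mph divisions.
low, high = reading_interval(77.0, 5.0)
print(f"77 mph on a 5-mph speedometer means [{low}, {high}] mph")

# The 75 mph limit falls inside that interval, so the reading alone
# can't rule out that I was at (or even under) the limit.
print(low <= 75.0 <= high)
```

The same function covers the meter-stick example: `reading_interval(179.3, 1.0)` gives the 179.3 +/- 0.5 mm result from above.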
That isn't the only problem I face in wishing to know my speed, though. While the highway patrolman's radar was probably calibrated recently, my speedometer may never have been calibrated (more likely, it was calibrated once, on the factory floor, when the car was brand new). All sorts of things can affect the calibration of a car's speedometer, including the size and shape of the tires, the car's age, and the device originally used to perform the calibration. This kind of uncertainty (an inaccuracy, as opposed to an imprecision) is referred to as systematic (recall my discussion of the faster-than-light neutrinos), and it's the hardest kind to find and quantify. Even if my speedometer says I'm going 75, I might be going 73, or 77. I might even be going 80. The only way to tell is to compare my result (the speedometer reading) with one or more simultaneous external results (like the patrolman's radar measurement).
In other words, I need an external reference to check my speedometer's accuracy, while its precision is a property of the speedometer itself; and both are needed to fully characterize how the speedo performs.
Fortunately, in this particular instance, I just happened to have a GPS unit with me. And the GPS unit agreed with both the radar and my speedometer: I was driving 77 mph. (In my defense, I was coasting down a small hill. If he'd seen me going up the hill instead, he'd likely have clocked me going 72.)
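Putting the two ideas together: an external reference can only confirm accuracy up to the precision of the instrument being checked. A minimal sketch of that consistency check, again using the numbers from this story (the function name is mine):

```python
def agrees_within_precision(reading, increment, reference):
    """True if an external reference measurement is consistent with the
    instrument's reading, given that the reading is only known to
    +/- half a division."""
    return abs(reading - reference) <= increment / 2.0

# Speedometer: 77 on 5-mph divisions; radar and GPS both read 77.
for name, ref in [("radar", 77.0), ("GPS", 77.0)]:
    print(name, agrees_within_precision(77.0, 5.0, ref))
```

A reference reading of, say, 80 mph would fail the check, since it sits outside the 77 +/- 2.5 interval the speedometer can actually support.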
So from a simple traffic warning, I was able to learn that my car's speedometer is accurate to better than 1 mph and precise to +/- 2.5 mph. My conclusion: don't let nerds go on road trips!