For this post, we will discuss the definition of accuracy for weighing scales and balances. Accuracy is often mistakenly used to mean precision. The correct definition is: “Accuracy is the ability to display a value that matches the ideal value for a known weight.” In simpler terms, it is how close the measured value is to the actual value. This picture best illustrates accuracy and how it differs from precision:
So an accurate balance that is not precise might display scattered values such as 49 g, 51 g, 50 g and 49.5 g for a 50 g weight, while an inaccurate balance would give results that stray further from the true value.
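The distinction can be made concrete with a little arithmetic. The sketch below uses hypothetical readings of a 50 g reference weight: the mean error (how far the average reading sits from the true value) reflects accuracy, while the spread between readings reflects precision.

```python
# Illustrative sketch: separating accuracy from precision using
# hypothetical readings of a 50 g reference weight.
from statistics import mean, stdev

TRUE_WEIGHT = 50.0  # grams

accurate_not_precise = [49.0, 51.0, 50.0, 49.5]  # centered on 50 g, but scattered
precise_not_accurate = [48.1, 48.2, 48.0, 48.1]  # tightly grouped, but offset

def describe(readings):
    """Return (mean error, spread) for a set of readings of the reference."""
    error = mean(readings) - TRUE_WEIGHT  # closeness to true value -> accuracy
    spread = stdev(readings)              # scatter between readings -> precision
    return error, spread

err_a, spread_a = describe(accurate_not_precise)
err_b, spread_b = describe(precise_not_accurate)

print(f"accurate-not-precise: mean error {err_a:+.3f} g, spread {spread_a:.3f} g")
print(f"precise-not-accurate: mean error {err_b:+.3f} g, spread {spread_b:.3f} g")
```

The first set is accurate (small mean error) but not precise (large spread); the second is the reverse, exactly the distinction drawn in the picture above.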
Why is accuracy important?
While people might think accuracy is the most important feature of a weighing scale or balance, in reality it is more like the sum of many variables in a formula, or one of many cogs in a wheel. It is important not only by itself, but because it can affect other variables. It doesn’t matter how precise your scale is if it isn’t accurate, and if its repeatability is flawed, its accuracy is too.

Keep in mind that most measurements obtained from scales and balances, no matter the industry, aren’t simply recorded. They are used for everything from analysis to creating recipes. If the data recorded is not accurate, none of the work that follows will be. The consequences can range from a failed science fair project to bad food or consumer dissatisfaction. For example, remember when it came out that Subway sandwiches weren’t the length advertised? Now imagine you bought 2 lbs. of meat at the supermarket, only to find that your home food scale says it’s more like 1.6 lbs. You will be angry, and possibly wonder if you were cheated before. Or say you get 2.3 lbs. instead; now the supermarket is losing a lot of money.

Baking and science require very specific quantities of ingredients, and an inaccurate weighing scale can mean a formula goes dangerously wrong for a chemist. For people and animals, being weighed inaccurately could mean a botched health diagnosis, or getting too much or too little medication. Entire studies could show misleading trends because of inaccurate measurements. Not only can this cost money, but it can be dangerous. Imagine a study says people need X amount of vitamin A in their diet when they really need Y. If the study is shared widely, people could change their intake to match the study and have their health suffer as a consequence.
How to keep your scale accurate
Accuracy is measured by a series of tests under controlled conditions. Tolerances (“an allowable amount of variation of a specified quantity”) are set and serve as a standard within which the results must fall. There are many tests that can help determine accuracy; remember that accuracy is not a standalone value, but rather the sum of these specifications. Earlier in the post, we mentioned that accuracy suffers when repeatability is flawed. That makes repeatability a testable variable. Linearity (how consistently the instrument performs across its whole weighing range) is another. Linearity and repeatability are the most common specifications used when determining and comparing accuracy between scales or balances. Keep your instrument’s applications and purpose in mind when conducting tests. Depending on your weighing needs, you might need to conduct more tests, or different ones. For example, creep (the change over time when the object being weighed stays on a load cell for a long time) is a significant factor in a farmer’s silos, but not so much in applications like shipping. It is one of many factors that ultimately affect accuracy, but it might also not be one that matters to you.
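A repeatability or linearity check can be reduced to a simple pass/fail comparison against a tolerance. The sketch below uses hypothetical readings and a hypothetical ±0.2 g tolerance; real tolerances come from the instrument’s specification or the standard that applies to your industry.

```python
# Sketch of two common accuracy checks. All readings and the tolerance
# are hypothetical; consult your scale's specification for real limits.
from statistics import stdev

TOLERANCE = 0.2  # grams (hypothetical)

# Repeatability: weigh the same 100 g reference weight several times
# and look at the scatter between readings.
repeat_readings = [100.1, 99.9, 100.0, 100.1, 100.0]
repeatability = stdev(repeat_readings)

# Linearity: weigh reference weights across the range and find the
# worst deviation between the true value and the displayed value.
linearity_points = {50.0: 50.1, 100.0: 100.1, 150.0: 149.85, 200.0: 200.1}
linearity_error = max(abs(shown - true) for true, shown in linearity_points.items())

print(f"repeatability: {repeatability:.3f} g "
      f"({'PASS' if repeatability <= TOLERANCE else 'FAIL'})")
print(f"worst linearity error: {linearity_error:.3f} g "
      f"({'PASS' if linearity_error <= TOLERANCE else 'FAIL'})")
```

Both numbers falling inside the tolerance is what “accurate” means in practice for these two specifications.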
Calibration is a great way to verify your scale or balance’s accuracy because, if you take proper care of the weights, you already know their exact value and can check the scale against it. So if your 1 g weight is measured as 1.2 g, you know you have a problem. Calibration weights are part of a well-controlled test environment: they can be used to check creep, and to measure repeatability and linearity, among other things. One thing to keep in mind is the sensitivity of your instrument. While industrial scales might be built to resist tough conditions, they need to be calibrated in those same conditions to take the various environmental factors into account. If you’re taking a precision balance into the field with you, when it usually stays in the laboratory, you should calibrate it in the field. Make sure to calibrate your balance before starting measurements, especially if you’re weighing in a new environment.
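The spot-check described above (a 1 g weight reading as 1.2 g) amounts to comparing a reading against a certified value within a tolerance. A minimal sketch, with a hypothetical helper name and tolerance:

```python
# Minimal sketch of a calibration spot-check: compare the scale's reading
# of a known calibration weight against its certified value. The function
# name and tolerance are hypothetical.
def check_calibration(certified_g, reading_g, tolerance_g):
    """Return (deviation, within_tolerance) for one calibration weight."""
    deviation = reading_g - certified_g
    return deviation, abs(deviation) <= tolerance_g

# The 1 g weight from the example above, shown as 1.2 g, is clearly off.
dev, ok = check_calibration(certified_g=1.0, reading_g=1.2, tolerance_g=0.05)
print(f"deviation {dev:+.2f} g -> {'OK' if ok else 'needs recalibration'}")
```

Running the same check with several weights spread across the capacity of the scale doubles as a quick linearity test.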
For optimal calibration and measurements, check out our Anti-Vibration Table.