Weighing accuracy

By scientific definition, the accuracy of a measurement system is how close a result comes to the true value or an accepted standard.

In our daily life, when we see a measurement like a speed of 25 kmph on a speedometer or a weight of 12.2 kg on a weighing scale, we regard the value as correct without thinking about the errors it may have. It is a common perception that ‘what we SEE is the CORRECT & ACCURATE value’.

This perception is further strengthened in today’s ‘digital’ indication age, where instruments produce a direct readout of numeric values, removing the ambiguity of analog indications such as the needle/pointer of earlier times. However, the fact remains that every measurement, whether made by an analog or a digital instrument, has an error, and we cannot say how accurate a measurement is unless we know the true value to compare it with.

Weighing equipment as a ‘system’

A measurement system can have many components, but there is always at least one critical component which dictates and limits the overall accuracy of the measuring system.

In modern-day electronic weighing systems, this most critical measuring component is the transducer, which converts the applied load into a proportionate electrical signal. The majority of weighing machines used for commercial purposes use strain-gauge based load cells as the transducer.

The voltage signal generated is processed and converted into digital form by the weighing instrumentation, commonly known as the ‘digitizer’, for displaying the weight and for further use. The weighing scale system needs to be calibrated using standard weights before being put to use. A weighing system cannot be more accurate than the load cell(s) used in it.

The role of least count

All measuring instruments have a calibrated range known as the ‘span’, with a minimum and a maximum limit. This range or span is a graduated scale, and the minimum value of a displayed graduation is the ‘least count’ or ‘resolution’ of the instrument.

For example, a weighing scale with a least count of 10 kg will show weight only in steps of 10 kg, i.e. if the weight of an object is measured as 1016 kg, the scale can show it only as either 1010 kg or 1020 kg. This has no bearing on whether the measurement of 1016 kg was correct; it is only about how the result is displayed. The least count of a scale can only be 1, 2 or 5, or a decimal multiple of these (10, 20, 50, and so on).
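As an illustration, the step-wise display behaviour described above can be modelled as rounding to the nearest multiple of the least count. This is a minimal sketch (the function name is ours, and real indicators may follow their own rounding rules near the halfway point):

```python
def display_weight(measured, least_count):
    """Round a measured weight to the nearest multiple of the least count,
    as a digital weight indicator would display it (illustrative only)."""
    return round(measured / least_count) * least_count

# A scale with a 10 kg least count can show 1016 kg only as 1010 or 1020 kg:
print(display_weight(1016, 10))  # -> 1020
print(display_weight(1014, 10))  # -> 1010
```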

The least count/resolution relates more to the readability of a weighing instrument than to its accuracy.

The Legal Metrology perspective

The majority of scales which we see and use every day, such as weighbridges, platform scales, bench/counter scales etc., are classified as non-automatic weighing machines by Indian Legal Metrology. These are further classified into four accuracy classes – I, II, III and IV – depending on the permissible errors of measurement, with class I being the most accurate and class IV the least.

Most of the scales used for ‘legal for trade’ purposes are certified to a minimum of class III.

All weighing machines used for trade purposes are required to be verified and stamped every year by Legal Metrology, according to their accuracy class.

The accuracy classes for scales

One important specification, valid for class I and class II only, is that the permissible error of the scale can be 1, 2, 5 or 10 times its least count. For example, a scale of 10 kg x 0.1 g can have a permissible error of 10 times the resolution, which is 1 g (10 x 0.1 g), i.e. a reading of 5000.1 g may have an error of up to 1 g.
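To make this arithmetic concrete, here is a small sketch. The helper function is purely illustrative (not from any standard); it simply multiplies the least count by the stated accuracy factor:

```python
def max_error_class_1_2(least_count, accuracy_factor):
    """Permissible error for a class I/II scale, taken (per the text above)
    as 1, 2, 5 or 10 times the least count. Illustrative helper only."""
    assert accuracy_factor in (1, 2, 5, 10), "factor must be 1, 2, 5 or 10"
    return accuracy_factor * least_count

# A 10 kg x 0.1 g scale with a 10x factor: readings may be off by up to 1 g
print(max_error_class_1_2(0.1, 10))  # -> 1.0
```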

This specification is not applicable to class III and class IV machines. For these machines, the accuracy of readings is between 0.5x and 1.5x of the scale resolution, or 1x on average (simplified for ease of understanding). For example, on a 50000 kg x 10 kg scale, a reading of 25050 kg may have a maximum error of 10 kg, i.e. the object’s true weight could be anywhere from 25040 kg to 25060 kg.

For class III machines, the accuracy is generally considered as +/- 1 division (least count).
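Under this simplified +/- 1 division rule, the plausible range for the true weight behind a given reading can be sketched as follows (an illustrative helper, not an official verification formula):

```python
def true_weight_range(reading, least_count, divisions=1):
    """Plausible true-weight interval for a class III/IV scale, assuming the
    simplified +/- `divisions` x least-count rule from the text above."""
    tolerance = divisions * least_count
    return reading - tolerance, reading + tolerance

# 50000 kg x 10 kg weighbridge showing 25050 kg:
print(true_weight_range(25050, 10))  # -> (25040, 25060)
```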
