The accuracy of dial thermometers is crucial in industrial processes and decision-making.

However, calibration is often overlooked in our daily use of temperature-measuring tools; many assume that temperature sensors are always accurate, which is a common misconception.

Whether in energy supply, medical and food technology, chemical and pharmaceutical industries, or shipping and automotive, precise and reliable temperature measurement is vital in all rigorously controlled processes. 

If there are doubts about the accuracy of dial temperature instruments, we recommend having them calibrated by a professional instrumentation supplier.

Interested in learning more about dial thermometer calibration?

Below, SJ Gauge introduces the Accuracy Class and Limits of Error of dial thermometers and explains why dial temperature instruments need calibration. We'll help you understand the calibration process through simple, concrete examples.

For more information, continue reading our article or contact us directly. Our professional sales team is always ready to provide the most suitable temperature measurement solutions for your applications!

Contents

1. What does "Accuracy" mean in dial thermometers? Understanding the definitions of terms related to temperature calibration.

  1. Accuracy

  2. Accuracy Class

  3. Limits of Error

2. What is temperature calibration? How should temperature calibration be conducted?

1. What does "Accuracy" mean in dial thermometers? Understanding the definitions of terms related to temperature calibration.

  1. Accuracy

What does "Accuracy" mean in dial thermometers? 

The accuracy of a dial thermometer refers to the degree of proximity, or deviation, between the measured value (the observed or read value) and its true value. The accuracy of a dial thermometer is expressed in ℃ by default.

 

  2. Accuracy Class

What does "Accuracy Class" mean in dial thermometers? 

According to BS EN 13190:2001 (British Standard), accuracy classes are divided into Class 1 and Class 2. Different temperature ranges (measuring ranges) and dial diameters correspond to different accuracy classes.

 

  3. Limits of Error

The indicated values must fall within the permissible error range, which is determined by the nominal range and the measuring range; each combination corresponds to an accuracy class and its limits of error.

The measuring range is established based on the temperatures commonly used in your application. It should extend over at least two-thirds of the nominal range, and it must never exceed the nominal range, as the short check below illustrates.
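As an illustration (a sketch of our own, not part of the standard), this range-selection rule can be expressed in a few lines of Python; every row of the table below satisfies it:

```python
# A minimal sketch (our own illustration, not taken from BS EN 13190:2001)
# of the range-selection rule: the measuring range must lie inside the
# nominal range and span at least two-thirds of it.

def measuring_range_ok(nominal: tuple[float, float],
                       measuring: tuple[float, float]) -> bool:
    """Check the two-thirds rule for (low, high) range pairs in degC."""
    nominal_span = nominal[1] - nominal[0]
    measuring_span = measuring[1] - measuring[0]
    inside = nominal[0] <= measuring[0] and measuring[1] <= nominal[1]
    return inside and measuring_span >= (2.0 / 3.0) * nominal_span

# Examples taken from rows of the table below:
print(measuring_range_ok((0.0, 100.0), (10.0, 90.0)))    # True
print(measuring_range_ok((-20.0, 40.0), (-10.0, 30.0)))  # True (exactly 2/3)
```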

BS EN 13190:2001 provides the following table (all values in ℃):

| Nominal Range | Measuring Range | Class 1 Limits of Error (±℃) | Class 2 Limits of Error (±℃) |
|---|---|---|---|
| -20 ... +40 | -10 ... +30 | 1 | 2 |
| -20 ... +60 | -10 ... +50 | 1 | 2 |
| -20 ... +120 | -10 ... +110 | 2 | 4 |
| -30 ... +30 | -20 ... +20 | 1 | 2 |
| -30 ... +50 | -20 ... +40 | 1 | 2 |
| -30 ... +70 | -20 ... +60 | 1 | 2 |
| -40 ... +40 | -30 ... +30 | 1 | 2 |
| -40 ... +60 | -30 ... +50 | 1 | 2 |
| -100 ... +60 | -80 ... +40 | 2 | 4 |
| 0 ... 60 | 10 ... 50 | 1 | 2 |
| 0 ... 80 | 10 ... 70 | 1 | 2 |
| 0 ... 100 | 10 ... 90 | 1 | 2 |
| 0 ... 120 | 10 ... 110 | 2 | 4 |
| 0 ... 160 | 20 ... 140 | 2 | 4 |
| 0 ... 200 | 20 ... 180 | 2 | 4 |
| 0 ... 250 | 30 ... 220 | 2.5 | 5 |
| 0 ... 300 | 30 ... 270 | 5 | 10 |
| 0 ... 400 | 50 ... 350 | 5 | 10 |
| 0 ... 500 | 50 ... 450 | 5 | 10 |
| 0 ... 600 | 100 ... 500 | 10 | 15 |
| 0 ... 700 | 100 ... 600 | 10 | 15 |
| 50 ... 650 | 150 ... 550 | 10 | 15 |
| 100 ... 700 | 200 ... 600 | 10 | 15 |

*The measuring range should extend at least two-thirds of the nominal range.
*If the nominal range exceeds the upper limit, the next temperature range should be applied.

For example,

  • Nominal range: 0 ... 100 ℃
  • Measuring range: 10 ... 90 ℃

According to the table above, the values are as follows:

Class 1

  • Limits of error: ±1 ℃

This means that when the actual temperature is 20 ℃, the indicated value must fall between 19 ... 21 ℃.

Class 2

  • Limits of error: ±2 ℃

This means that when the actual temperature is 20 ℃, the indicated value must fall between 18 ... 22 ℃.
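To spell out the arithmetic, here is a minimal Python sketch (our own illustration, not part of BS EN 13190:2001) that reproduces both bands:

```python
# A minimal sketch reproducing the worked example above: given a limit of
# error, compute the band of acceptable indicated values around the true
# temperature.

def acceptance_band(true_temp_c: float,
                    limit_of_error_c: float) -> tuple[float, float]:
    """Return the (lower, upper) acceptable indicated values in degC."""
    return (true_temp_c - limit_of_error_c, true_temp_c + limit_of_error_c)

# Nominal range 0 ... 100 degC, measuring range 10 ... 90 degC (table above):
print(acceptance_band(20.0, 1.0))  # Class 1: (19.0, 21.0)
print(acceptance_band(20.0, 2.0))  # Class 2: (18.0, 22.0)
```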
 

2. What is temperature calibration? How should temperature calibration be conducted?

Why do thermometers need calibration?

Ensuring precise temperature measurements is necessary for enhancing production efficiency, minimizing the chance of unplanned downtime, and meeting environmental as well as legal standards, all of which leads to optimized profits.

Even with the highest-quality temperature instruments, accuracy may diminish over time. Regular verification and calibration are recommended to uphold instrument reliability.

As a general rule, temperature-measuring instruments should be calibrated annually.

Drift of this kind can result in erroneous observational data. We advise clients to promptly send such instruments to a reliable provider, such as SJ Gauge, for temperature calibration and maintenance.

 

How should temperature calibration be conducted?

The British Standards Institution suggests that testing should be carried out using a test instrument with an accuracy of at least four times that of the thermometer being tested. The test instrument should be traceable to the National Metrology Standard Laboratory (Measurement Traceability System).
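As a simple illustration of this 4:1 accuracy requirement (a sketch of our own; the function and variable names below are not from any standard):

```python
# Illustration of the 4:1 test-accuracy ratio described above: a reference
# instrument is adequate only if it is at least four times as accurate as
# the thermometer under test.

def reference_is_adequate(limit_under_test_c: float,
                          reference_accuracy_c: float) -> bool:
    """True if the reference meets the 4:1 accuracy ratio."""
    return reference_accuracy_c <= limit_under_test_c / 4.0

# A Class 1 thermometer with +/- 1 degC limits of error needs a reference
# accurate to +/- 0.25 degC or better:
print(reference_is_adequate(1.0, 0.25))  # True
print(reference_is_adequate(1.0, 0.5))   # False
```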

 

The following are three common testing methods for the calibration of dial thermometers, as specified in BS EN 13190:2001:

a. The temperature detecting element under test shall be exposed for 20 min to a temperature corresponding to its maximum scale value, or +60 ℃, whichever is the greater.

b. The thermometer shall then be tested for accuracy and hysteresis using at least three temperatures selected at uniform intervals over the measuring range. The test shall be carried out under reference conditions, both up and down the scale.

c. In the case of thermometers which can be mounted at any angle, the variation in indication occurring when the thermometer is turned through 90 degrees about its longitudinal and transverse axes shall be determined.
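To show how the readings from test (b) might be evaluated, here is a Python sketch of our own. The standard specifies the test procedure, not this code; in particular, comparing the hysteresis against the same ± limit as the reading errors is an assumption made purely for illustration, and the readings below are hypothetical.

```python
# A sketch of evaluating test (b): for each test temperature we record one
# up-scale and one down-scale reading, then compare the reading errors and
# the hysteresis against the limits of error. Treating hysteresis against
# the same +/- limit is an assumption made here for illustration.

def point_within_limits(true_c: float, up_reading_c: float,
                        down_reading_c: float, limit_c: float) -> bool:
    """True if both reading errors and the hysteresis are within limit_c."""
    error_up = abs(up_reading_c - true_c)
    error_down = abs(down_reading_c - true_c)
    hysteresis = abs(up_reading_c - down_reading_c)
    return max(error_up, error_down, hysteresis) <= limit_c

# At least three temperatures at uniform intervals over a 10 ... 90 degC
# measuring range, with hypothetical readings for a Class 1 instrument:
test_points = [(25.0, 25.4, 24.8), (50.0, 50.6, 49.9), (75.0, 75.2, 74.5)]
print(all(point_within_limits(t, up, down, limit_c=1.0)
          for t, up, down in test_points))  # True
```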