Electrical Calibration


Electrical calibration refers to the process of verifying the performance of, or adjusting, any instrument that measures or tests electrical parameters. This discipline is usually referred to as DC and low-frequency electrical metrology. Principal parameters include voltage, current, resistance, inductance, capacitance, time and frequency. Other parameters, including electrical power and phase, also fall within this segment of metrology. Ratiometric comparisons are often performed to compare a known parameter against an unknown parameter of the same kind.

Electrical calibration involves the use of precise devices that evaluate the performance of key properties of other devices, called units under test (UUTs). Because these precise devices have thoroughly known performance characteristics compared to the UUT, it is possible to evaluate the UUT's performance and, where necessary, adjust it to identify or minimise errors. Typically, the performance of such precision devices should be four or more times better than that of the UUT.
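As a rough illustration, this 4:1 rule of thumb can be expressed as a simple check. The following is a minimal sketch; the function name and the accuracy figures are illustrative assumptions, not values from any standard:

```python
# Hypothetical check of the 4:1 rule of thumb described above: the
# reference device's accuracy spec should be at least four times
# tighter than that of the unit under test (UUT).

def meets_ratio(uut_spec: float, reference_spec: float, ratio: float = 4.0) -> bool:
    """Return True if the reference spec is at least `ratio` times
    tighter than the UUT spec (both in the same units, e.g. volts)."""
    return uut_spec >= ratio * reference_spec

# Example: a multimeter specified at +/-1.0 mV checked against a
# calibrator specified at +/-0.2 mV gives a 5:1 ratio, which passes.
print(meets_ratio(uut_spec=1.0e-3, reference_spec=0.2e-3))  # True
```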

These precision devices fall into two broad categories. Electrical signal sources are often referred to as either calibrators or standards. Precision measurement devices are often classified as precision digital multimeters, measurement standards, or ratio bridges.

How often should equipment be calibrated?

This depends on how important the measurements are to your product or service; the degree of wear and tear the instrument will experience in service; the stability of the instrument itself; and a review of existing calibration records to determine whether adjustment has been needed previously. OTC recommends a starting periodicity of 12 months for most instruments, with an increase in calibration frequency (to every 6 or 9 months) if adjustment is required, and a reduction in periodicity to every 2 years once a sequence of annual calibrations has shown that adjustment is not needed.
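The periodicity policy just described can be sketched as a simple decision rule. This is only an illustration of the logic above; the function name and history format are our own assumptions, not an OTC specification:

```python
# A minimal sketch of the calibration-interval policy described above.
# `history` lists recent calibrations, True where adjustment was needed.

def next_interval_months(current_months: int, history: list[bool]) -> int:
    """Suggest the next calibration interval in months."""
    if history and history[-1]:
        # Adjustment was needed last time: shorten the interval
        # (the text suggests 6 or 9 months; 6 is used here).
        return 6 if current_months >= 12 else current_months
    if len(history) >= 3 and not any(history[-3:]):
        # Several consecutive calibrations needed no adjustment: extend.
        return 24
    return current_months  # otherwise keep the 12-month starting cycle

print(next_interval_months(12, [False, False, False]))  # 24
print(next_interval_months(12, [False, True]))          # 6
```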

How is calibration done?

By checking the instrument against known reference standards that have themselves been calibrated through a chain of measurements traceable back to agreed international standards, the SI system of units: for example the volt, ampere, watt, metre and litre.
Do you need your equipment calibrated? Contact us for single-unit programme pricing.
Calibration is defined as an association between measurements: one made or set with a piece of equipment of known scale or accuracy, and another made in as similar a way as possible with a second piece of equipment. The piece of equipment or device with the known or assigned accuracy is called the standard.

Standards vary from country to country and by industry, while manufacturers designate their measurement criteria and recommend the frequency and level of calibration depending on industry requirements, how often the device is used and the specific application.

Some companies offer a pre-calibration test, checking equipment first to determine whether it is suitable for calibration, while others submit all equipment for calibration whether or not it is working properly.

In general use, calibration is often taken to include adjusting the output or indication of a measurement instrument to agree with the value of the applied standard, within a specified accuracy. Strictly, however, these are two distinct processes: calibration and adjustment. It is therefore important to understand exactly what service you require.

It is also important to understand what is being calibrated and how the calibration is performed. As an example, consider a digital thermometer that uses an external temperature probe. Many companies are surprised to learn that their calibration is performed using a simulated temperature value applied to the thermometer only: a test instrument is attached to the digital thermometer, a voltage equivalent to a specific temperature is applied, the result is recorded and the thermometer is considered calibrated.
Many users require, and probably expect, a more rigorous calibration that reflects real-world usage. Here, the preferred method is to test the digital thermometer and the temperature probe together as a system, using a real heat source. The value displayed by the system under test is then compared against the standard: the system with a known or assigned accuracy, as defined above.
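At its core, each point in such a system test reduces to comparing the UUT's displayed value against the standard's reading and judging the error against a tolerance. The sketch below illustrates this under assumed example values; the function name, readings and tolerance are all hypothetical:

```python
# A minimal sketch of one point in the system test described above:
# the thermometer and probe are read together against a reference
# standard in a real heat source, and the error is judged against
# the UUT's accuracy spec. All values below are made-up examples.

def calibration_point(standard_reading: float, uut_reading: float,
                      tolerance: float) -> tuple[float, bool]:
    """Return (error, in_tolerance) for one calibration point."""
    error = uut_reading - standard_reading
    return error, abs(error) <= tolerance

error, ok = calibration_point(standard_reading=68.02,  # reference value
                              uut_reading=68.35,       # UUT display
                              tolerance=0.5)           # spec at this point
print(f"error = {error:+.2f} degC, {'PASS' if ok else 'FAIL'}")
```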

Why is Calibration So Important?

Calibration defines the accuracy and quality of measurements recorded using a piece of equipment. Over time, results tend to ‘drift’, particularly with certain technologies and when measuring parameters such as temperature and humidity. To be confident in the results being measured, equipment must be serviced and its calibration maintained throughout its lifetime to ensure reliable, accurate and repeatable measurements.
The goal of calibration is to minimise measurement uncertainty by ensuring the accuracy of test equipment. Calibration quantifies and controls the errors or uncertainties within measurement processes, keeping them to an acceptable level.
For example, suppose a food product must be kept above 68°C and the instrument system you are using displays 68.8°C. Provided the system is calibrated to be accurate within 0.5°C at 68°C, you can be confident the food is safe: even in the worst case the true temperature is at least 68.3°C. If the system is only accurate to within 1°C, however, you cannot be certain the food’s temperature has been correctly controlled. Food is, of course, only one example of why a confirmed, calibrated level of accuracy is essential; manufacturing processes that require specific controlled curing temperatures are another, and the list goes on.
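The arithmetic behind this example is worth making explicit: subtracting the accuracy spec from the displayed value gives the worst-case true temperature, which is then compared with the limit. A short sketch, using only the figures from the example above (the function name is ours):

```python
# Worked version of the food-safety example above: subtract the
# system's accuracy spec from the displayed value to get the
# worst-case true temperature, then compare it with the 68 degC limit.

def proven_above_limit(displayed: float, accuracy: float,
                       limit: float) -> bool:
    """True if the reading proves the temperature exceeds the limit
    even in the worst case allowed by the accuracy spec."""
    return displayed - accuracy > limit

print(proven_above_limit(68.8, 0.5, 68.0))  # True: 68.3 > 68.0, safe
print(proven_above_limit(68.8, 1.0, 68.0))  # False: could be 67.8
```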

Carelabs