Unit 1 Introduction To Instrumentation-Part 2
Course Coordinator
Mrs. P. D. Khurpade
Unit I: Content- Lecture 3
• Performance characteristics : Static and Dynamic
Characteristics
• Examples
• Error in measurement and Calibration
• Industrial Sensor Specification
Learning outcomes:
At the end of lecture, students should be able to
1) State and describe major static characteristics for sensor or
instrument
2) Understand the process of calibration and subsequent
characterization of errors
3) Interpret typical industrial sensor specifications
Performance Characteristics of Instruments
1. Static characteristics
2. Dynamic characteristics
These indicate the capability of the instrument, including its limitations for a particular application.
1. Range
2. Span
3. Accuracy
4. Precision
5. Linearity
6. Sensitivity
7. Resolution
8. Drift (a slow change in output under constant input and operating conditions)
9. Hysteresis
• Many environmental factors affect drift: ambient temperature, pressure, supply voltage, stray electric or magnetic fields, wear and tear, corrosion, mechanical vibrations, etc.
Performance Characteristics of Instruments
Dynamic characteristics:
1. Speed of response
2. Fidelity
3. Lag
4. Dynamic error
Performance Characteristics of Instruments
$Precision \; P = \left(1 - \left|\dfrac{X_n - \bar{X}_n}{\bar{X}_n}\right|\right) \times 100$

where $X_n$ is the value of the n-th measurement and $\bar{X}_n$ is the mean of the set of measurements.
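The precision formula above can be sketched numerically; the readings below are illustrative values, not taken from the slides:

```python
# Precision of one reading against the mean of a set of repeated readings:
# P = (1 - |Xn - mean| / mean) * 100. Readings are illustrative.
readings = [101.0, 98.0, 102.0, 100.0, 99.0]

mean = sum(readings) / len(readings)   # mean of the set
xn = readings[0]                       # one measured value

precision = (1 - abs(xn - mean) / mean) * 100
print(round(precision, 2))  # 99.0
```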
For this mid-point value of input, the two output values on the increasing and decreasing runs are then used to calculate the hysteresis:

$Max. \; hysteresis \; as \; \% \; f.s.d. = \dfrac{Y_{mi} - Y_{md}}{Y_{max} - Y_{min}} \times 100$

where $Y_{mi}$ and $Y_{md}$ are the outputs at the mid-point input with increasing and decreasing input respectively, and $Y_{max} - Y_{min}$ is the full-scale output span.
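A numerical sketch of this hysteresis calculation, with illustrative values:

```python
# Max. hysteresis as % f.s.d. = (Y_mi - Y_md) / (Y_max - Y_min) * 100
y_mi = 5.2                 # output at mid-point input, increasing run
y_md = 4.8                 # output at mid-point input, decreasing run
y_max, y_min = 10.0, 0.0   # full-scale output limits

hysteresis_pct_fsd = (y_mi - y_md) / (y_max - y_min) * 100
print(round(hysteresis_pct_fsd, 2))  # 4.0 (% of f.s.d.)
```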
$Max. \; non\text{-}linearity \; as \; \% \; f.s.d. = \dfrac{\hat{N}}{O_{max} - O_{min}} \times 100$

where $\hat{N}$ is the maximum deviation of the output from the ideal straight line, and $O_{max} - O_{min}$ is the full-scale output span.
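The non-linearity figure can be sketched the same way, again with illustrative values:

```python
# Max. non-linearity as % f.s.d. = N_hat / (O_max - O_min) * 100
n_hat = 0.15               # max deviation of output from the straight line
o_max, o_min = 10.0, 0.0   # output range limits

nonlinearity_pct_fsd = n_hat / (o_max - o_min) * 100
print(round(nonlinearity_pct_fsd, 2))  # 1.5 (% of f.s.d.)
```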
Types of Errors
• Static errors are generally of three types:
1. Gross error (human mistakes)
2. Systematic errors
– Instrumental
– Environmental Errors
3. Random or accidental errors
Sources of errors:
1. Poor design
2. Poor maintenance
3. Insufficient knowledge of process parameters & design conditions
4. Certain design limitations
Gross errors
• Gross errors arise due to human mistakes, such as reading the instrument value before it reaches steady state, or mistakes in recording the measured data or in calculating a derived measurand.
• Parallax error in reading an analog scale is also a source of gross error.
• Careful reading and recording of the data can reduce gross errors to a great extent.
• At least three or even more readings should be taken.
Gross errors
[Figure: illustration of parallax error when reading an analog scale]
Systematic errors
• Systematic errors are those that affect all the readings in a
particular fashion.
1. Instrumental: these may arise for different reasons: shortcomings of the instrument or the sensor, improper design of the measuring scheme, or improper selection of the sensor.
2. Environmental: the sensor characteristics may change with
temperature or other environmental conditions.
• The major feature of systematic errors is that the sources of error are recognizable and can be reduced to a great extent by carefully designing the measuring system and selecting its components.
• Placing the instrument in a controlled environment may also help reduce systematic errors.
• They can be further reduced by proper and regular calibration
of the instrument.
Calibration
• Calibration is the application of a known value (electrical,
mechanical, etc.) to a device and determining that its
output accurately represents that value (at least within the
stated performance parameters of the device).
• Calibration may also include the activity of adjusting or
correcting the performance of a device in order to have its
output accurately represent its input.
• This leads to another question: what is a calibrator?
• A calibrator is a device of known accuracy which
generates (simulates) and/or measures a known value
(electrical, mechanical, etc.).
Calibration and error reduction
• An alternative way to reduce systematic error is to calibrate the instrument for different known inputs.
• Calibration involves comparing the measured value with readings from standard instruments, whose accuracy is in turn derived by comparison with the primary standards kept at standards laboratories.
• It is carried out by comparing its reading with those given by another
instrument that is adopted as a standard.
» National Standard
» Calibration Centre Standard
» Instrument manufacturer standard
» In-company standard
» The instrument with the user
• Calibration can be done for all the points; for actual measurement, the true value can then be obtained from a look-up table prepared and stored beforehand. This type of calibration is often referred to as software calibration.
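A minimal sketch of this software-calibration idea: true values for known inputs are stored in a look-up table beforehand, and intermediate readings are corrected by linear interpolation. The table values here are illustrative, not from any real instrument:

```python
import bisect

# (instrument reading, true value) pairs from a prior calibration run
table = [(0.0, 0.0), (2.1, 2.0), (4.3, 4.0), (6.2, 6.0), (8.1, 8.0)]

def corrected(reading):
    """Return the true value for a raw reading via the look-up table."""
    xs = [r for r, _ in table]
    i = bisect.bisect_left(xs, reading)
    if i == 0:
        return table[0][1]            # clamp below the table range
    if i == len(table):
        return table[-1][1]           # clamp above the table range
    (x0, y0), (x1, y1) = table[i - 1], table[i]
    # linear interpolation between the two bracketing calibration points
    return y0 + (y1 - y0) * (reading - x0) / (x1 - x0)

print(round(corrected(3.2), 3))  # 3.0
```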
Calibration frequency
The question is when to calibrate.
• The characteristics of an instrument change with time. So even if it is calibrated once, the output may deviate from the calibrated points with time, temperature and other environmental conditions.
• So the calibration process has to be repeated at regular intervals if the instrument is to give accurate values of the measurand throughout.
• Manufacturer-recommended calibration interval.
• Per requirements
• Monthly, quarterly, or semiannually.
• Annually
• National Standards:
• National Physical Laboratory in Great Britain
• National Bureau of Standards (now NIST) in the United States
• Alternatively, a more popular way is to calibrate the instrument at one, two or three points of measurement and trim the instrument through independent adjustments, so that the error at those points becomes zero.
• It is then expected that error for the whole range of
measurement would remain within a small range.
• These types of calibration are known as single-point, two-
point and three-point calibration.
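A two-point calibration of this kind can be sketched as solving for a gain and offset correction from two known applied inputs; the numbers below are illustrative:

```python
# Known applied inputs and the raw readings the instrument gave:
true_lo, raw_lo = 0.0, 0.3      # zero point
true_hi, raw_hi = 100.0, 99.1   # span point

# Trim so the error at both calibration points is zero
gain = (true_hi - true_lo) / (raw_hi - raw_lo)
offset = true_lo - gain * raw_lo

def trimmed(raw):
    """Apply the gain/offset trim to a raw reading."""
    return gain * raw + offset

print(trimmed(raw_lo), round(trimmed(raw_hi), 6))  # 0.0 100.0
```

Between the two points, the error is only expected to stay small, not to vanish; that is the residual non-linearity.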
Traceability
• Traceability is nothing but the knowledge of the full chain
of instruments involved in the calibration procedure of a
laboratory process instrument.
• Traceability is the ability to relate individual
measurement results through an unbroken chain of
calibrations.
• It shows that the instrument is calibrated using standard
instruments which are linked by a chain of increasing
accuracy up to the international standards.
• All process instruments are located at the lowest level of this chain, so their traceability depends on all the levels above.
• In your plant, you have many process instruments, such as
transmitters, that are calibrated regularly using measurement
standard or working standard.
• In the plant, the working standard is in turn calibrated against a higher-level reference standard.
• The highest level reference standard(s) of your plant are sent out
to an external calibration laboratory, preferably an accredited one,
to be calibrated.
• The external calibration laboratory will calibrate their references
to assure traceability to National Calibration laboratory, or similar.
• The National Calibration laboratories work with international-level laboratories and make international comparisons with each other, assuring that their calibrations are on the same level.
• The International level laboratories base their measurements on
international comparisons, international definitions and
realization of the International System of Units (SI system).
Instrument Calibration Chain
• International standards – International Bureau of Weights and Measures, France
• Laboratory standard
• In-company standard
• Working standards
• Process standards
Random error
• Random error: they affect the readings in a random way
(not exactly known).
• Random errors can never be corrected; they can only be reduced by averaging, or their limits can be estimated using statistical operations.
• If we measure the same input variable a number of times, keeping all other factors affecting the measurement the same, the same measured value will not be repeated; consecutive readings will rather differ in a random way.
• But fortunately, the deviations of the readings normally
follow a particular distribution (mostly normal
distribution) and we may be able to reduce the error by
taking a number of readings and averaging them out.
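The averaging idea above can be sketched with the standard library; the readings are illustrative and assumed to carry only random (roughly normally distributed) error:

```python
import statistics

# Repeated readings of the same input, under identical conditions
readings = [10.2, 9.8, 10.1, 9.9, 10.0, 10.1, 9.9]

mean = statistics.mean(readings)       # best estimate of the measurand
stdev = statistics.stdev(readings)     # spread of individual readings
sem = stdev / len(readings) ** 0.5     # standard error of the mean:
                                       # shrinks as more readings are averaged
print(round(mean, 3), round(stdev, 3), round(sem, 3))  # 10.0 0.141 0.053
```

The point of the last line is that the uncertainty of the average falls as 1/sqrt(N), which is why taking many readings reduces random error but never eliminates it.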
• A few statistical terms are often used to characterize the distribution of the measurements.
Microprocessor-based calibrators: these can perform calculations and produce diagnostics. They are useful in remote-area communication and in the calibration process.