
Calibration

From Wikipedia, the free encyclopedia


"Zeroing" redirects here. For method used by the U.S. government to
calculate foreign antidumping duties, see Zeroing (trade).
Calibration in measurement technology and metrology is the process of
comparison of measurement values delivered by a device under test with
a calibration standard of known accuracy. Such a standard could be another
measurement device of known accuracy, a device generating the quantity to
be measured such as a voltage, or a physical artefact, such as a metre ruler.
The outcome of the comparison can result in no significant error being noted
on the device under test, a significant error being noted but no adjustment
made, or an adjustment made to correct the error to an acceptable level.
Strictly, the term calibration means just the act of comparison, and does not
include any subsequent adjustment.
The calibrating standard is normally traceable to a national standard held by a National Metrology Institute, and can be a physical artefact for direct measurement, a custom-built device for indirect measurement, or a device similar to that under test.
Contents
1 BIPM Definition
2 Modern calibration processes
  2.1 Quality
3 Instrument calibration prompts
4 Basic calibration process
  4.1 Purpose and scope
  4.2 Frequency
  4.3 Standards required and accuracy
  4.4 Manual and automatic calibrations
    4.4.1 Manual
    4.4.2 Automatic calibration
  4.5 Process description and documentation
  4.6 Success factors
5 Historical development
  5.1 Origins
  5.2 Calibration of weights and distances (c. 1100 CE)
  5.3 The Industrial Revolution and the calibration of pressure
6 See also
7 References
8 External links
BIPM Definition
The formal definition of calibration by the International Bureau of Weights and
Measures (BIPM) is the following: "Operation that, under specified conditions,
in a first step, establishes a relation between the quantity values with
measurement uncertainties provided by measurement standards and
corresponding indications with associated measurement uncertainties (of the
calibrated instrument or secondary standard) and, in a second step, uses this
information to establish a relation for obtaining a measurement result from an
indication."[1]
This definition states that the calibration process is purely a comparison, but introduces the concept of measurement uncertainty in relating the accuracies of the device under test and the standard.
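To make the two-step definition concrete, a minimal sketch follows. The readings are invented and a simple straight-line relation is assumed for the instrument; the first step fits a relation between the standard's quantity values and the instrument's indications, and the second step uses that relation to obtain a measurement result from a later indication.

```python
import numpy as np

# Step 1: establish a relation between standard values and indications.
# Hypothetical data: reference values provided by the standard and the
# corresponding readings of the device under test.
standard_values = np.array([0.0, 25.0, 50.0, 75.0, 100.0])   # values of the standard
indications = np.array([0.8, 25.5, 50.3, 75.9, 100.6])       # device-under-test readings

# Assume a simple linear relation: indication = gain * value + offset.
gain, offset = np.polyfit(standard_values, indications, 1)

# Step 2: use that relation to obtain a measurement result from an indication.
def result_from_indication(indication):
    """Invert the fitted relation to report a corrected measurement result."""
    return (indication - offset) / gain

print(result_from_indication(60.2))   # corrected estimate of the measured quantity
```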
Modern calibration processes
The increasing need for known accuracy and uncertainty and the need to have consistent and comparable standards internationally has led to the establishment of national laboratories. In many countries a National Metrology Institute (NMI) will exist which will maintain primary standards of measurement (the main SI units plus a number of derived units) which will be used to provide traceability to customers' instruments by calibration.
The NMI supports the metrological infrastructure in that country (and often others) by establishing an unbroken chain from the top level of standards to an instrument used for measurement. Examples of National Metrology Institutes are NPL in the UK, NIST in the United States, PTB in Germany and many others. Since the Mutual Recognition Agreement was signed it is now straightforward to take traceability from any participating NMI, and it is no longer necessary for a company to obtain traceability for measurements from the NMI of the country in which it is situated, such as the National Physical Laboratory in the UK.
Quality
To improve the quality of the calibration and have the results accepted by outside organizations it is desirable for the calibration and subsequent measurements to be "traceable" to the internationally defined measurement units. Establishing traceability is accomplished by a formal comparison to a standard which is directly or indirectly related to national standards (such as NIST in the USA), international standards, or certified reference materials. This may be done by national standards laboratories operated by the government or by private firms offering metrology services.
Quality management systems call for an effective metrology system which includes formal, periodic, and documented calibration of all measuring instruments. ISO 9000[2] and ISO 17025[3] standards require that these traceable actions are to a high level and set out how they can be quantified.
To communicate the quality of a calibration, the calibration value is often accompanied by a traceable uncertainty statement to a stated confidence level. This is evaluated through careful uncertainty analysis.
Sometimes a DFS (Departure From Spec) is required to operate machinery in a degraded state. Whenever this does happen, it must be in writing and authorized by a manager with the technical assistance of a calibration technician.
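As a hedged illustration of the uncertainty analysis mentioned above (the component values below are invented), independent standard uncertainty contributions are commonly combined in quadrature and multiplied by a coverage factor, typically k = 2 for roughly 95 % confidence.

```python
import math

# Hedged sketch: combining uncertainty components for a calibration result.
# The component values are invented for illustration only.
components = {
    "reference standard": 0.010,   # standard uncertainties, in the measured unit
    "repeatability":      0.006,
    "resolution":         0.003,
}

combined = math.sqrt(sum(u**2 for u in components.values()))  # root-sum-of-squares
expanded = 2.0 * combined                                     # coverage factor k = 2 (~95 %)

print(f"combined standard uncertainty: {combined:.4f}")
print(f"expanded uncertainty (k=2):    {expanded:.4f}")
```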
Measuring devices and instruments are categorized according to the physical quantities they are designed to measure. These vary internationally, e.g., NIST 150-2G in the U.S.[4] and NABL-141 in India.[5] Together, these standards cover instruments that measure various physical quantities such as electromagnetic radiation (RF probes), time and frequency (intervalometer), ionizing radiation (Geiger counter), light (light meter), mechanical quantities (limit switch, pressure gauge, pressure switch), and thermodynamic or thermal properties (thermometer, temperature controller). The standard instrument for each test device varies accordingly, e.g., a dead weight tester for pressure gauge calibration and a dry block temperature tester for temperature gauge calibration.
Instrument calibration prompts
Calibration may be required for the following reasons:
- a new instrument
- after an instrument has been repaired or modified
- when a specified time period has elapsed
- when a specified usage (operating hours) has elapsed
- before and/or after a critical measurement
- after an event, for example:
  - after an instrument has been exposed to a shock, vibration, or physical damage, which might potentially have compromised the integrity of its calibration
  - sudden changes in weather
- whenever observations appear questionable or instrument indications do not match the output of surrogate instruments
- as specified by a requirement, e.g., customer specification, instrument manufacturer recommendation.
In general use, calibration is often regarded as including the process
of adjusting the output or indication on a measurement instrument to agree with the value of the applied standard, within a specified accuracy. For example,
a thermometer could be calibrated so the error of indication or the correction
is determined, and adjusted (e.g. via calibration constants) so that it shows
the true temperature in Celsius at specific points on the scale. This is the
perception of the instrument's end-user. However, very few instruments can
be adjusted to exactly match the standards they are compared to. For the
vast majority of calibrations, the calibration process is actually the comparison
of an unknown to a known and recording the results.
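As a hedged sketch of that distinction (the reference temperatures and readings below are invented), the error of indication and the corresponding correction can be determined at specific scale points and then applied to later readings; an instrument that cannot be physically adjusted would simply report the correction alongside the reading.

```python
# Hedged sketch: error of indication and correction at specific scale points.
# Mapping of true value (from the standard) to the thermometer's indication.
points = {0.0: 0.3, 50.0: 50.8, 100.0: 101.2}

# Error of indication = indication - true value; correction = -error.
corrections = {true: -(indicated - true) for true, indicated in points.items()}

def apply_correction(indication):
    """Correct a reading using the calibration point closest to it."""
    nearest = min(corrections, key=lambda t: abs(t - indication))
    return indication + corrections[nearest]

print(corrections)             # e.g. {0.0: -0.3, 50.0: -0.8, 100.0: -1.2}
print(apply_correction(49.9))  # about 49.1, using the 50-unit correction
```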
Basic calibration process
Purpose and scope
The calibration process begins with the design of the measuring instrument that needs to be calibrated. The design has to be able to "hold a calibration" through its calibration interval. In other words, the design has to be capable of measurements that are "within engineering tolerance" when used within the stated environmental conditions over some reasonable period of time.[6] Having a design with these characteristics increases the likelihood of the actual measuring instruments performing as expected. Basically, the purpose of calibration is to maintain the quality of measurement as well as to ensure the proper working of a particular instrument.
Frequency
The exact mechanism for assigning tolerance values varies by country and industry type. The measuring equipment manufacturer generally assigns the measurement tolerance, suggests a calibration interval (CI) and specifies the environmental range of use and storage. The using organization generally assigns the actual calibration interval, which is dependent on this specific measuring equipment's likely usage level. The assignment of calibration intervals can be a formal process based on the results of previous calibrations; a simple reactive rule of this kind is sketched after the quoted requirements below. The standards themselves are not clear on recommended CI values:[7]
ISO 17025[3]
"A calibration certificate (or calibration label) shall not contain any recommendation on the calibration interval except where this has been agreed with the customer. This requirement may be superseded by legal regulations."
ANSI/NCSL Z540[8]
"...shall be calibrated or verified at periodic intervals established and maintained to assure acceptable reliability..."
ISO 9001[2]
"Where necessary to ensure valid results, measuring equipment shall...be calibrated or verified at specified intervals, or prior to use..."
MIL-STD-45662A[9]
"... shall be calibrated at periodic intervals established and maintained to assure acceptable accuracy and reliability... Intervals shall be shortened or may be lengthened, by the contractor, when the results of previous calibrations indicate that such action is appropriate to maintain acceptable reliability."
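By way of illustration only, the sketch below implements one simple reactive interval rule of the kind permitted by the MIL-STD-45662A wording above: lengthen the interval after a run of in-tolerance results and shorten it after an out-of-tolerance one. The thresholds and scaling factors are invented, not taken from any standard.

```python
# Hedged sketch of a simple reactive calibration-interval rule (invented factors).
def next_interval(current_days, recent_results, shorten=0.7, lengthen=1.25,
                  streak_needed=3, min_days=30, max_days=730):
    """recent_results: list of booleans, True = found in tolerance, newest last."""
    if recent_results and not recent_results[-1]:
        new = current_days * shorten          # out of tolerance: shorten the interval
    elif len(recent_results) >= streak_needed and all(recent_results[-streak_needed:]):
        new = current_days * lengthen         # sustained in-tolerance: lengthen it
    else:
        new = current_days                    # otherwise leave it unchanged
    return int(min(max(new, min_days), max_days))

print(next_interval(365, [True, True, True]))   # 456: lengthened after three passes
print(next_interval(365, [True, False]))        # 255: shortened after a failure
```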

Standards required and accuracy
The next step is defining the calibration process. The selection of a standard or standards is the most visible part of the calibration process. Ideally, the standard has less than 1/4 of the measurement uncertainty of the device being calibrated. When this goal is met, the accumulated measurement uncertainty of all of the standards involved is considered to be insignificant when the final measurement is also made with the 4:1 ratio.[10] This ratio was probably first formalized in Handbook 52 that accompanied MIL-STD-45662A, an early US Department of Defense metrology program specification. It was 10:1 from its inception in the 1950s until the 1970s, when advancing technology made 10:1 impossible for most electronic measurements.[11]
Maintaining a 4:1 accuracy ratio with modern equipment is difficult. The test equipment being calibrated can be just as accurate as the working standard.[10]
If the accuracy ratio is less than 4:1, then the calibration tolerance can be
reduced to compensate. When 1:1 is reached, only an exact match between
the standard and the device being calibrated is a completely correct
calibration. Another common method for dealing with this capability mismatch
is to reduce the accuracy of the device being calibrated.
For example, a gage with 3% manufacturer-stated accuracy can be changed
to 4% so that a 1% accuracy standard can be used at 4:1. If the gage is used
in an application requiring 16% accuracy, having the gage accuracy reduced
to 4% will not affect the accuracy of the final measurements. This is called a
limited calibration. But if the final measurement requires 10% accuracy, then
the 3% gage never can be better than 3.3:1. Then perhaps adjusting the
calibration tolerance for the gage would be a better solution. If the calibration
is performed at 100 units, the 1% standard would actually be anywhere
between 99 and 101 units. The acceptable values of calibrations where the
test equipment is at the 4:1 ratio would be 96 to 104 units, inclusive.
Changing the acceptable range to 97 to 103 units would remove the potential
contribution of all of the standards and preserve a 3.3:1 ratio. Continuing, a
further change to the acceptable range to 98 to 102 restores more than a 4:1
final ratio.
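A minimal sketch of the arithmetic follows; as the next paragraph notes, the mathematics of the example can be challenged, so this is only one reading of it. The tolerances are expressed as percentages of the 100-unit calibration point, and the acceptance band is narrowed, or "guard-banded", by the standard's possible error.

```python
# Hedged sketch of the guard-banding arithmetic in the 100-unit example above.
point = 100.0          # calibration point, in the same units as the gage
gage_tol_pct = 4.0     # gage accuracy after being reduced from 3% to 4%
std_tol_pct = 1.0      # accuracy of the standard used for the comparison

gage_tol = point * gage_tol_pct / 100.0        # +/- 4 units -> 96 to 104 acceptable
std_tol = point * std_tol_pct / 100.0          # the standard itself may read 99 to 101

accuracy_ratio = gage_tol_pct / std_tol_pct    # 4:1 test accuracy ratio
guard_banded = (point - (gage_tol - std_tol),  # 97 to 103: removes the standard's
                point + (gage_tol - std_tol))  # possible contribution to the result

print(accuracy_ratio)   # 4.0
print(guard_banded)     # (97.0, 103.0)
```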
This is a simplified example. The mathematics of the example can be
challenged. It is important that whatever thinking guided this process in an
actual calibration be recorded and accessible. Informality contributes
to tolerance stacks and other difficult to diagnose post calibration problems.
Also in the example above, ideally the calibration value of 100 units would be
the best point in the gage's range to perform a single-point calibration. It may
be the manufacturer's recommendation or it may be the way similar devices
are already being calibrated. Multiple point calibrations are also used.
Depending on the device, a zero unit state, the absence of the phenomenon
being measured, may also be a calibration point. Or zero may be resettable by
the user; there are several variations possible. Again, the points to use during
calibration should be recorded.
There may be specific connection techniques between the standard and the
device being calibrated that may influence the calibration. For example, in
electronic calibrations involving analog phenomena, the impedance of the
cable connections can directly influence the result.
Manual and automatic calibrations
Calibration methods for modern devices can be both manual and automatic, depending on what kind of device is being calibrated.
Manual calibration – U.S. serviceman calibrating a temperature gauge. The device under test is on his left and the test standard on his right.
Manual
The first picture shows a U.S. Navy Airman performing a manual calibration procedure on a pressure test gauge. The procedure is complex,[12] but overall it involves the following: (i) depressurizing the system, and turning the screw, if necessary, to ensure that the needle reads zero, (ii) fully pressurizing the system and ensuring that the needle reads maximum, within acceptable tolerances, (iii) replacing the gauge if the error in the calibration process is beyond tolerance, as this may indicate signs of failure such as corrosion or material fatigue.


Automatic calibration
Automatic calibration – A U.S. serviceman using a 3666C auto pressure calibrator
The second picture shows the use of a 3666C automatic pressure calibrator,[13] which is a device that consists of a control unit housing the electronics that drive the system, a pressure intensifier used to compress a gas such as nitrogen, a pressure transducer used to detect desired levels in a hydraulic accumulator, and accessories such as liquid traps and gauge fittings.
Process description and documentation
All of the information above is collected in a calibration procedure, which is a
specific test method. These procedures capture all of the steps needed to
perform a successful calibration. The manufacturer may provide one or the
organization may prepare one that also captures all of the organization's other
requirements. There are clearinghouses for calibration procedures such as the
Government-Industry Data Exchange Program (GIDEP) in the United States.
This exact process is repeated for each of the standards used until transfer
standards, certified reference materials and/or natural physical constants, the
measurement standards with the least uncertainty in the laboratory, are
reached. This establishes the traceability of the calibration.
See Metrology for other factors that are considered during calibration process
development.
After all of this, individual instruments of the specific type discussed above
can finally be calibrated. The process generally begins with a basic damage
check. Some organizations such as nuclear power plants collect "as-found"
calibration data before any routine maintenance is performed. After routine
maintenance and deficiencies detected during calibration are addressed, an
"as-left" calibration is performed.
More commonly, a calibration technician is entrusted with the entire process
and signs the calibration certificate, which documents the completion of a
successful calibration.
Success factors
The basic process outlined above is a difficult and expensive challenge. The
cost for ordinary equipment support is generally about 10% of the original
purchase price on a yearly basis, as a commonly accepted rule-of-thumb.
Exotic devices such as scanning electron microscopes, gas chromatograph systems and laser interferometer devices can be even more costly to maintain.
The extent of the calibration program exposes the core beliefs of the
organization involved. The integrity of organization-wide calibration is easily
compromised. Once this happens, the links between scientific theory,
engineering practice and mass production that measurement provides can be
missing from the start on new work or eventually lost on old work.
The 'single measurement' device used in the basic calibration process
description above does exist. But, depending on the organization, the majority
of the devices that need calibration can have several ranges and many
functionalities in a single instrument. A good example is a common
modern oscilloscope. There easily could be 200,000 combinations of settings to completely calibrate, and limitations on how much of an all-inclusive calibration can be automated.
Every organization using oscilloscopes has a wide variety of calibration
approaches open to them. If a quality assurance program is in force,
customers and program compliance efforts can also directly influence the
calibration approach. Most oscilloscopes are capital assets that increase the
value of the organization, in addition to the value of the measurements they
make. The individual oscilloscopes are subject to depreciation for tax purposes over 3, 5, 10 years or some other period in countries with complex tax codes. The tax treatment of maintenance activity on those assets can bias calibration decisions.
New oscilloscopes are supported by their manufacturers for at least five
years, in general. The manufacturers can provide calibration services directly
or through agents entrusted with the details of the calibration and adjustment
processes.
Very few organizations have only one oscilloscope. Generally, they are either
absent or present in large groups. Older devices can be reserved for less
demanding uses and get a limited calibration or no calibration at all. In
production applications, oscilloscopes can be put in racks used only for one
specific purpose. The calibration of that specific scope only has to address
that purpose.
This whole process is repeated for each of the basic instrument types present
in the organization, such as the digital multimeter pictured below.

An instrument rack with tamper-indicating seals


To prevent unauthorised access to an instrument, tamper-proof seals are usually applied after calibration. The picture of the oscilloscope rack shows these; they prove that the instrument has not been removed since it was last calibrated, as they would indicate possible unauthorised access to the adjusting elements of the instrument. There are also labels showing the date of the last calibration and when the calibration interval dictates the next one is needed. Some organizations also assign a unique identification to each instrument to standardize the record keeping and keep track of accessories that are integral to a specific calibration condition.
When the instruments being calibrated are integrated with computers, the
integrated computer programs and any calibration corrections are also under
control.
Historical development
Main article: History of measurement
Origins
The words "calibrate" and "calibration" entered the English language as recently as the American Civil War,[14] in descriptions of artillery, thought to be derived from a measurement of the calibre of a gun.
Some of the earliest known systems of measurement and calibration seem to have been created between the ancient civilizations of Egypt, Mesopotamia and the Indus Valley, with excavations revealing the use of angular gradations for construction.[15] The term "calibration" was likely first associated with the precise division of linear distance and angles using a dividing engine and the measurement of gravitational mass using a weighing scale. These two forms of measurement alone and their direct derivatives supported nearly all commerce and technology development from the earliest civilizations until about AD 1800.[16]
Calibration of weights and distances (c. 1100 CE)
See also: Weights and Measures Act
An example of a weighing scale with an ounce calibration error at zero. This is a "zeroing error" which is inherently indicated, and can normally be adjusted by the user, but may be due to the string and rubber band in this case.
Early measurement devices were direct, i.e. they had the same units as the
quantity being measured. Examples include length using a yardstick and mass
using a weighing scale. At the beginning of the twelfth century, during the
reign of Henry I (1100-1135), it was decreed that a yard be "the distance from
the tip of the King's nose to the end of his outstretched thumb."[17] However, it wasn't until the reign of Richard I (1197) that we find documented evidence.[18]

Assize of Measures
"Throughout the realm there shall be the same yard of the same size and it
should be of iron."
Other standardization attempts followed, such as the Magna Carta (1225) for liquid measures, until the Mètre des Archives from France and the establishment of the Metric system.
The Industrial Revolution and the calibration of pressure

Direct reading design of a U-tube manometer


One of the earliest pressure measurement devices was the mercury barometer, credited to Torricelli (1643),[19] which read atmospheric pressure using mercury. Soon after, hydrostatic manometers were designed, with a linear calibration for measuring lower pressure ranges. The Industrial Revolution (c. 1760 CE to c. 1840 CE) saw widespread use of indirect measuring devices, in which the quantity being measured was derived functionally based on direct measurements of dependent quantities.[20] During this time, scientists discovered the energy stored in compressed steam and other gases, leading to the development of gauges more practical than hydrostatic manometers at measuring higher pressures.[21] One such invention was Eugene Bourdon's Bourdon tube, an indirect design.

Indirect reading design showing a Bourdon tube from the front (left) and the
rear (right).
In the direct reading hydrostatic manometer design on the left, an unknown applied pressure Pa pushes the liquid down the right side of the manometer U-tube, while a length scale next to the tube measures the pressure, referenced to the other, open end of the manometer on the left side of the U-tube (P0). The resulting height difference "H" is a direct measurement of the pressure or vacuum with respect to atmospheric pressure. The absence of pressure or vacuum would make H = 0. The self-applied calibration would only require the length scale to be set to zero at that same point.
This direct measurement of pressure as a height difference depends on both
the density of the manometer fluid, and a calibrated means of measuring the
height difference.
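As an illustrative sketch only (the fluid densities and readings below are assumed, not taken from the article), the conversion from the measured height difference to pressure follows the hydrostatic relation P = rho * g * H, which is why both the fluid density and the height scale must be known for this direct measurement.

```python
# Hedged sketch: converting a manometer height difference H into a pressure
# difference using the hydrostatic relation P = rho * g * H. Values assumed.
G = 9.80665                     # standard gravity, m/s^2

def pressure_from_height(h_metres, density_kg_m3):
    """Gauge pressure (Pa) indicated by a column height difference."""
    return density_kg_m3 * G * h_metres

print(pressure_from_height(0.100, 13_545.0))  # ~13.3 kPa for 100 mm of mercury
print(pressure_from_height(0.100, 998.0))     # ~0.98 kPa for 100 mm of water
```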
In a Bourdon tube (shown in the two views on the right), applied pressure
entering from the bottom on the silver barbed pipe tries to straighten a
curved tube (or vacuum tries to curl the tube to a greater extent), moving the
free end of the tube that is mechanically connected to the pointer. This is an indirect measurement that depends on calibration to read pressure or vacuum
correctly. No self-calibration is possible, but generally the zero pressure state
is correctable by the user, as shown below.
See also
- Calibration curve
- Calibrated geometry
- Calibration (statistics)
- Color calibration – used to calibrate a computer monitor or display
- Deadweight tester
- EURAMET – Association of European NMIs
- Measurement microphone calibration
- Measurement uncertainty
- Musical tuning – tuning, in music, means calibrating musical instruments to play the right pitch
- Precision measurement equipment laboratory
- Scale test car – a device used to calibrate weighing scales that weigh railroad cars
- Systems of measurement
References
Crouch, Stanley & Skoog, Douglas A. (2007). Principles of Instrumental Analysis. Pacific Grove: Brooks Cole. ISBN 0-495-01201-7.
1. JCGM 200:2008 – International vocabulary of metrology – Basic and general concepts and associated terms (VIM).
2. ISO 9001: "Quality management systems – Requirements" (2008), section 7.6.
3. ISO 17025: "General requirements for the competence of testing and calibration laboratories" (2005), section 5.
4. Faison, C. Douglas; Brickenkamp, Carroll S. (March 2004). "Calibration Laboratories: Technical Guide for Mechanical Measurements" (PDF). NIST Handbook 150-2G. NIST. Retrieved 14 June 2015.
5. "Metrology, Pressure, Thermal & Electrotechnical Measurement and Calibration". Fluid Control Research Institute (FCRI), Ministry of Heavy Industries & Public Enterprises, Govt. of India. Archived from the original on 14 June 2015. Retrieved 14 June 2015.
6. Haider, Syed Imtiaz; Asif, Syed Erfan (16 February 2011). Quality Control Training Manual: Comprehensive Training Guide for API, Finished Pharmaceutical and Biotechnologies Laboratories. CRC Press. p. 49. ISBN 978-1-4398-4994-1.
7. Bare, Allen (2006). Simplified Calibration Interval Analysis (PDF). Aiken, SC: NCSL International Workshop and Symposium, under contract with the Office of Scientific and Technical Information, U.S. Department of Energy. pp. 1–2. Retrieved 28 November 2014.
8. "ANSI/NCSL Z540.3-2006 (R2013)". The National Conference of Standards Laboratories (NCSL) International. Retrieved 28 November 2014.
9. "Calibration Systems Requirements (Military Standard)" (PDF). Washington, DC: U.S. Department of Defense. 1 August 1998. Retrieved 28 November 2014.
10. Ligowski, M.; Jabłoński, Ryszard; Tabe, M. (2011), "Procedure for Calibrating Kelvin Probe Force Microscope", in Jabłoński, Ryszard; Březina, Tomáš (eds.), Mechatronics: Recent Technological and Scientific Advances, p. 227, doi:10.1007/978-3-642-23244-2, ISBN 978-3-642-23244-2, LCCN 2011-935381.
11. Military Handbook: Evaluation of Contractor's Calibration System (PDF). U.S. Department of Defense. 17 August 1984. p. 7. Retrieved 28 November 2014.
12. Procedure for Calibrating Pressure Gauges (USBR 1040) (PDF). U.S. Department of the Interior, Bureau of Reclamation. pp. 70–73. Retrieved 28 November 2014.
13. "KNC Model 3666 Automatic Pressure Calibration System" (PDF). King Nutronics Corporation. Retrieved 28 November 2014.
14. http://dictionary.reference.com/browse/calibrate
15. Baber, Zaheer (1996). The Science of Empire: Scientific Knowledge, Civilization, and Colonial Rule in India. SUNY Press. pp. 23–24. ISBN 978-0-7914-2919-8.
16. Franceschini, Fiorenzo; Galetto, Maurizio; Maisano, Domenico; Mastrogiacomo, Luca; Pralio, Barbara (6 June 2011). Distributed Large-Scale Dimensional Metrology: New Insights. Springer Science & Business Media. pp. 117–118. ISBN 978-0-85729-543-9.
17. Ackroyd, Peter (16 October 2012). Foundation: The History of England from Its Earliest Beginnings to the Tudors. St. Martin's Press. pp. 133–134. ISBN 978-1-250-01367-5.
18. Bland, Alfred Edward; Tawney, Richard Henry (1919). English Economic History: Select Documents. Macmillan Company. pp. 154–155.
19. Tilford, Charles R (1992). "Pressure and vacuum measurements" (PDF). Physical Methods of Chemistry: 106–173. Retrieved 28 November 2014.
20. Fridman, A. E.; Sabak, Andrew; Makinen, Paul (23 November 2011). The Quality of Measurements: A Metrological Reference. Springer Science & Business Media. pp. 10–11. ISBN 978-1-4614-1478-0.
21. Cuscó, Laurence (1998). Guide to the Measurement of Pressure and Vacuum. London: The Institute of Measurement and Control. p. 5. ISBN 0-904457-29-X.
External links
- Master Calibrators Association – Industry Experts on Electrical Calibration, Adjustment and Metrology
Categories: Accuracy and precision | Standards | Measurement | Metrology