Ultimate Calibration 2nd Edition

Beamex is a technology and service company that develops, manufactures and markets high-quality calibration equipment, software, systems and services for the calibration and maintenance of process instruments. The company is a leading worldwide provider of integrated calibration solutions that meet even the most demanding requirements. Beamex offers a comprehensive range of products and services, from portable calibrators to workstations, calibration accessories, calibration software, industry-specific solutions and professional services. Through Beamex's global and competent partner network, its products and services are available in more than 60 countries. As proof of Beamex's success, more than 10,000 companies worldwide utilize its calibration solutions. Several companies have been Beamex customers since the establishment of the company over 30 years ago. For more information about Beamex and its products and services, visit www.beamex.com.

Beamex has used reasonable efforts to ensure that this book contains both accurate and comprehensive information. Notwithstanding the foregoing, the content of this book is provided "as is" without any representations, warranties or guarantees of any kind, whether express or implied, in relation to the accuracy, completeness, adequacy, currency, quality, timeliness or fitness for a particular purpose of the content and information provided in this book. The contents of this book are for general informational purposes only. Furthermore, this book provides examples of some of the laws, regulations and standards related to calibration and is not intended to be definitive. It is the responsibility of a company to determine which laws, regulations and standards apply in its specific circumstances.

Ultimate Calibration 2nd Edition Copyright © 2009–2012 by Beamex Oy Ab. All rights reserved. No part of this publication may be reproduced or distributed in any form or by any means, or stored in a database or retrieval system, without the prior written permission of Beamex Oy Ab. Requests should be directed to info@beamex.com. Beamex is a trademark of Beamex Oy Ab. All other trademarks or trade names mentioned in this book are the property of their respective holders.

Graphic design: Studio PAP
Photos: Mats Sandström and image bank
Printed by: Fram in Vaasa 2012, Finland

Contents

Preface by the CEO of Beamex Group

QUALITY, REGULATIONS AND TRACEABILITY
  • Quality standards and industry regulations
  • A basic quality calibration program
  • Traceable and efficient calibrations in the process industry

CALIBRATION MANAGEMENT AND MAINTENANCE
  • Why Calibrate? What is the risk of not calibrating?
  • Why use software for calibration management?
  • How often should instruments be calibrated?
  • How often should calibrators be calibrated?
  • Paperless calibration improves quality and cuts costs
  • Intelligent commissioning
  • Successfully executing a system integration project

CALIBRATION IN INDUSTRIAL APPLICATIONS
  • The benefits of using a documenting calibrator
  • Calibration of weighing instruments Part 1
  • Calibration of weighing instruments Part 2
  • Calibrating temperature instruments
  • Calculating total uncertainty of temperature calibration with a dry block
  • Fieldbus transmitters must also be calibrated
  • Configuring and calibrating smart instruments
  • Calibration in hazardous environments
  • The safest way to calibrate Fieldbus instruments

APPENDIX: Calibration terminology A to Z


Preface

Calibrators, calibration software and other related equipment have developed significantly during the past few decades, in spite of the fact that calibration of measurement devices as such has existed for several thousands of years. Presently, the primary challenges of industrial metrology and calibration include how to simplify and streamline the entire calibration process, how to eliminate double work, how to reduce production down-time, and how to lower the risk of human errors. All of these challenges can be tackled by improving the level of system integration and automation. Just like any other business function, calibration procedures need to be automated to a higher degree and integrated to achieve improvements in quality and efficiency. Calibration and calibrators can no longer be considered as isolated, stand-alone devices, systems or work processes within a company or production plant. In this area, Beamex aims to be the benchmark in the industry.

This book is the 2nd edition of Ultimate Calibration. The main changes to this edition include numerous new articles and a new grouping of the articles to make it easier to find related topics. The new topics covered in the edition mainly discuss paperless calibration, intelligent commissioning, temperature calibration and configuring and calibration of smart instruments.

This book is the result of work that has taken place between 2006 and 2012. A team of experts in industry and calibration worldwide has put forth effort to its creation. On behalf of Beamex, I would like to thank all of the people who have contributed to this book. I want to express my special thanks to Pamela at Beamex Marketing, who was the key person in organizing and leading the project for the 2nd edition.

I hope this book will assist you in learning new things and in providing fresh, new ideas. Enjoy your reading!

Raimo Ahola
CEO, Beamex Group


Quality, Regulations and Traceability


Calibration requirements according to quality standards and industry regulations

Before going into what the current standards and regulations actually state, here is a reminder from times past about measurement practices and how important they really are.

"There shall be standard measures of wine, beer, and corn… throughout the whole of our kingdom… and a standard width of dyed russet and cloth…, and there shall be standard weights also." (Clause 35, Magna Carta, 1215)

"Immersion in water makes the straight seem bent; but reason, thus confused by false appearance, is beautifully restored by measuring, numbering and weighing; these drive vague notions of greater or less or more or heavier right out of the minds of the surveyor, the computer, and the clerk of the scales. Surely it is the better part of thought that relies on measurement and calculation." (Plato, The Republic, 360 B.C.)

"When you can measure what you are speaking about and express it in numbers, you know something about it; but when you cannot express it in numbers, your knowledge is of a meager and unsatisfactory kind. It may be the beginning of knowledge, but you have scarcely, in your thoughts, advanced to the stage of science." (William Thomson, 1st Baron Kelvin, OM, GCVO, PC, PRS, 26 June 1824 – 17 December 1907)

One of the earliest records of precise measurement is from Egypt. It is believed that about 3000 years B.C. the Egyptian unit of length came into being. The Egyptians studied the science of geometry to assist them in the construction of the Pyramids. The "Royal Egyptian Cubit" was decreed to be equal to the length of the forearm from the bent elbow to the tip of the extended middle finger plus the width of the palm of the hand of the Pharaoh or King ruling at that time. The "Royal Cubit Master" was carved out of a block of granite to endure for all times. Workers engaged in building tombs, temples, pyramids, etc. were supplied with cubits made of wood or granite. The Royal Architect or Foreman of the construction site was responsible for maintaining and transferring the unit of length to the workers' instruments. They were required to bring back their cubit sticks at each full moon to be compared to the Royal Cubit Master. Failure to do so was punishable by death.

Though the punishment prescribed was severe, the Egyptians achieved surprising accuracy. Thousands of workers were engaged in building the Great Pyramid of Giza, and with this standardization and uniformity of length they achieved an accuracy of 0.05%: in roughly 756 feet or 230.36276 meters, they were within 4.5 inches or 11.43 centimeters. Through the use of cubit sticks, the Egyptians had anticipated the spirit of the present-day system of legal metrology. The need for calibration has been around for at least 5000 years.

In today's calibration environment, there are basically two types of requirements: ISO standards and regulatory requirements. The biggest difference between the two is simple – ISO standards are voluntary, whereas regulatory requirements are mandatory. If an organization volunteers to meet ISO 9000 standards, a set of guidelines is used to write its quality manual and other standard operating procedures (SOPs), and the organization shows how it complies with the standard. In the case of ISO standards, the organization pays a company to audit it to that standard to ensure it is following its quality manual and is within compliance. On the other hand, if a company is manufacturing a drug that must meet regulatory requirements, it is inspected by government inspectors for compliance to federal regulations. The federal regulations specify in greater detail what a company must do (for example, regarding traceability and calibration recall) to meet the requirements set forth in the Code of Federal Regulations (CFRs). In Europe, detailed information for achieving regulatory compliance is provided in EudraLex – Volume 4 of "The rules governing medicinal products in the European Union". The Pharmaceutical Inspection Convention and Pharmaceutical Inspection Co-operation Scheme (PIC/S) also aims to improve harmonisation of Good Manufacturing Practice (GMP) standards and guidance documents.
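The 0.05% figure quoted for the Great Pyramid follows directly from the two measurements in the text. As a quick back-of-the-envelope check (all values taken from the paragraph above; this is only an illustrative calculation):

```python
# Accuracy achieved on the Great Pyramid of Giza, using the figures quoted
# above: a deviation of 4.5 inches (11.43 cm) over a side of roughly
# 756 feet (230.36276 m).
side_m = 230.36276      # side length of the Great Pyramid, in meters
deviation_m = 0.1143    # observed deviation: 11.43 cm expressed in meters

accuracy_pct = deviation_m / side_m * 100
print(f"{accuracy_pct:.2f} %")  # → 0.05 %
```

In other words, a relative error of about five parts in ten thousand, achieved with wooden and granite cubit sticks periodically compared against a single granite master standard.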

Calibration requirements according to the U.S. Food and Drug Administration (FDA)

Following are examples of some of the regulations required by the FDA, and what they say about calibration and what must be accomplished to meet the CFRs. Listed below are several different parts of 21CFR that relate to the calibration of test equipment in different situations and environments. Please note that European standards are similar to FDA requirements.

TITLE 21 – FOOD AND DRUGS
CHAPTER I – FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES
SUBCHAPTER H – MEDICAL DEVICES
PART 820 – QUALITY SYSTEM REGULATION

Subpart A – General Provisions
§ 820.1 – Scope.
§ 820.3 – Definitions.
§ 820.5 – Quality system.

Subpart B – Quality System Requirements
§ 820.20 – Management responsibility.
§ 820.22 – Quality audit.
§ 820.25 – Personnel.

Subpart C – Design Controls
§ 820.30 – Design controls.

Subpart D – Document Controls
§ 820.40 – Document controls.

Subpart E – Purchasing Controls
§ 820.50 – Purchasing controls.

Subpart F – Identification and Traceability
§ 820.60 – Identification.
§ 820.65 – Traceability.

Subpart G – Production and Process Controls
§ 820.70 – Production and process controls.
§ 820.72 – Inspection, measuring, and test equipment.
§ 820.75 – Process validation.

§ 820. Subpart M – Records § 820. Storage. Subpart K – Labeling and Packaging Control § 820. § 820. in-process.150 – Storage.181 – Device master record.250 – Statistical techniques. and Installation § 820. Subpart L – Handling. Subpart J – Corrective and Preventive Action § 820. § 820.72] TITLE 2 – FOOD AND DRUGS CHAPTER I – FOOD AND DRUG ADMINISTRATION DEPARTMENT OF HEALTH AND HUMAN SERVICES SUBCHAPTER H – MEDICAL DEVICES 14 . § 820.120 – Device labeling.140 – Handling.86 – Acceptance status. 2012] [CITE: 21CFR820. § 820.186 – Quality system record.184 – Device history record. § 820. § 820.quality standards and industry regulations Subpart H – Acceptance Activities § 820.90 – Nonconforming product. [Code of Federal Regulations] [Title 21.170 – Installation.160 – Distribution. § 820. and finished device acceptance. Volume 8] [Revised as of April 1. Subpart O – Statistical Techniques § 820.200 – Servicing.198 – Complaint files. Distribution.130 – Device packaging. Subpart N – Servicing § 820. § 820.80 – Receiving.180 – General requirements.100 – Corrective and preventive action. Subpart I – Nonconforming Product § 820.

PART 820 – QUALITY SYSTEM REGULATION
Subpart G – Production and Process Controls

Sec. 820.72 Inspection, measuring, and test equipment.

(a) Control of inspection, measuring, and test equipment. Each manufacturer shall ensure that all inspection, measuring, and test equipment, including mechanical, automated, or electronic inspection and test equipment, is suitable for its intended purposes and is capable of producing valid results. Each manufacturer shall establish and maintain procedures to ensure that equipment is routinely calibrated, inspected, checked, and maintained. The procedures shall include provisions for handling, preservation, and storage of equipment, so that its accuracy and fitness for use are maintained. These activities shall be documented.

(b) Calibration. Calibration procedures shall include specific directions and limits for accuracy and precision. When accuracy and precision limits are not met, there shall be provisions for remedial action to reestablish the limits and to evaluate whether there was any adverse effect on the device's quality. These activities shall be documented.

(1) Calibration standards. Calibration standards used for inspection, measuring, and test equipment shall be traceable to national or international standards. If national or international standards are not practical or available, the manufacturer shall use an independent reproducible standard. If no applicable standard exists, the manufacturer shall establish and maintain an in-house standard.

(2) Calibration records. The equipment identification, calibration dates, the individual performing each calibration, and the next calibration date shall be documented. These records shall be displayed on or near each piece of equipment or shall be readily available to the personnel using such equipment and to the individuals responsible for calibrating the equipment.
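To make the record-keeping requirement of § 820.72(b)(2) concrete, the sketch below models one calibration record as a small data structure holding the four items the regulation says must be documented. The class and field names are our own illustrative choices, not anything prescribed by the regulation:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class CalibrationRecord:
    """The four items § 820.72(b)(2) requires to be documented."""
    equipment_id: str     # equipment identification
    calibrated_on: date   # calibration date
    calibrated_by: str    # individual performing the calibration
    next_due: date        # next calibration date

    def is_overdue(self, today: date) -> bool:
        # A record past its next calibration date signals that the
        # equipment should not be used before it is recalibrated.
        return today > self.next_due

rec = CalibrationRecord("PT-101", date(2012, 1, 16), "J. Smith", date(2013, 1, 16))
print(rec.is_overdue(date(2012, 6, 1)))   # → False (still within its interval)
print(rec.is_overdue(date(2013, 2, 1)))   # → True  (calibration is overdue)
```

Keeping such records "readily available" to the people using the equipment is then a matter of where the data is stored and displayed, which the regulation leaves to the manufacturer.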

[Code of Federal Regulations] [Title 21, Volume 4] [Revised as of April 1, 2012] [CITE: 21CFR211]

TITLE 21 – FOOD AND DRUGS
CHAPTER I – FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES
SUBCHAPTER C – DRUGS: GENERAL
PART 211 – CURRENT GOOD MANUFACTURING PRACTICE FOR FINISHED PHARMACEUTICALS

Subpart D – Equipment

Sec. 211.68 Automatic, mechanical, and electronic equipment.

(a) Automatic, mechanical, or electronic equipment or other types of equipment, including computers, or related systems that will perform a function satisfactorily, may be used in the manufacture, processing, packing, and holding of a drug product. If such equipment is so used, it shall be routinely calibrated, inspected, or checked according to a written program designed to assure proper performance. Written records of those calibration checks and inspections shall be maintained.

Sec. 211.160 General requirements.

(b) Laboratory controls shall include the establishment of scientifically sound and appropriate specifications, standards, sampling plans, and test procedures designed to assure that components, drug product containers, closures, in-process materials, labeling, and drug products conform to appropriate standards of identity, strength, quality, and purity. Laboratory controls shall include:

(1) Determination of conformity to applicable written specifications for the acceptance of each lot within each shipment of components, drug product containers, closures, and labeling used in the manufacture, processing, packing, or holding of drug products. The specifications shall include a description of the sampling and testing procedures used. Samples shall be representative and adequately identified. Such procedures shall also require appropriate retesting of any component, drug product container, or closure that is subject to deterioration.

(2) Determination of conformance to written specifications and a description of sampling and testing procedures for in-process materials. Such samples shall be representative and properly identified.

(3) Determination of conformance to written descriptions of sampling procedures and appropriate specifications for drug products. Such samples shall be representative and properly identified.

(4) The calibration of instruments, apparatus, gauges, and recording devices at suitable intervals in accordance with an established written program containing specific directions, schedules, limits for accuracy and precision, and provisions for remedial action in the event accuracy and/or precision limits are not met. Instruments, apparatus, gauges, and recording devices not meeting established specifications shall not be used.

[43 FR 45077, Sept. 29, 1978, as amended at 73 FR 51932, Sept. 8, 2008]

Sec. 211.194 Laboratory records.

(d) Complete records shall be maintained of the periodic calibration of laboratory instruments, apparatus, gauges, and recording devices required by 211.160(b)(4).


TITLE 21 – FOOD AND DRUGS
CHAPTER I – FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES
SUBCHAPTER A – GENERAL
PART 11 ELECTRONIC RECORDS; ELECTRONIC SIGNATURES

Subpart A – General Provisions

Sec. 11.1 Scope.

(a) The regulations in this part set forth the criteria under which the agency considers electronic records, electronic signatures, and handwritten signatures executed to electronic records to be trustworthy, reliable, and generally equivalent to paper records and handwritten signatures executed on paper.

(b) This part applies to records in electronic form that are created, modified, maintained, archived, retrieved, or transmitted, under any records requirements set forth in agency regulations. This part also applies to electronic records submitted to the agency under requirements of the Federal Food, Drug, and Cosmetic Act and the Public Health Service Act, even if such records are not specifically identified in agency regulations. However, this part does not apply to paper records that are, or have been, transmitted by electronic means.

(c) Where electronic signatures and their associated electronic records meet the requirements of this part, the agency will consider the electronic signatures to be equivalent to full handwritten signatures, initials, and other general signings as required by agency regulations, unless specifically excepted by regulation(s) effective on or after August 20, 1997.

(d) Electronic records that meet the requirements of this part may be used in lieu of paper records, in accordance with 11.2, unless paper records are specifically required.



(e)  Computer systems (including hardware and software), controls, and attendant documentation maintained under this part shall be readily available for, and subject to, FDA inspection. (f)  This part does not apply to records required to be established or maintained by 1.326 through 1.368 of this chapter. Records that satisfy the requirements of part 1, subpart J of this chapter, but that also are required under other applicable statutory provisions or regulations, remain subject to this part. [62 FR 13464, Mar. 20, 1997, as amended at 69 FR 71655, Dec. 9, 2004] Sec. 11.2 Implementation. (a)  For records required to be maintained but not submitted to the agency, persons may use electronic records in lieu of paper records or electronic signatures in lieu of traditional signatures, in whole or in part, provided that the requirements of this part are met. (b)  For records submitted to the agency, persons may use electronic records in lieu of paper records or electronic signatures in lieu of traditional signatures, in whole or in part, provided that: (1)  The requirements of this part are met; and (2)  The document or parts of a document to be submitted have been identified in public docket No. 92S-0251 as being the type of submission the agency accepts in electronic form. This docket will identify specifically what types of documents or parts of documents are acceptable for submission in electronic form without paper records and the agency receiving unit(s) (e.g., specific center, office, division, branch) to which such submissions may be made. Documents to agency receiving unit(s) not specified in the public docket will not be considered as official if they are submitted in electronic form; paper forms of such documents will be considered as official and must accompany any electronic records. 
Persons are expected to consult with the intended agency receiving unit for details on how (e.g., method of transmission, media, file formats, and technical protocols) and whether to proceed with the electronic submission.


TITLE 21 – FOOD AND DRUGS
CHAPTER I – FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES
SUBCHAPTER A – GENERAL
PART 11 ELECTRONIC RECORDS; ELECTRONIC SIGNATURES

Subpart C – Electronic Signatures

Sec. 11.100 General requirements.

(a) Each electronic signature shall be unique to one individual and shall not be reused by, or reassigned to, anyone else.

(b) Before an organization establishes, assigns, certifies, or otherwise sanctions an individual's electronic signature, or any element of such electronic signature, the organization shall verify the identity of the individual.

(c) Persons using electronic signatures shall, prior to or at the time of such use, certify to the agency that the electronic signatures in their system, used on or after August 20, 1997, are intended to be the legally binding equivalent of traditional handwritten signatures.

(1) The certification shall be submitted in paper form and signed with a traditional handwritten signature, to the Office of Regional Operations (HFC-100), 12420 Parklawn Drive, RM 3007, Rockville, MD 20857.

(2) Persons using electronic signatures shall, upon agency request, provide additional certification or testimony that a specific electronic signature is the legally binding equivalent of the signer's handwritten signature.



Calibration requirements according to the European Medicines Agency (EMA)

Following are examples of some of the regulatory requirements of the EMA, and what they say about calibration and what must be accomplished to meet the GMPs.

Eudralex Volume 4
Chapter 3: Premises and Equipment

Equipment
3.41 Measuring, weighing, recording and control equipment should be calibrated and checked at defined intervals by appropriate methods. Adequate records of such tests should be maintained.

Chapter 4: Documentation

Manufacturing Formula and Processing Instructions
Approved, written Manufacturing Formula and Processing Instructions should exist for each product and batch size to be manufactured.
4.18 The Processing Instructions should include:
a) A statement of the processing location and the principal equipment to be used;
b) The methods, or reference to the methods, to be used for preparing the critical equipment (e.g. cleaning, assembling, calibrating, sterilising);
c) Checks that the equipment and work station are clear of previous products, documents or materials not required for the planned process, and that equipment is clean and suitable for use;
d) Detailed stepwise processing instructions [e.g. checks on materials, pre-treatments, sequence for adding materials, critical process parameters (time, temp. etc.)];
e) The instructions for any in-process controls with their limits;
f) Where necessary, the requirements for bulk storage of the products, including the container, labeling and special storage conditions where applicable;
g) Any special precautions to be observed.


Procedures and records

Other
4.29 There should be written policies, procedures, protocols, reports and the associated records of actions taken or conclusions reached, where appropriate, for the following examples:
•  Validation and qualification of processes, equipment and systems;
•  Equipment assembly and calibration;
•  Technology transfer;
•  Maintenance, cleaning and sanitation;
•  Personnel matters including signature lists, training in GMP and technical matters, clothing and hygiene and verification of the effectiveness of training;
•  Environmental monitoring;
•  Pest control;
•  Complaints;
•  Recalls;
•  Returns;
•  Change control;
•  Investigations into deviations and non-conformances;
•  Internal quality/GMP compliance audits;
•  Summaries of records where appropriate (e.g. product quality review);
•  Supplier audits.

4.31 Logbooks should be kept for major or critical analytical testing, production equipment, and areas where product has been processed. They should be used to record, in chronological order and as appropriate, any use of the area, equipment/method, calibrations, maintenance, cleaning or repair operations, including the dates and identity of the people who carried these operations out.

Chapter 6: Quality Control

Good Quality Control Laboratory Practice

Documentation
6.7 Laboratory documentation should follow the principles given in Chapter 4. An important part of this documentation deals with Quality Control, and the following details should be readily available to the Quality Control Department:
•  specifications;
•  sampling procedures;
•  testing procedures and records (including analytical worksheets and/or laboratory notebooks);
•  analytical reports and/or certificates;
•  data from environmental monitoring, where required;
•  validation records of test methods, where applicable;
•  procedures for and records of the calibration of instruments and maintenance of equipment.

Annex 15 to the EU Guide to Good Manufacturing Practice
Title: Qualification and validation

QUALIFICATION

Installation qualification
11. Installation qualification (IQ) should be performed on new or modified facilities, systems and equipment.
12. IQ should include, but not be limited to the following:
(a) installation of equipment, piping, services and instrumentation checked to current engineering drawings and specifications;
(b) collection and collation of supplier operating and working instructions and maintenance requirements;
(c) calibration requirements;
(d) verification of materials of construction.

Operational qualification
15. The completion of a successful Operational qualification should allow the finalisation of calibration, operating and cleaning procedures, operator training and preventative maintenance requirements. It should permit a formal "release" of the facilities, systems and equipment.

Qualification of established (in-use) facilities, systems and equipment
19. Evidence should be available to support and verify the operating parameters and limits for the critical variables of the operating equipment. Additionally, the calibration, cleaning, preventative maintenance, operating procedures and operator training procedures and records should be documented.

PROCESS VALIDATION

Prospective validation
24. Prospective validation should include, but not be limited to the following:
(a) short description of the process;
(b) summary of the critical processing steps to be investigated;
(c) list of the equipment/facilities to be used (including measuring/monitoring/recording equipment) together with its calibration status;
(d) finished product specifications for release;
(e) list of analytical methods, as appropriate;
(f) proposed in-process controls with acceptance criteria;
(g) additional testing to be carried out, with acceptance criteria and analytical validation, as appropriate;
(h) sampling plan;
(i) methods for recording and evaluating results;
(j) functions and responsibilities;
(k) proposed timetable.

EU GMP Annex 11

The EU GMP Annex 11 defines EU requirements for computerised systems, and applies to all forms of computerised systems used as part of GMP regulated activities. This document provides guidance for the interpretation of the principles and guidelines of good manufacturing practice (GMP) for medicinal products as laid down in Directive 2003/94/EC for medicinal products for human use and Directive 91/412/EEC for veterinary use.

Main page for the EudraLex Volume 4 Good manufacturing practice (GMP) Guidelines: http://ec.europa.eu/health/documents/eudralex/vol-4/index_en.htm
PDF of Annex 11: http://ec.europa.eu/health/files/eudralex/vol-4/annex11_01-2011_en.pdf

EUROPEAN COMMISSION
HEALTH AND CONSUMERS DIRECTORATE-GENERAL
Public Health and Risk Assessment, Pharmaceuticals
Brussels, SANCO/C8/AM/sl/ares(2010)1064599

EudraLex
The Rules Governing Medicinal Products in the European Union
Volume 4: Good Manufacturing Practice, Medicinal Products for Human and Veterinary Use
Annex 11: Computerised Systems

Legal basis for publishing the detailed guidelines: Article 47 of Directive 2001/83/EC on the Community code relating to medicinal products for human use and Article 51 of Directive 2001/82/EC on the Community code relating to veterinary medicinal products.

Status of the document: revision 1
Reasons for changes: the Annex has been revised in response to the increased use of computerised systems and the increased complexity of these systems. Consequential amendments are also proposed for Chapter 4 of the GMP Guide.
Deadline for coming into operation: 30 June 2011

Commission Européenne, B-1049 Bruxelles / Europese Commissie, B-1049 Brussel – Belgium
Telephone: (32-2) 299 11 11

Principle
This annex applies to all forms of computerised systems used as part of GMP regulated activities. A computerised system is a set of software and hardware components which together fulfill certain functionalities. The application should be validated; IT infrastructure should be qualified. Where a computerised system replaces a manual operation, there should be no resultant decrease in product quality, process control or quality assurance. There should be no increase in the overall risk of the process.

PIC/S
The abbreviation PIC/S describes both the Pharmaceutical Inspection Convention (PIC) and the Pharmaceutical Inspection Co-operation Scheme (PIC Scheme), which operate together. It aims to promote harmonisation of global regulations for the pharmaceutical industry. Further information can be found at the PIC/S Web site (http://www.picscheme.org/).

GAMP®

GAMP® is a Community of Practice (COP) of the International Society for Pharmaceutical Engineering (ISPE). COPs provide networking opportunities for people interested in similar topics. GAMP® itself was founded in 1991 in the United Kingdom to deal with the evolving FDA expectations for Good Manufacturing Practice (GMP) compliance of manufacturing and related systems. Since 1994, when the organization entered into a partnership with the ISPE and published its first GAMP® guidelines, the GAMP® COP has aimed to provide guidance and understanding concerning GxP computerized systems. Several local GAMP® COPs, such as GAMP® Americas, GAMP® Europe, GAMP® DACH (Germany, Austria, Switzerland), GAMP® Francophone, GAMP® Italiano, GAMP® Japan and GAMP® Nordic, in collaboration with ISPE's local Affiliates in these regions, produce technical content and translate ISPE technical documents. They also bring the GAMP® community closer to its members. Three regional Steering Committees, GAMP® Europe, GAMP® Japan and GAMP® Americas, support the GAMP® Council, which oversees the operation of the COP and is the main link to ISPE. The GAMP® COP organizes discussion forums for its members, and ISPE organises GAMP®-related training courses and educational seminars.

The most well-known GAMP® publication is GAMP® 5: A Risk-Based Approach to GxP Computerized Systems. This is the latest major revision and was released in January 2008. There is also a series of related GAMP® guidance on specific topics, including:

•  GAMP® Good Practice Guide: A Risk-Based Approach to Calibration Management (Second Edition)
•  GAMP® Good Practice Guide: A Risk-Based Approach to GxP Compliant Laboratory Computerized Systems (Second Edition)
•  GAMP® Good Practice Guide: A Risk-Based Approach to GxP Process Control Systems (Second Edition)
•  GAMP® Good Practice Guide: A Risk-Based Approach to Operation of GxP Computerized Systems – A Companion Volume to GAMP® 5
•  GAMP® Good Practice Guide: Electronic Data Archiving
•  GAMP® Good Practice Guide: Global Information Systems Control and Compliance

•  GAMP® Good Practice Guide: IT Infrastructure Control and Compliance
•  GAMP® Good Practice Guide: Legacy Systems

The GAMP® Good Practice Guide: A Risk-Based Approach to Calibration Management (second edition) was developed by ISPE's GAMP® COP Calibration Special Interest Group (SIG) in conjunction with representatives from the pharmaceutical industry and input from regulatory agencies. The second edition of the guide has been significantly updated to address the change in regulatory expectations and in associated industry guidance documents; the scope now includes related industries, laboratory and analytical instrumentation. The Guide describes the principles of calibration and presents guidance in setting up a calibration management system, providing a structured approach to instrument risk assessment, calibration program management, documentation, and corrective actions vital to regulatory compliance. A set of associated attachments is also available through the ISPE website.

ISO 9001:2008

Basically, this is what is required according to ISO 9001:2008, section 7.6, Control of monitoring and measuring equipment:
•  Identify your organization's monitoring and measuring needs and requirements, and select test equipment that can meet those monitoring and measuring needs and requirements (if your test instrument makes a quantitative measurement, it requires periodic calibration).
•  Establish monitoring and measuring processes (calibration procedures and calibration record templates for recording your calibration results).
•  Calibrate your monitoring and measuring equipment using a periodic schedule to ensure that results are valid (you should also perform a yearly evaluation of your calibration results to see if there is a need to increase or decrease your calibration intervals on calibrated test equipment). All calibrations must be traceable to a national or international standard or artifact.

•  Protect your monitoring and measuring equipment (this includes during handling, preservation, storage, transportation, and shipping of all test instruments, including your customer's items and your calibration standards).
•  Confirm that monitoring and measuring software is capable of doing the job you want it to do (your software needs to be validated before being used, and your test instruments may need to be qualified prior to use).
•  Evaluate the validity of previous measurements whenever you discover that your measuring or monitoring equipment is out of calibration. As stated in the FDA regulations, "When accuracy and precision limits are not met, there shall be provisions for remedial action to reestablish the limits and to evaluate whether there was any adverse effect on the device's quality". This is just as applicable when dealing with ISO as with any other standard or regulation, especially when the out-of-tolerance item is a calibration standard, which may have affected numerous items of test equipment over a period of time.

ISO 17025

ISO 17025, General requirements for the competence of testing and calibration laboratories, is applicable to all organizations performing tests and/or calibrations, and to laboratories where testing and/or calibration forms part of inspection and product certification. These include first-, second-, and third-party laboratories. Please keep in mind that if your calibration function and/or metrology department falls under the requirements of your company's quality system, whether it be for compliance to an ISO standard (ISO 9001:2008 or ISO 13485) or an FDA requirement (cGMP, QSR, etc.), then you do not have any obligation to meet the ISO 17025 standard. You already fall under a quality system that takes care of your calibration requirements.

ANSI/NCSL Z540.3-2006

ANSI/NCSL Z540.3-2006 is the American National Standard for Calibration – Requirements for the Calibration of Measuring and Test Equipment.

Collectively, this National Standard includes and updates the relevant calibration system requirements for measuring and test equipment described by the previous standards, Part II of ANSI/NCSL Z540.1 (R2002) and Military Standard 45662A.

The objective of this National Standard is to establish the technical requirements for the calibration of measuring and test equipment. In the development of this National Standard, attention has been given to:
•  expressing the technical requirements for a calibration system supporting both government and industry needs;
•  applying best practices and experience with related national, international, industry, and government standards; and
•  balancing the needs and interests of all stakeholders.

In implementing its objective, this National Standard describes the technical requirements for establishing and maintaining:
•  the acceptability of the performance of measuring and test equipment;
•  the suitability of a calibration for its intended application;
•  the compatibility of measurements with the National Measurement System; and
•  the traceability of measurement results to the International System of Units (SI).

This is done through the use of a system of functional components. In addition, these components are used to manage and assure that the accuracy and reliability of the measuring and test equipment are in accordance with identified performance requirements.

Reference to this National Standard may be made by:
•  customers when specifying products (including services) required;
•  suppliers when specifying products offered;
•  legislative or regulatory bodies;
•  agencies or organizations as a contractual condition for procurement; and
•  assessment organizations in the audit, certification, and other evaluations of calibration systems and their components.

This National Standard is written for both Supplier and Customer, each term being interpreted in the broadest sense. The "Supplier" may be a producer, distributor, vendor, or a provider of a product, service, or information. The "Customer" may be a consumer, client, end-user, retailer, or purchaser that receives a product or service.

This National Standard is specific to calibration systems. A calibration system operating in full compliance with this National Standard promotes confidence and facilitates management of the risks associated with measurements, tests, and calibrations.

Equipment intended for use in potentially explosive atmospheres (ATEX)

What are ATEX and IECEx?

ATEX ("ATmosphères EXplosibles", explosive atmospheres in French) is a standard set in the European Union for explosion protection in industry. The ATEX 95 equipment directive 94/9/EC concerns equipment intended for use in potentially explosive areas. Companies in the EU where the risk of explosion is evident must also use the ATEX guidelines for protecting their employees. In addition, the ATEX rules are obligatory for electronic and electrical equipment that will be used in potentially explosive atmospheres sold in the EU as of July 1, 2003.

The IEC (International Electrotechnical Commission) is a non-profit international standards organization that prepares and publishes International Standards for electrical technologies. The IEC TC/31 technical committee deals with the standards related to equipment for explosive atmospheres. IECEx is an international scheme for certifying procedures for equipment designed for use in explosive atmospheres. The objective of the IECEx Scheme is to facilitate international trade in equipment and services for use in explosive atmospheres, while maintaining the required level of safety.

In most cases, test equipment that is required to be operated in an explosive environment would be qualified and installed by the company's facility services department and not by the calibration personnel. One must also keep in mind that there are two different avenues for the calibration of those pieces of test equipment: on-site and off-site.
If the test instrument that is used in an explosive environment must be calibrated on-site (in the explosive environment), then all the standards used for that calibration must also comply with explosive environment directives. However, if it is possible to remove the test equipment from the explosive environment when it is due for its periodic calibration, then there is no requirement for the standards used for the calibration to meet the explosive


environment directives, saving money on expensive standards and possibly on expensive training of calibration personnel in order for them to work in those conditions. Having said that, there may still be a need for the calibration personnel to be aware of the ATEX regulations.

An informative website on ATEX can be found at the following link: http://ec.europa.eu/enterprise/atex/indexinfor.htm. Several languages are available for retrieving the information. Another informative website is that of the International Electrotechnical Commission Scheme for Certification to Standards Relating to Equipment for Use in Explosive Atmospheres (IECEx Scheme). The link is: http://www.iecex.com/guides.htm.

References
1.  Bucher, Jay L. 2007. The Quality Calibration Handbook. Milwaukee: ASQ Quality Press.
2.  The Story of the Egyptian Cubit. http://www.ncsli.org/misc/cubit.cfm (18 October, 2008)
3.  21 CFR Parts 211.68 and 211.160. http://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfcfr/CFRSearch.cfm?CFRPart=211/ (5 July, 2012)
4.  21 CFR Part 11. http://www.fda.gov/downloads/RegulatoryInformation/Guidances/ucm125125.pdf (5 July, 2012) and http://www.fda.gov/RegulatoryInformation/Guidances/ucm125067.htm
5.  GAMP. http://en.wikipedia.org/wiki/Good_Automated_Manufacturing_Practice (5 July, 2012)
6.  NCSL International. 2006. ANSI/NCSL Z540.3-2006. Boulder, CO.



A basic quality calibration program

R&D departments are tasked with coming up with the answers to many problems; the cure for cancer is one of them. Let's imagine that the Acme Biotech Co. has found the cure for cancer. Their R&D section sends the formula to their operations and manufacturing division, but the cure cannot be replicated with consistent results: measurements made by R&D are different from those made by the operations section. They are not using calibrated test instruments in the company. The company loses time, money, its reputation, and possibly the ability to stay in business, simply because it does not use calibrated test equipment. If all test equipment were calibrated to a traceable standard, then repeatable results would ensure that what is made in one part of the company can also be repeated in another part of the company. A fairy tale? Not hardly. This scenario is repeated every day throughout the world.

Because of poor or incorrect calibration, or by using incorrect calibrations, all of us pay more at the gas pump, for food weighed incorrectly at the checkout counter, and for manufactured goods that do not meet their stated specifications. Without calibration, airliners fly into mountaintops and off the ends of runways because they don't know their altitude and/or speed. Babies are not correctly weighed at birth. The amount of drugs confiscated in a raid determines whether the offense is a misdemeanor or a felony – which weight is correct? Crime labs cannot identify the remains of victims, or wrongly identify victims in the case of mass graves, and criminals are either not convicted or are released on bad evidence. Incorrect amounts of ingredients in your prescription and over-the-counter (OTC) drugs can cost more money, or even cause illness or death. As one can see, having the correct measurements throughout any and all industries is critical to national and international trade and commerce.

The bottom line is this: all test equipment that makes a quantitative measurement requires periodic calibration. Many people are under the misconception that an item must be adjusted or aligned in order to be calibrated. Nothing could be further from the truth. It does not make any difference if you adjust, align or repair the item, nor if you cannot adjust or align it. The comparison to a standard that is more accurate, no matter the circumstances, is called calibration. Before we can get any deeper into what traceability is, we need to clarify two definitions that are critical to this subject – calibration and traceability.

By definition, calibration is a comparison of two measurement devices or systems, one of known uncertainty (your standard) and one of unknown uncertainty (your test equipment or instrument). The calibration of any piece of equipment or system is simply a comparison between the standard being used (with its known uncertainty) and the unit under test (UUT) or test instrument that is being calibrated (whose uncertainty is unknown – and that is why it is being calibrated). It is as simple as that.

Traceability is the property of the result of a measurement or the value of a standard whereby it can be related to stated references, usually national or international standards, through an unbroken chain of calibrations all having stated uncertainties. There are basically two ways to maintain traceability during calibration: the use of an uncertainty budget (performing uncertainty calculations for each measurement), and the use of a test uncertainty ratio (TUR) of ≥ 4:1.

However, before we go any further, we should explain two different traceability pyramids. When we talk about traceability to a national or international standard, the 'everyday calibration technician' is usually situated close to the bottom of the pyramid, so a graphic illustration of these pyramids is important. The two examples in figures 1 and 2 are similar, but differ depending on where you are in the chain.

[Figure 1. Traceability pyramid: BIPM – NMIs – Reference standards – Working metrology labs – General purpose calibration labs (inside a company) – User's test equipment. Note: NMI = National Metrology Institute.]

[Figure 2. Traceability pyramid: SI units – Primary stds. – Secondary standards – Reference standards – Working standards – User's test equipment.]

First, let's discuss the use of uncertainty budgets. According to the European cooperation for Accreditation of Laboratories publication EAL-G12, Traceability of Measuring and Test Equipment to National Standards – the purpose of which is to give guidance on the calibration and maintenance of measuring equipment in meeting the requirements of the ISO 9000 series of standards for quality systems and the EN 45001 standard for the operation of testing laboratories – paragraphs 4 and 5 are very specific in their requirements:

4  Why are calibrations and traceability necessary?

4.1  Traceability of measuring and test equipment to national standards by means of calibration is necessitated by the growing national and international demand that manufactured parts be interchangeable. Supplier firms that make products, and customers who install them with other parts, must measure with the 'same measure'.

4.2  There are legal as well as technical reasons for traceability of measurement. Relevant laws and regulations have to be complied with, just as much as the contractual provisions agreed with the purchaser of the product (guarantee of product quality) and the obligation to put into circulation only products whose safety, if they are used properly, is not affected by defects. Note: If binding requirements for the accuracy of measuring and test equipment have been stipulated, failure to meet these requirements means the absence of a warranted quality with considerable consequent liability.

4.3  If it becomes necessary to prove absence of liability, the producer must be able to demonstrate, by reference to a systematic and fully documented system, that adequate measuring and test equipment was chosen, was in proper working order and was used correctly for controlling a product.

4.4  There are similar technical and legal reasons why calibration and testing laboratory operators should have consistent control of measuring and test equipment in the manner described.

5  Elements of traceability

5.1  Traceability is characterised by a number of essential elements:
(a)  an unbroken chain of comparisons going back to a standard acceptable to the parties, usually a national or international standard;
(b)  measurement uncertainty: the measurement uncertainty for each step in the traceability chain must be calculated according to agreed methods and must be stated so that an overall uncertainty for the whole chain may be calculated;
(c)  documentation: each step in the chain must be performed according to documented and generally acknowledged procedures, and the results must equally be documented;
(d)  competence: the laboratories or bodies performing one or more steps in the chain must supply evidence for their technical competence, e.g. by demonstrating that they are accredited;
(e)  reference to SI units: the chain of comparisons must end at primary standards for the realization of the SI units;
(f)  re-calibrations: calibrations must be repeated at appropriate intervals; the length of these intervals will depend on a number of variables, e.g. uncertainty required, frequency of use, way of use, and stability of the equipment.
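Elements (a) and (b) are mechanical enough to sketch in code: walk the chain of comparisons, checking that every link states an uncertainty and that the chain ends at an accepted standard, then combine the per-step standard uncertainties in quadrature (the GUM's root-sum-square rule for uncorrelated contributions) into an overall uncertainty for the whole chain. The class, function and device names below are illustrative only, not taken from EAL-G12 or any calibration software:

```python
import math
from dataclasses import dataclass
from typing import Optional

@dataclass
class ChainLink:
    """One step in a traceability chain: a device plus the comparison
    that calibrated it (its stated standard uncertainty, and against what)."""
    name: str
    uncertainty: Optional[float]                      # None = not stated
    calibrated_against: Optional["ChainLink"] = None  # None = top of the chain

def chain_uncertainties(link, accepted_tops):
    """Element (a): collect each step's stated uncertainty, failing if a
    step omits one or the chain never reaches an accepted standard."""
    out = []
    while link is not None:
        if link.uncertainty is None:
            raise ValueError(f"broken chain: {link.name} has no stated uncertainty")
        out.append(link.uncertainty)
        if link.name in accepted_tops:
            return out
        link = link.calibrated_against
    raise ValueError("chain does not end at an accepted standard")

def overall_uncertainty(link, accepted_tops):
    """Element (b): overall standard uncertainty for the whole chain,
    assuming independent contributions (root sum of squares)."""
    return math.sqrt(sum(u * u for u in chain_uncertainties(link, accepted_tops)))
```

A user's gauge calibrated against a working standard, itself calibrated against an NMI reference, reports an overall uncertainty; remove a stated uncertainty anywhere in the chain and traceability is lost.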

5.2  In many fields, reference materials take the position of physical reference standards. Certification of reference materials is a method that is often used to demonstrate traceability to SI units. It is equally important that such reference materials are traceable to relevant SI units.

Where does this ratio of four to one (4:1) come from? It comes from the American National Standard for Calibration (ANSI/NCSL Z540.3-2006), which states: "Where calibrations provide for verification that measurement quantities are within specified tolerances… Where it is not practical to estimate this probability, the TUR shall be equal to or greater than 4:1." So, if a TUR of equal to or greater than 4:1 is maintained, then traceability is assured without using uncertainty budgets or calculations. To maintain traceability, you must ensure your standards are at least four times (4:1) more accurate than the test equipment being calibrated.

The other document that goes hand-in-hand with this is EA-4/02, Expression of the Uncertainty of Measurement in Calibration. The purpose of this document is to harmonise evaluation of uncertainty of measurement within EA; to set up, in addition to the general requirements of EAL-R1, the specific demands in reporting uncertainty of measurement on calibration certificates issued by accredited laboratories; and to assist accreditation bodies with a coherent assignment of best measurement capability to calibration laboratories accredited by them. As the rules laid down in this document are in compliance with the recommendations of the Guide to the Expression of Uncertainty in Measurement, published by seven international organisations concerned with standardisation and metrology, the implementation of EA-4/02 will also foster the global acceptance of European results of measurement.

By understanding and following both of these documents, a calibration function can easily maintain traceable calibrations for the requirements demanded by its customers and the standard or regulation that the company needs to meet. Keep in mind that a TUR of 4:1 somewhere along the chain of calibrations may not have been feasible; in that case, uncertainty calculations were performed and their uncertainty stated on the certificate of calibration. This is correct and acceptable.

In most circumstances, the need to maintain a TUR of 4:1 comes into play at the company or shop level, where the customer's test equipment is usually used for production or manufacturing purposes only.

Let's take for example a calibration program that has six calibration technicians on staff. Four of them work in another facility calibrating the same types of equipment as the other two. The other two have far more experience and, through no fault of their own, do not use the calibration procedures that are required by their quality system. They have calibrated the same items for several years and feel there is nothing new to learn. One of the four calibration technicians (who are always following the calibration procedures) finds there is a faster, more economical way to perform a specific calibration. They submit a change proposal for the calibration procedure, and everyone is briefed and trained on the new technique. The four calibration technicians that have been following the calibration procedure improve their production and save the company money. The two 'old timers' have a reduction in their production and actually cost the company money. If everyone had been using the calibration procedures like they were supposed to, this would not have happened. Process improvements cannot take place across the department if everyone is not doing the job the same way each and every time they perform a calibration.

We are not ignorant enough to believe that when calibration technicians have performed a particular calibration hundreds or even thousands of times they are going to follow the calibration procedure word for word. Of course not. But they must have their calibration procedure on hand each time they are performing the calibration, and if a change has been made to that procedure, the calibration technician must be trained on the change before they can perform the calibration.
So how does calibration and traceability fit into the big picture? What does the big picture look like? Why do you need a quality calibration program? You need to establish a quality calibration program to ensure that all operations throughout the metrology department occur in a stable manner. The effective operation of such a system will hopefully result in stable processes and, therefore, in a consistent output from those processes. Once stability and consistency are achieved, it is possible to initiate process improvements. This is applicable in every phase of a production and/or manufacturing program, but it is especially true in a metrology department. Whenever a procedure is changed, the technicians must be briefed and trained on the change, and the appropriate documentation completed to show that training

was accomplished and signed off. When the proper training is not documented and signed off by the trainer and trainee, then it is the same as if the training never happened.

What is a quality calibration program? A quality calibration program consists of several broad items referred to in the Quality System Regulation (QSR) from the Food and Drug Administration (FDA). One of the most stringent requirements can be found in the current Good Manufacturing Practice (GMP) regulations. These items are also referred to by other standards (ISO 9000, etc.) and regulations throughout most industries that regulate or monitor production and manufacturing of all types of products. The basic premise and foundation of a quality calibration program is to "Say what you do, Do what you say, Record what you did, Check the results, and Act on the difference".
Let's break these down into simple terms.

"Say what you do" means write in detail how to do your job. This includes calibration procedures, work instructions and standard operating procedures (SOPs).

"Do what you say" means follow the documented procedures or instructions every time you calibrate, change a process, or perform a function that follows specific written instructions.

"Record what you did" means that you must record the results of your measurements and adjustments, including what your standard(s) read or indicated both before and after any adjustments might be made.

"Check the results" means make certain the test equipment meets the tolerances, accuracies, or upper/lower limits specified in your procedures or instructions.

"Act on the difference" means that if the test equipment is out of tolerance, you are required to inform the user/owner of the equipment, because they may have to re-evaluate manufactured goods, change a process, or recall a product.

All of your calibration procedures should be formatted the same as other SOPs within your company. Here is an example of common formatting for SOPs:

1.  Procedures
2.  Scope
3.  Responsibilities
4.  Definitions
5.  Procedure
6.  Related Procedures
7.  Forms and Records
8.  Document History

After section 4, you should have a table listing all of the instruments or systems that would be calibrated by that procedure, along with their range and tolerances. After that, you should have a list of the standards to be used to calibrate the items, with their part number and range/tolerance. This table should also include the standard's range and specifications; it should show which standards accomplish the calibration of a specific range and/or function. Then the actual calibration procedure starts in section 5.

There are, generally speaking, two types of calibration procedures:

Generic: temperature gages and thermometers, pressure and vacuum gages, micrometers, power supplies and water baths. Generic SOPs are written to show how to calibrate a large variety of items in a general context.

Specific: spectrophotometers, thermal cyclers, pipettes, and balances/scales. Specific SOPs are written to show step-by-step procedures for each different type of test instrument within a group of items. Possibly, the calibration form is designed to follow specific steps (number-wise), which removes doubt by the calibration technician on what data goes into which data field.

Manufacturer's manuals usually provide an alignment procedure that can be used as a template for writing a calibration procedure.

"Do what you say" means follow the documented procedures or instructions every time you calibrate, or perform a function that follows specific written instructions. This means following published calibration procedures every time you calibrate a piece of test equipment. A complete calibration must be performed prior to any adjustment or alignment. Have the latest edition of the procedure available for use by your calibration technicians, and have a system in place for updating your procedures.
An alignment procedure and/or preventive maintenance inspection (PMI) may be incorporated into your SOP, as long as it is separate from the actual calibration procedure.
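The section-4 tables described above (the instruments a procedure covers, and the standards that accomplish each range) map naturally onto a small data structure. A sketch with hypothetical instrument and standard IDs, ranges and tolerances:

```python
# Hypothetical entries for a generic pressure-gage SOP: each instrument
# carries its range and tolerance; each standard its range and specification.
INSTRUMENTS = [
    {"id": "PG-001", "range": (0.0, 10.0), "tolerance": 0.1},    # bar
    {"id": "PG-002", "range": (0.0, 100.0), "tolerance": 1.0},
]
STANDARDS = [
    {"id": "STD-07", "range": (0.0, 20.0), "specification": 0.01},
    {"id": "STD-12", "range": (0.0, 200.0), "specification": 0.2},
]

def standards_covering(instrument):
    """List the standards whose range spans the instrument's full range,
    i.e. which standards accomplish the calibration of that range."""
    lo, hi = instrument["range"]
    return [s["id"] for s in STANDARDS
            if s["range"][0] <= lo and s["range"][1] >= hi]
```

Keeping the tables in one place, as the SOP format does, makes it trivial to answer "which standard do I use for this instrument?" without re-reading the whole procedure.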

What do you do when you need to make an improvement, or update your calibration procedures and/or forms? A formal, written process must be in place, to include:
•  Who can make changes
•  Who is the final approval authority
•  A revision tracking system
•  A process for validating the changes
•  An archiving system for old procedures
•  Instructions for posting new/removal of old procedures
•  A system for training on revisions
•  A place to document that training was done
Train your technicians on the changes made to your procedures every time a procedure is changed or improved, and document the training.

"Record what you did" means that you must record the results of your measurements and adjustments, including what your standard(s) read or indicated both before and after any adjustments are made, and keep your calibration records in a secure location. Of course there are many ways to accomplish this, including:
•  pen and paper
•  "do-it-yourself" databases, e.g. Excel, Access
•  the calibration module of a computerized maintenance management system (CMMS)
•  calibration software specifically designed for that purpose

Certain requirements must be documented in each calibration record. These include the identification of the test instrument with a unique identification number. The date of calibration, the last time it was calibrated, and the next time it will be due for calibration should be on the form, as well as the test instrument's 'As Found' and, when applicable, 'As Left' readings. A history of each calibration and a traceability statement or uncertainty budget must be included. The location where the test instrument can be found should also be on the record.
There should be a place to show what the standard read prior to alignment, adjustment or repair.
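Pulled together, the record requirements just listed suggest a minimal record structure. This sketch uses illustrative field names and values, not those of any particular calibration software:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional, Tuple

@dataclass
class CalibrationRecord:
    """Minimal sketch of the required calibration-record fields."""
    instrument_id: str                     # unique identification number
    location: str                          # where the instrument can be found
    calibration_date: date
    last_calibrated: Optional[date]
    next_due: date
    procedure: str                         # calibration procedure used
    procedure_revision: str
    standards_used: List[str] = field(default_factory=list)
    # (standard reading, UUT reading) pairs, before and after any adjustment:
    as_found: List[Tuple[float, float]] = field(default_factory=list)
    as_left: List[Tuple[float, float]] = field(default_factory=list)
    traceability: str = "TUR >= 4:1 maintained"
```

The 'As Left' list stays empty when no adjustment, alignment or repair was made; the traceability field would instead carry an uncertainty budget reference where one was used.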

If an out-of-tolerance (OOT) condition is found, record the reading (on the standard and the UUT) and continue with the rest of the calibration to the end of the calibration procedure. If one were to stop at the point where an OOT is found and make an adjustment, there is a good possibility that the adjustment affected other ranges or parts of the calibration. This is why the entire calibration is performed prior to adjustment or alignment: the entire calibration is performed to see if any part of the calibration is out of tolerance. Usually, an 'As Found' calibration is performed, collecting the 'As Found' data. Then, after the UUT is adjusted to be as close to optimum as possible, a complete calibration is again performed, collecting the 'As Left' readings for the final calibration record. This is a best practice policy that has been in use in the metrology community since calibration started.

'As Left' readings are taken after repair, alignment or adjustment, and after a final adjustment was completed to bring the unit back into tolerance. Not all UUTs would be considered OOT when 'As Left' readings are taken. In some circumstances, it might be metrology department policy to adjust an item if it is more than ½ beyond its in-tolerance range, while still meeting its specifications. In that case there would not be a problem, since the item was found to be in tolerance during the first calibration, but this should be noted in the calibration record. The standard reading, from the working or reference standard you are using to calibrate the UUT, will also be recorded on the calibration form.

Another example would be when a preventive maintenance inspection is going to be performed on an item. An 'As Found' calibration is performed; then the PMI is completed; then a final 'As Left' calibration would be performed. If the item is found to be out of tolerance at that time, it would be obvious that something happened during the cleaning, alignment or adjustment, and the UUT is treated the same as any OOT unit.

There will also be times when an instrument has a catastrophic failure. It just dies and cannot be calibrated. In this type of situation, once the problem is found and repaired, a complete calibration is performed, but you would not have been able to collect the original 'As Found' readings.

Usually, during a calibration the standard is set at a predetermined output, and the UUT is read to see how much it deviates from the standard. However, there will be times when this is not possible. One example, where it would not be practical to set the standard and take a reading, is during the calibration of water baths. The water bath is set to a predetermined temperature, and the temperature standard is used to record the actual reading. Compare this to the calibration of pressure gages, where a pressure standard is set to a standard pressure.
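The "never stop at the first OOT point" policy described above can be made concrete: read every point, record each standard/UUT pair, and only afterwards decide whether the run contained an out-of-tolerance condition. A sketch (the tolerance and readings are invented):

```python
def deviation(standard_reading, uut_reading):
    """Calibration is a comparison: how far the UUT reads from the standard."""
    return uut_reading - standard_reading

def run_full_calibration(points, tolerance):
    """Complete the whole procedure regardless of individual failures.
    `points` is a list of (standard_reading, uut_reading) pairs; returns
    every recorded deviation plus whether any point was out of tolerance."""
    recorded = [(std, uut, deviation(std, uut)) for std, uut in points]
    any_oot = any(abs(d) > tolerance for _, _, d in recorded)
    return recorded, any_oot
```

An in-tolerance 'As Found' run needs no adjustment; an OOT run is followed by adjustment and then a second, 'As Left', run of the same points.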

The gage(s) under test are then read, their pressures recorded on the calibration record, and a determination is made as to whether the UUT is in or out of tolerance. The same happens when calibrating thermometers: they, along with the standard, are placed in a dry block and a particular temperature is set. The UUT is compared to the reference after equilibration, and compared to the standard to see if it is in or out of tolerance. The same is true of the calibration of autoclaves: they are set to complete a sterilization cycle, a temperature device records all of the temperature readings throughout the cycle, and the readings are checked to see if the autoclave met its specifications. As can be seen from the above examples, it is not always possible to set the standard and take a reading from the UUT.

Also on the calibration form should be an area to identify the standard(s) that were used, plus their specifications and range, along with their next calibration due date(s). There must be a statement showing traceability to your NMI (or, in the case of most companies in the USA, to NIST), or to any artifact that was used as a standard. You should include any uncertainty budgets if used, or at least a statement that a TUR of ≥ 4:1 was met. There should also be a place to identify which calibration procedure was used, along with the procedure's revision number.

According to NCSL International's Calibration Control Systems for the Biomedical and Pharmaceutical Industry – Recommended Practice RP-6, paragraph 5.11: "The calibration environment need be controlled only to the extent required by the most environmentally sensitive measurement performed in the area." According to ANSI/NCSL Z540.3-2006, paragraph 5.3.6, Influence factors and conditions: "All factors and conditions of the calibration area that adversely influence the calibration results shall be defined, monitored, recorded, and mitigated to meet calibration process requirements. Calibration shall be stopped when the adverse effects of the influence factors and conditions jeopardize the results of the calibration. Note: Influencing factors and conditions may include temperature, vibration, humidity, electromagnetic interference, dust, etc."

List environment conditions when appropriate and show if they pass or fail. If the conditions within the area where calibrations are being performed require monitoring according to the standard or requirements that must be met, then a formal program must be in place for tracking those conditions and reviewing the data. If this is the case, then there should be a place in the calibration form for showing that those conditions were either met, were not met, or are not applicable to that calibration.

There should be an area set aside in the calibration form for making comments or remarks. Enough space should be available for the calibration technician to include information about the calibration: OOT conditions, what was accomplished if an OOT was found, etc. You should also indicate on the form whether the calibration passed or failed. If the UUT had an out-of-tolerance condition, then there should be a place to show what happened to the UUT, with the following possibilities as an example:
•  The user/customer was notified and the UUT was adjusted and meets specifications.
•  The user/customer was notified and the UUT was taken out of service and tagged as unusable.
•  The user/customer was notified and the UUT was given a ‘limited calibration’ with their written approval.
Notice that in each circumstance the user/customer must be notified of any and all OOTs. The user/customer, even if internal to the company performing the calibrations, must be informed if their test equipment does not meet their specifications.

And finally, the calibration record must be signed and dated by the technician performing the calibration. In some instances, the calibration record requires a ‘second set of eyes’: an individual higher up the chain of command (supervisor, manager, QA inspector, etc.) must review the calibration record and also sign and date that it has been reviewed, audited, or inspected before it is considered a completed record. If this is the case, there should be a place on the form for the final reviewer to sign and date. This is called for in all of the standards and regulations.

What do you do if, after recording your results, you find that you have made an error or transposed the wrong numbers, and want to correct the error? For hard copy records, do not use white-out or erase the original data. Draw a single line through the entry, write the correct data, and then place your initials and date next to the data using black ink. For making corrections to electronic records (eRecords), use whatever tracking system the software uses and date and sign accordingly, or make a duplicate record from scratch with the correct data and explain in the comments block what happened.

There should be only one way to file your records, both hard copy and eRecords.

No matter which system you use, put it into your written procedures. An example for filing hard copy records:
•  Each record is filed by its unique ID number
•  Records are filed with the newest in the front
•  Records are filed within a specified time frame
An example for filing eRecords:
•  Filed by ID number, calibration certificate number and calibration date
•  Placed on a secure drive that has regular backup
•  eRecords are filed within a specified time frame

There are many different ways to manage your calibration data, since there are a variety of ways to collect that data. Hard copy records collected during the calibration of test instruments have already been discussed in detail. But the collection of data by electronic means, through process controllers or through the use of calibration software, should also be considered. The use of computerized data collection brings with it not only increased productivity and savings in time and effort, but also new problems in how to collect, manage, review and store the data. How is the data collected and stored? Is it in its native format, or dumped into a spreadsheet for analysis? Is the system validated and the instrumentation qualified prior to use? All of these questions need to be considered to allow for review, analysis, and/or compilation into your forms, and eventual storage. If you are using any type of computerized system, validation of that software is mandatory; the criticality of validating your software cannot be emphasized enough. You should also consider accuracies, data lines and storage systems when going entirely electronic with your calibration records and data management.

Calibration forms should have the ranges and tolerances listed for each piece of test equipment being calibrated. In some instances it is apparent what the tolerances will be for the items being calibrated; in other cases it is not quite so apparent. “Check the results” means make certain the test equipment meets the tolerances, or the upper/lower limits, specified in your procedures or instructions.
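"Check the results" amounts to a limit test at each calibration point. A minimal sketch (illustrative only; the function name and data shapes are invented, not taken from any calibration package):

```python
def check_results(readings, lower, upper):
    """Check each reading against the lower/upper limits specified
    in the procedure. Returns (passed, list_of_failing_readings)."""
    failures = [r for r in readings if not (lower <= r <= upper)]
    return len(failures) == 0, failures

# Three readings of a nominal 100.0 unit against +/-0.5 limits:
passed, bad = check_results([99.8, 100.1, 100.4], lower=99.5, upper=100.5)
# passed is True and bad is empty
```

In a real form the limits would come from the instrument's tolerance record rather than being typed in by hand.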

“Act on the difference” means that if the test equipment is out of tolerance, you must inform the user, because they may have to re-evaluate manufactured goods, change a process or procedure, or recall product. According to the FDA: “When accuracy and precision limits are not met, there shall be provisions for remedial action to reestablish the limits and to evaluate whether there was any adverse effect on the device’s quality.”

You should have a written procedure in place that explains in detail:
•  What actions are to be taken by the calibration technician
•  What actions are to be taken by the department supervisor and/or manager
•  What actions are to be taken by the responsible owner/user of the OOT test equipment

You should have an SOP that explains the responsibilities of the calibration technician:
•  Do they have additional form(s) to complete when OOT conditions are found?
•  Do they require a ‘second set of eyes’ when/if an OOT is found?
•  Have they been trained and signed off that they know all the proper procedures when an OOT has been found?

You should have an SOP that explains the responsibilities of the supervisor/manager:
•  Who notifies the customer: the technician, supervisor or manager?
•  Is a database maintained on all OOT test equipment?
•  Is the customer/user required to reply to the OOT notification, and if so, is there a time limit and a paper trail for historical reference?
•  Are there any databases that need to be updated, or is upper management notified in case of inaction?
•  After owner/user notification, is the calibration department responsible for anything else?
•  Is the final action by the owner/user sent back for filing or archiving? Usually the department that generates an action item is responsible for final archiving.

Do you have a database of all OOT test equipment for various activities?
•  The database can be used for yearly calibration interval analysis.
•  During an audit/inspection (both internal and external), access to past OOT data should be easily available.
•  Access to OOT data can assist in determining the reliability of test equipment.

Here is a hypothetical example: from an historical perspective, generally 85% of test equipment passes calibration. Among the 15% that are found to be OOT, some will be due to the causes discussed below.

[Figure: Typical calibration process shown as a flow chart – Start → ‘As Found’ test → Save ‘As Found’ results → Adjustment required? (No → End; Yes → Adjust as needed → ‘As Left’ test → Within limits? No → Adjust again; Yes → Save ‘As Left’ results → End)]
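The flow chart above can be sketched as code. This is a rough outline only, not the book's procedure; `perform_test`, `adjust` and `within_limits` are hypothetical stand-ins for the real bench work:

```python
def calibrate(perform_test, adjust, within_limits, max_adjustments=3):
    """Typical calibration process: save 'As Found', adjust only if
    required, re-test until the 'As Left' result is within limits."""
    as_found = perform_test()
    record = {"as_found": as_found}
    if within_limits(as_found):
        record["as_left"] = as_found   # no adjustment required
        return record
    as_left = as_found
    for _ in range(max_adjustments):
        adjust()                       # adjust as needed
        as_left = perform_test()       # 'As Left' test
        if within_limits(as_left):
            break
    else:
        record["failed"] = True        # never came back within limits
    record["as_left"] = as_left
    return record

# A drifting instrument reads 1.2 against +/-1.0 limits; the (simulated)
# adjustment brings it to 0.5, which is saved as the 'As Left' result.
state = {"v": 1.2}
rec = calibrate(lambda: state["v"],
                lambda: state.update(v=0.5),
                lambda v: abs(v) <= 1.0)
```

Note that when no adjustment is needed, the 'As Found' readings double as the 'As Left' readings, which matches the short path through the flow chart.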

Typical causes include operator error, bad standards, bad cables/accessories, poorly written calibration procedures, and environmental conditions (vibration, etc.). If a higher fail rate is noticed, check that the proper specifications are being used before changing calibration intervals.

Developing a world-class calibration program

A quality calibration program might be compared to an iceberg: only about 10% can be easily seen by the casual observer, yet the unseen portion is what keeps the iceberg afloat and stable in the ocean. The same can be said of a quality calibration program. The “Say what you do, Do what you say, Record what you did, Check the results, and Act on the difference” portion, along with traceability, should be apparent to an auditor or inspector. But the different parts that keep a quality calibration program running efficiently consist of elements from a continuous process improvement program, an effective training program, a comprehensive calibration analysis program, correct and properly used calibration and equipment labels, scheduling and calibration management software, and a visible safety program. Without any one of these programs, a quality calibration program would be impossible to maintain.

Having an effective calibration management program is usually the difference between being proactive and reactive in performing your routine calibrations. It is hard to keep your overdue calibrations at a minimum when all of your time is spent reacting to items that keep coming due without your prior knowledge. This can be compared to the person who is trying to drain the swamp while fighting off the alligators. By knowing what is coming due for calibration, you can schedule your technicians, time and other resources to the best advantage.

Any calibration management program worth the money should have a few critical areas built into its basic program. Those include: a master inventory list; the ability to see a 30-day schedule of items coming due for calibration; items that are OOT; reverse traceability; and the ability to see all items that are currently overdue for calibration. From a managerial standpoint, the calibration management program should also be able to show calibrations and repairs by individual items, groups of items by location/part number, standards, and other listings that help to manage your department.

According to most standards and regulations, any software program used must be validated prior to implementation. This can be accomplished using the manufacturer’s system, or by incorporating an in-house validation system.
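The 30-day "coming due" view mentioned above is, at its core, a date filter over the master inventory list. A minimal sketch using only the standard library (the record fields and IDs are invented for illustration):

```python
from datetime import date, timedelta

def coming_due(inventory, today, window_days=30):
    """Split the master inventory into items already overdue and
    items coming due for calibration within the window."""
    horizon = today + timedelta(days=window_days)
    overdue = [i for i in inventory if i["due"] < today]
    due_soon = [i for i in inventory if today <= i["due"] <= horizon]
    return overdue, due_soon

inventory = [
    {"id": "PG-001", "due": date(2024, 1, 10)},   # pressure gage
    {"id": "TT-014", "due": date(2024, 2, 5)},    # temperature transmitter
    {"id": "FL-202", "due": date(2024, 6, 1)},    # flow meter
]
overdue, due_soon = coming_due(inventory, today=date(2024, 1, 20))
# PG-001 is overdue; TT-014 comes due within 30 days; FL-202 does not
```

A real package would of course read this from its database and add reverse traceability, grouping by location/part number, and so on.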

Either way, your validation paperwork needs to be available for inspection during audits and inspections.

A best practice among experienced calibration practitioners is the calibration of like items, and using your scheduling software to perform calibrations by geographical area or to combine calibrations in local areas. An example would be to start all temperature calibrations (set the water baths up for their initial temperature readings), then perform several pipette or balance calibrations, return to set another temperature in the water baths (doing a few at a time), return to finish the pipette or balance calibrations, then complete the water baths at their final setting. By not having to stand around waiting for the water baths to equilibrate, you are using your time more efficiently. Combining the calibration of like items, and mixing and matching items, can reduce the task of mundane and boring calibrations, increasing productivity and keeping the calibration technician involved and focused instead of bored.

An example of calibrating by area would be to calibrate all pressure gages that are shown to be stored or used in a specific area, environment, or floor of a building. Also, if calibrations are performed in a ‘clean room’ environment, and the calibration technician is required to gown-up prior to entry every time they go into the clean room, then scheduling all of the calibrations in that area could increase production and reduce down time from multiple entries and exits. This would be using your time to the best advantage.

Another critical yet oftentimes misunderstood program is calibration interval analysis. How often should each type of test equipment be calibrated? Should the manufacturer’s recommended interval be the determining factor? Or should the criticality of how the test equipment is used in your particular production or manufacturing line be the deciding vote? Your specific situation should be the driving factor in deciding calibration intervals. You must consider whether the test equipment is used to determine final product where specifications are very tight, used outdoors in severe weather, or used as an item that is coded as “No Calibration Required” on a loading dock. A particular item used in a controlled environment should be more reliable than one used in a harsher situation, so each situation should be considered carefully and reviewed in the appropriate light. Most manufacturers recommend a 12-month calibration interval, but the right interval depends on usage, handling and environment. Calibration interval analysis software can be purchased commercially and used to evaluate your test equipment.

A company could also do their own analysis if they support a limited number of items, or are on a tight budget and are willing to do their own computations. Here is an example:
•  For each type of equipment, collect data over a one-year period on the number of calibrations and the number of items OOT.
•  Take the number of calibrations minus the number of OOTs, divide the result by the number of calibrations, then multiply the result by 100 for the pass rate.
•  Make a risk assessment of each item for your company’s needs, and set a cut-off for increasing or decreasing calibration intervals:
–  Consider increasing a calibration interval if the pass rate is ≥ 95% (by ½ up to double the current calibration interval)
–  Consider decreasing a calibration interval if the pass rate is ≤ 85% (by ¾ to ½ of the current calibration interval)
No matter which route you take for calibration interval analysis, ensure you are on the cutting edge, not the ragged edge, by not extending your intervals too fast without solid data.

NCSL International has RP-1, Establishment & Adjustment of Calibration Intervals. This Recommended Practice (RP) is intended to provide a guide for the establishment and adjustment of calibration intervals for equipment subject to periodic calibration. It provides the information needed to design, implement and manage calibration interval determination, adjustment and evaluation programs. Several methods of calibration interval analysis and adjustment are presented; the advantages and disadvantages of each method are described, and guidelines are given to assist in selecting the best method for a requiring organization. Both management and technical information are presented in this RP.

The cost and risk of not calibrating

Are there costs and/or risks associated with not calibrating your test equipment? This is a double-edged sword. On one side we have the requirements of the standards and regulations that govern various companies, industries and even countries. Not only is calibration a requirement, it is one of the foundations for any quality system in the 21st century. Also, recalls can be very expensive in time and money, and costly to your company’s reputation! It isn’t a question of whether you have a quality calibration program in place, but whether it complies with all the requirements of the appropriate standard or regulation to which your company must conform.
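The in-house pass-rate computation described above is simple arithmetic. A sketch of that step together with the suggested cut-offs (the 95%/85% thresholds are the ones given in the text; the function names are made up):

```python
def pass_rate(num_calibrations, num_oot):
    """(calibrations - OOTs) / calibrations * 100, per equipment type."""
    return (num_calibrations - num_oot) / num_calibrations * 100

def interval_recommendation(rate):
    """Suggested direction for the calibration interval, using the
    cut-offs from the text; the final call is a risk assessment."""
    if rate >= 95:
        return "consider lengthening (by 1/2 up to double the interval)"
    if rate <= 85:
        return "consider shortening (to 1/2..3/4 of the interval)"
    return "leave interval unchanged"

# 40 calibrations with 1 OOT over the year gives a 97.5% pass rate,
# so lengthening the interval could be considered.
rate = pass_rate(40, 1)
```

This only flags candidates; as the text warns, intervals should not be extended without solid data and a risk assessment behind the change.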

The other side of the double-edged sword is having a calibration program in place without any type of quality, traceability or documentation. This would equate to not having any type of calibration program at all. The basic belief is this: it is absolutely essential to have a quality calibration program in place to make a quality product. If a manufacturer produces any type of product or service, no matter the size, shape, or quantity, where repeatable measurements take place, then their test equipment/instruments need to have repetitive outputs. Without calibration to a traceable standard (national, international, or intrinsic), there can be no repeatability; therefore there can be no quality in the product, and the company would never be able to stay in business long enough to impact their market segment.

The question that should be asked is: “Do you have a quality calibration program that has traceable results to a national or international standard?” If the answer is yes, then it is assumed that to have a quality calibration program, you must also have all the parts needed to support traceable calibration: calibration procedures, calibration records, traceable documentation, documented training for all your calibration technicians, document control procedures, an out-of-tolerance program and procedures, calibration interval analysis, continuous process improvements, a comprehensive calibration management software package, and the ability to provide quality customer service in a timely manner. Then you can say you have a quality calibration program.

So is there cost and risk? Absolutely. The cost is huge in terms of lost production, time, money, and reputation. In the case of companies that have untraceable calibration in the production of medical devices, pharmaceutical drugs and products that impact human safety, the cost could be immeasurable, with the possibility of death among the results.

But it doesn’t end there: what are the responsibilities of a quality calibration department, and also those of their customer? A calibration/metrology department should be responsible for:
•  Listening to their customers to understand their requirements and needs

•  Translating those requirements into the accuracy and specifications of the test equipment and support services that meet or exceed their quality expectations
•  Delivering test equipment that consistently meets requirements for reliable performance
•  Providing knowledgeable and comprehensive test equipment support
•  Continuously reviewing and improving their services and processes

Your customers should be responsible for:
•  Informing Metrology of their requirements and needs
•  Getting the proper training in the correct and safe usage of test equipment
•  Maintaining their test equipment without abusing, contaminating or damaging it under normal operating conditions
•  Using their work order system for requesting service when equipment is broken, malfunctioning, or in need of calibration

As Lord Kelvin was quoted as saying: “If you cannot measure it, you cannot improve it.”

References
1.  Bucher, Jay L. The Quality Calibration Handbook. Milwaukee: ASQ Quality Press, 2007.
2.  EAL-G12, Traceability of Measurement. Edition 1, November 1995.
3.  EA-4/02, Expression of the Uncertainty of Measurement in Calibration. December 1999 rev00.
4.  NCSL International. RP-6, Calibration Control Systems for the Biomedical and Pharmaceutical Industry. Boulder, CO, 1999.
5.  ANSI/NCSL Z540.3-2006. Boulder, CO: NCSL International, 2006.



Traceable and efficient calibrations in the process industry

Today’s modern process plants, production processes and quality systems put new and tight requirements on the accuracy of process instruments and on process control. Quality systems, such as the ISO9000 and ISO14000 series of quality standards, call for systematic and well-documented calibrations. Does this mean that the electricians and instrumentation people should be calibration experts, with regard to accuracy, uncertainty, repeatability, confidence levels etc.? Not really, but this topic should not be ignored: some understanding of the techniques, terminology and methods involved in calibration must be known and understood in order to perform according to international quality systems. Fortunately, modern calibration techniques and calibration systems have made it easier to fulfill the requirements on instrumentation calibration and maintenance in a productive way.

1.  What is calibration and why calibrate

Calibration can briefly be described as an activity where the instrument being tested is compared to a known reference value. The keywords here are ‘known reference’, which means that the calibrator used should have a valid, traceable calibration certificate. To be able to answer the question why calibrate, we must first determine what measurement is and why measuring is necessary.

WHAT IS MEASUREMENT?

In the terms of technical standards, the word measurement has been defined as: “A set of experimental operations for the purpose of determining the value of a quantity.” What, then, is the value of a quantity? According to the standards, the true value of a quantity is: “The value which characterizes a quantity perfectly defined during the conditions which exist at the moment when the value is observed.” Note: the true value of a quantity is an ideal concept and, in general, it cannot be known. Therefore, all instruments display false indications!

[Figure: Hierarchy of accuracy – true value; international standard; national standard; authorized laboratories; instrument departments (house and working standards); process instrumentation.]

2.  Why measure?

The purpose of a process plant is to convert raw material, energy, manpower and capital into products in the best possible way, which must be done better than the competitors. This conversion always involves optimizing. In practice, optimization is done by means of process automation. Anyhow, regardless of how advanced the process automation system is, the control cannot be better than the quality of the measurements from the process.

[Figure: Everything is based on measurements – production factors enter the process and products come out; the process control system relies on measurements and controls, and the instrumentation on measurements and adjustments.]

3.  Why calibrate?

The primary reason for calibrating is based on the fact that even the best measuring instruments lack absolute stability; in other words, they drift and lose their ability to give accurate measurements.

Environment conditions. other quality systems and regulations The ISO9000 and ISO14000 can assist in guiding regular. elapsed time and type of application can all affect the stability of an instrument. Even instruments of the same manufacturer. QUALITY MAINTENANCE QUALITY QP C1 C2 C1–C7 CALIBRATIONS C3 C4 “GOOD AS NEW” C5 C6 C7 QM LOWER TOLERANCE Q1 Q2 Q3 QZM PURCHASE T1 T2 T3 TIME QP – PURCHASED QUALITY QZM – ZERO MAINTAINED QUALITY QM – MAINTAINED QUALITY 60 .traceable and efficient calibrations they drift and lose their ability to give accurate measurements. which produces uniform quality and minimizes the negative impacts on the environment. This drift makes recalibration necessary. systematic calibrations. Other good reasons for calibration are: •  To maintain the credibility of measurements •  To maintain the quality of process instruments at a good-as-new level •  Safety and environmental regulations •  ISO9000. Environment conditions. while another performs differently. One unit can be found to have good stability. elapsed time and type of application can all affect the stability of an instrument. type and range can show varying performance.

4.  Traceability

Calibrations must be traceable. Traceability is a declaration stating to which national standard a certain instrument has been compared. The traceability chain runs from the SI units through international standards, national standards, reference standards and working standards down to process standards.

5.  Regulatory requirements for calibration

5.1  ISO9001: 2008

The organization determines the monitoring and measurements to be performed, as well as the measuring devices needed to provide evidence of a product’s conformity to determined standards. The organization establishes the processes for ensuring that monitoring and measurement are carried out in a manner consistent with the monitoring and measurement requirements.

Where necessary to ensure valid results, measuring equipment is:
•  calibrated or verified with measurement standards traceable to national or international standards at specified intervals; if no such standards exist, the basis used for calibration or verification is recorded
•  adjusted or re-adjusted as necessary
•  identified for determining its calibration status
•  safeguarded against adjustments that would invalidate the measurement result
•  protected from damage and deterioration during handling, maintenance and storage

In addition, the organization assesses and records the validity of the previous measuring results when the equipment is found not to conform to requirements, and takes appropriate action on the equipment and any product affected. Records of the calibration and verification results are maintained. When computer software is used in the monitoring and measurement of specified requirements, its ability to satisfy the intended application is confirmed; this is done prior to initial use and reconfirmed as necessary. Note: See ISO 10012 for further information.

5.2  PHARMACEUTICAL (FDA, U.S. Food and Drug Administration)

Any pharmaceutical company that sells their products in the USA must comply with FDA regulations, regardless of where the products are manufactured.
•  All instrumentation should have a unique ID, and all product, process and safety instruments should be physically tagged.
•  All instruments used must be fit for purpose.
•  A calibration period and error limits should be defined for each instrument.
•  Calibrations must be done in accordance with written, approved procedures.
•  Calibration records must be maintained.
•  There should be a record of the history of each instrument.
•  Calibration standards should be traceable to national and international standards.
•  Calibration standards must be more accurate than the required accuracy of the equipment being calibrated.
•  There must be documented evidence that personnel involved in the calibration process have been trained and are competent.
•  A documented change management system must be in place.
•  All of the above should be implemented in conjunction with the following regulations:
–  21 CFR Part 211 – “Current Good Manufacturing Practice for Finished Pharmaceuticals”
–  21 CFR Part 11 – “Electronic Records; Electronic Signatures”
•  All electronic systems must comply with FDA’s 21 CFR Part 11.

Software systems need features such as Electronic Signature, Audit Trail, User Management, and a Security System to be able to comply with these regulations. In such a system, the Electronic Signature is considered equivalent to a hand-written signature, and users must understand their responsibilities once they give an electronic signature. An Audit Trail is required to support change management; Audit Trails should record all modifications which add, edit, or delete data from an electronic record.
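The audit-trail idea can be pictured in a few lines of code: every add, edit or delete of a record field is logged with who changed it, when, and the old and new values. This is a toy sketch for illustration only, not a 21 CFR Part 11 compliant implementation (a real system also needs secure storage, user management and electronic signatures):

```python
from datetime import datetime, timezone

class AuditedRecord:
    """Toy electronic record that logs every modification in the
    spirit of an audit trail: nothing is overwritten silently."""
    def __init__(self):
        self.fields = {}
        self.audit_trail = []

    def set(self, user, field, value):
        old = self.fields.get(field)
        self.fields[field] = value
        self.audit_trail.append({
            "user": user,
            "time": datetime.now(timezone.utc).isoformat(),
            "field": field,
            "old": old,     # previous value is preserved in the trail
            "new": value,
        })

rec = AuditedRecord()
rec.set("jdoe", "as_left", "pass")
rec.set("jdoe", "as_left", "fail")   # the edit is logged, not hidden
```

The key design point is that corrections never destroy the original entry, mirroring the single-line-through rule for hard copy records.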

5.3  PHARMACEUTICAL (EU GMPs)

Any pharmaceutical company that sells their products in the European Union must comply with EU GMPs, regardless of where the products are manufactured. The requirements for EU GMPs, including Annex 11, are similar to those of the US FDA, as described in Section 5.2.

6.  DEFINITIONS OF METROLOGICAL TERMS

Some metrological terms in association with the concept of calibration are described in this section. Quite a few of the following terms are also used on specification sheets for calibrators. Please note that the definitions listed here are simplified.

Calibration
An unknown measured signal is compared to a known reference signal.

Validation
Validation of measurement and test methods (procedures) is generally necessary to prove that the methods are suitable for the intended use.

Resolution
Resolution is the smallest interval that can be read between two readings.

Non-linearity
Non-linearity is the maximum deviation of a transducer’s output from a defined straight line. Non-linearity is specified by the Terminal Based method or the Best Fit Straight Line method.

Sensitivity
Sensitivity is the smallest variation in input which can be detected as an output. Good resolution is required in order to detect sensitivity.

Hysteresis
Hysteresis is the deviation in output at any point within the instrument’s sensing range, when first approaching this point with increasing values and then with decreasing values.

Accuracy
Generally, accuracy figures state the closeness of a measured value to a known reference value. The accuracy of the reference value is generally not included in the figures. Accuracy is usually expressed as % F.S. (full scale), or as % of RDG (reading) + adder. The difference between these two expressions is great, and the only way to compare accuracy presented in different ways is to calculate the total error at certain points. It must also be checked whether errors like non-linearity, hysteresis, temperature effects etc. are included in the accuracy figures provided.

Repeatability
Repeatability is the capability of an instrument to give the same output among repeated inputs of the same value over a period of time. Repeatability is often expressed in the form of standard deviation.

Temperature coefficient
The temperature coefficient is the change in a calibrator’s accuracy caused by changes in ambient temperature (deviation from reference conditions). It is usually expressed as % F.S./°C or % of RDG/°C.

Stability
Stability, often referred to as drift, is expressed as the change in percentage in the calibrated output of an instrument over a specified period, usually 90 days to 12 months, under normal operating conditions. Drift is usually given as a typical value.
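The point about comparing a % F.S. specification with a % of reading + adder specification can be made concrete by computing the total error at a few points. The numbers below are invented purely for illustration:

```python
def error_percent_fs(full_scale, percent_fs):
    """Total error of a '% of full scale' spec: constant over the range."""
    return full_scale * percent_fs / 100

def error_percent_rdg(reading, percent_rdg, adder=0.0):
    """Total error of a '% of reading + adder' spec: grows with the reading."""
    return reading * percent_rdg / 100 + adder

# A 100 bar range at 0.05% F.S. versus 0.025% of RDG + 0.02 bar:
fs_err = error_percent_fs(100, 0.05)                # 0.05 bar at every point
rdg_err_low = error_percent_rdg(20, 0.025, 0.02)    # 0.025 bar at 20 bar
rdg_err_high = error_percent_rdg(100, 0.025, 0.02)  # 0.045 bar at full scale
```

In this made-up comparison the % of reading spec is better everywhere, but especially in the lower part of the range, which is exactly why the two forms cannot be compared by glancing at the percentage alone.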

Uncertainty
Uncertainty is an estimate of the limits, at a given coverage factor (or confidence level), which contain the true value for a series of measurements made under the same conditions. It should be noted that, in general, errors due to observer fallibility cannot be accommodated within the calculation of uncertainty. Examples of such errors include errors in recording data, errors in calculation, or the use of inappropriate technology.

Some uncertainty components can vary in magnitude and in sign in an unpredictable manner; if there is sufficient resolution in the measurement, there will be an observable difference in the values measured, and the effects of these components appear as measurement errors. The other group of components could be said to be of a systematic nature: systematic errors or effects remain constant during the measurement. Examples of systematic effects include errors in the reference value, ambient conditions, set-up of the measurement, etc.

Uncertainty is evaluated according to either a "Type A" or a "Type B" method.

Type A uncertainty
Type A evaluation involves the statistical analysis of a series of measurements. The Type A method of calculation can be applied when several independent measurements have been made under the same conditions. Standard deviation is used as a measure of the dispersion of values; the standard deviation, often called the "root-mean-square repeatability error", is used for the calculation.

Type B uncertainty
Type B evaluation of uncertainty involves the use of other means to calculate uncertainty, rather than applying statistical analysis to a series of measurements. It involves the evaluation of uncertainty using scientific judgement based on all available information concerning the possible variables. When the uncertainty of a single measurement is expressed, it is calculated using both Type A and Type B uncertainties.
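A minimal numeric sketch of a Type A evaluation (with invented repeat readings): the standard deviation of the readings gives the repeatability error, dividing by √n gives the standard uncertainty of the mean, and multiplying by the standard coverage factor k = 2 gives an expanded uncertainty at approximately 95% confidence:

```python
import statistics
from math import sqrt

# Repeated readings of the same point under the same conditions
# (made-up example data, in bar).
readings = [10.02, 10.00, 10.03, 9.99, 10.01, 10.02, 10.00, 10.01]

n = len(readings)
mean = statistics.mean(readings)

# Type A evaluation: experimental standard deviation of the readings
# ("root-mean-square repeatability error") and the standard
# uncertainty of the mean value.
s = statistics.stdev(readings)
u_type_a = s / sqrt(n)

# Expanded uncertainty with the standard coverage factor k = 2,
# corresponding to roughly 95% coverage for a normal distribution.
k = 2
U = k * u_type_a

print(f"mean = {mean:.4f}, s = {s:.4f}")
print(f"u (Type A) = {u_type_a:.4f}, U (k=2) = {U:.4f}")
```

In a full uncertainty budget this Type A component would first be combined (root-sum-of-squares) with the Type B components before applying the coverage factor.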

Values used in a Type B evaluation may be derived from:
•  Experience with or general knowledge of the behavior and properties of relevant materials and instruments
•  Ambient temperature
•  Humidity
•  Local gravity
•  Atmospheric pressure
•  Uncertainty of the calibration standard
•  Calibration procedures
•  Method used to register calibration results
•  Method used to process calibration results

The proper use of the available information calls for insight based on experience and general knowledge. It is a skill that can be learnt with practice. A well-based Type B evaluation of uncertainty can be as reliable as a Type A evaluation, especially in a measurement situation where a Type A evaluation is based on only a comparatively small number of statistically independent measurements.

Expanded uncertainty
The EA has decided that calibration laboratories accredited by members of the EA shall state an expanded uncertainty of measurement, obtained by multiplying the uncertainty by a coverage factor k. In cases where a normal (Gaussian) distribution can be assumed, the standard coverage factor, k = 2, should be used; the expanded uncertainty then corresponds to a coverage probability (or confidence level) of approximately 95%. Usually one of the following confidence levels is used: 1 s = 68%, 2 s = 95%, 3 s = 99%. For uncertainty specifications, there must be a clear statement of the coverage probability or confidence level.

7. CALIBRATION MANAGEMENT
Many companies do not pay enough attention to calibration management, although it is a requirement e.g. in ISO9001:2008. The maintenance management system may alert when calibration is needed and then open up a work order.

Once the job has been done, the work order will close and the maintenance system will be satisfied. Unfortunately, what happens between the opening and closing of the work order is not documented very often. If something is documented, it is usually in the form of a hand-written sheet that is then archived. If the calibration results need to be examined at a later time, finding the sheets requires a lot of effort.

Choosing professional tools for maintaining calibration records and doing the calibrations can save a lot of time, effort and money. An efficient calibration management system consists of calibration management software and documenting calibrators. Modern calibration management software can be a tool that automates and simplifies calibration work at all levels.

If the software is able to interface with other systems, the scheduling of calibrations can be done in the maintenance system, from which the work orders can be automatically loaded into the calibration management software. When the technician is about to calibrate an instrument, (s)he simply downloads the instrument details from the calibration management software into the memory of a documenting calibrator. The instrument's measurement ranges and error limits are defined in the software and also downloaded to the calibrator. Thus the calibrator is able to detect if the calibration was passed or failed immediately after the last calibration point was recorded. There is no need to make tricky calculations manually in the field, and no printed notes etc. are needed. The "As Found" and "As Left" results are saved in the calibrator's memory.

While the calibration results are uploaded onto the database, the software automatically detects the calibrator that was used, and the traceability chain is documented without requiring any further actions from the user. All this saves an extensive amount of time and prevents the user from making mistakes. Depending on what process variable is calibrated and how many calibration points are recorded, using automated tools can be 5 to 10 times faster compared to manual recording, and there is no need to write down anything with a pen. The increase in work productivity allows for more calibrations to be carried out within the same period of time as before. Calibration records, including the full calibration history of an instrument, are kept in the database.
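The pass/fail decision at the last calibration point is, in essence, a comparison of each recorded error against the error limit downloaded from the software. A minimal sketch, using a hypothetical 0–10 bar transmitter with a ±0.2% of span error limit (all values invented for illustration):

```python
# As Found check of a 0-10 bar transmitter against a ±0.2% of span
# error limit (hypothetical instrument data).

SPAN = 10.0          # bar
ERROR_LIMIT = 0.2    # % of span

# (input applied, output indicated) pairs recorded during calibration
as_found = [(0.0, 0.004), (2.5, 2.508), (5.0, 5.012),
            (7.5, 7.518), (10.0, 10.021)]

def check(points):
    """Return (passed, worst_error_pct) for a list of calibration points."""
    errors = [abs(indicated - applied) / SPAN * 100.0
              for applied, indicated in points]
    worst = max(errors)
    return worst <= ERROR_LIMIT, worst

passed, worst = check(as_found)
print(f"worst error = {worst:.3f}% of span -> {'PASS' if passed else 'FAIL'}")
```

Here the last point exceeds the limit, so the calibration fails as found and an "As Left" calibration after adjustment would be recorded next.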

When an instrument has been calibrated several times, the software displays the "History Trend", which assists in determining whether or not the calibration period should be changed. When all calibration related data is located in a single database, the software is obviously able to create calibration related reports and documents. One of today's trends is to move towards a paperless office. If the calibration management software includes the right tools, it is possible to manage calibration records on a computer without producing any papers. If paper copies of certificates are preferred, printing them must, of course, be possible. If a calibrator drifts out of its specifications, it is possible to use a "reverse traceability report" to get a list of instruments that have been calibrated with that calibrator.

Today's documenting calibrators are capable of calibrating many process signals. It is not very uncommon to have a calibrator that calibrates pressure, temperature and electrical signals, including frequency and pulses. In addition to the conventional mA output of a transmitter, modern calibrators can also read the HART, Foundation Fieldbus or Profibus output of the transmitters, and they can even be used for configuring these "smart" transmitters.

Implementing a modern calibration management system benefits everybody who has anything to do with instrumentation. For instance, the maintenance manager can use it as a calibration planning and decision-making tool for tracking and managing all calibration related activities. QA will find a calibration management system useful: when an auditor comes for a visit, the requested calibration records can be viewed on screen with a couple of mouse clicks, therefore accessing previous results is also possible in just a few seconds. Good calibration tools help technicians work more efficiently and accurately. When many tasks are automated, the users can concentrate on their primary job. If the system manufacturer has paid attention to usability, the system is easy to learn and use.

Transferring to a new calibration system may sound like a huge task, and it can be a huge task. There are probably thousands of instruments that need to be entered into the database, and all the details must be checked and verified before the system is up and running. Although there is a lot of data involved, it does not mean the job is an enormous one.
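At its core, a "reverse traceability report" is a query over calibration records: given a calibrator that has drifted out of specification, find every instrument it was used on. A minimal sketch with an invented record layout (not the schema of any particular product; the calibrator IDs are likewise made up):

```python
# Minimal reverse-traceability query over calibration records
# (record structure and IDs invented for illustration).

records = [
    {"instrument": "PT-101", "calibrator": "CAL-001", "date": "2013-03-02"},
    {"instrument": "TT-205", "calibrator": "CAL-002", "date": "2013-04-11"},
    {"instrument": "PT-330", "calibrator": "CAL-001", "date": "2013-05-20"},
    {"instrument": "FT-017", "calibrator": "CAL-001", "date": "2013-06-07"},
]

def reverse_traceability(records, calibrator_id):
    """Instruments whose calibrations relied on the given calibrator."""
    return sorted({r["instrument"] for r in records
                   if r["calibrator"] == calibrator_id})

# Every instrument calibrated with the drifted calibrator must be rechecked.
print(reverse_traceability(records, "CAL-001"))  # ['FT-017', 'PT-101', 'PT-330']
```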

Nowadays most companies have instrumentation data in some type of electronic format: as Excel spreadsheets, maintenance databases, etc. The vendor of the calibration system is most likely able to import most of the existing data into the calibration database, saving months of work.

CONCLUSION
A good, automated calibration system reduces workload because it carries out tasks faster, more accurately and with better results than what could be reached with a manual system. It assists in documenting, planning, scheduling, analyzing and finally optimizing the calibration work.

References
[1] ISO9001:2008 "Quality Management Systems. Requirements"
[2] 21 CFR Part 11: "Electronic Records; Electronic Signatures"
[3] 21 CFR Part 211: "Current Good Manufacturing Practice for Finished Pharmaceuticals"

Calibration Management and Maintenance

Why calibrate

Why Calibrate? What is the risk of not calibrating?

Calibration can be briefly described as an activity where the instrument being tested is compared to a known reference value. At the simplest level, calibration is a comparison between measurements – one of known magnitude or correctness made or set with one device, and another measurement made in as similar a way as possible with a second device. The device with the known or assigned correctness is called the standard. The second device is the unit under test or test instrument. Calibration is often required with a new instrument or when a specified time period or a specified number of operating hours has elapsed. In addition, calibration is usually carried out when an instrument has been subjected to an unexpected shock or vibration that may have put it out of its specified limits.

Calibration in industrial applications
When a sensor or instrument experiences temperature variations or physical stress over time, its performance will invariably begin to decline, which is known as "drift". This means that measurement data from the sensor becomes unreliable and could even affect the quality of a company's production. Although drift cannot be completely eliminated, it can be discovered and rectified via calibration. The purpose of calibration is to determine how accurate an instrument or sensor is. Although most instruments provide high accuracy these days, regulatory bodies often need to know just how inaccurate a particular instrument is and whether it drifts in and out of specified tolerance over time.



The costs and risks of not calibrating
Unfortunately, calibration has costs associated with it, and in uncertain economic times this activity can often become neglected, or the interval between calibration checks on instruments can be extended, in order to cut costs or simply through a lack of resources or manpower. However, neglecting calibration can lead to unscheduled production or machine downtime, product and process quality issues or even product recalls and rework. Furthermore, if the instrument is critical to a process or is located in a hazardous area, allowing that sensor to drift over time could potentially result in a risk to employee safety. Similarly, an end product manufactured by a plant with poorly calibrated instruments could present a risk to both consumers and customers. This is particularly true for the food and beverage sector and for pharmaceutical manufacturers.

Even the highest quality instruments will drift over time and lose their ability to provide accurate measurements. The stability of an instrument very much depends on its application and the environment it operates in. Fluctuating temperatures, harsh manufacturing conditions (dust and dirt) and elapsed time are all contributing factors here. It is therefore critical that all instruments are calibrated at appropriate intervals.

Weighing instruments also need to be calibrated regularly. Determining the correct mass of a product or material is particularly important for companies that supply steel, energy, power, paper and pulp, and oil and gas, as well as for aviation companies, harbors and retail outlets, who invoice customers based on the mass of what they supply (fiscal metering). Invoicing in these industries is often based on process measurements, and these companies need to prove not only that the mass is accurate but also that the equipment producing the readings was correctly calibrated. Product manufacturing also depends on accurate masses, and so laboratories and production departments in the food and beverage, chemical and pharmaceutical industries also need to calibrate their weighing instruments. There is therefore a growing need to have the metrological quality of these weighing instruments confirmed by calibration.
In certain situations, this may even lead to a company losing its license to operate due to the company not meeting its regulatory requirements.

Why is calibration important?
Calibration ensures that instrument drift is minimized.

Calibration also ensures that product or batch quality remains high and consistent over time. This affects all process manufacturers. Even instruments manufactured by the same supplier can vary in their performance over time. Furthermore, instrument calibration can help to optimize a company's production process or to increase the plant's production capacity. For example, at the Almaraz Nuclear Power Plant in Spain, improving the measurement of reactor power parameters from 2% to 0.6% enabled the reactor power in each unit to be increased by 1.4%, which has a significant effect on annual production capacity.

Safety is another important reason to calibrate instruments. Production environments are potentially high-risk areas for employees and can involve high temperatures and high pressures. Incorrect measurements in a hazardous area could lead to serious consequences.

Armando Rivero Rubalcaba is head of Instrumentation at beer producer Heineken (Spain). He comments: "For Heineken, the quality of the beer is a number one priority. We must therefore ensure that all processes correspond to the planned characteristics. The role of calibration is very important to ensure the quality and safety of the processes." All the plants in Spain have received ISO 9001 and ISO 14001 certifications, in addition to the BRC certificate of food safety.

Quality systems such as ISO 9001, ISO 9002 and ISO 14001 require systematic, well-documented calibrations with respect to accuracy, repeatability, uncertainty and confidence levels. Typically, a calibration interval and error limits should be defined for each instrument, and standards should be traceable to national and international standards. Standards must also be more accurate than the required accuracy of the equipment being calibrated.

Pharmaceutical manufacturers must follow current Good Manufacturing Practice (GMP), which requires that calibration records are maintained and that calibrations are carried out in accordance with written, approved procedures, with all electronic systems complying with FDA regulation 21 CFR Part 11. All product, process and safety instruments should also be physically tagged; typically, each instrument has a master history record and a unique ID. There must be documented evidence that employees involved in the calibration process have been properly trained and are competent. The company must also have a documented change management system in place.

Proper invoicing is critical to companies in the power generation, energy and utilities industries, where accurate measurements ensure proper billing. This is particularly true if sales invoicing is based on accurate process measurements, for example weighing scales or gas conversion devices. As Jacek Midera, measurement specialist at Mazovian Gas Company, states: "Most importantly, gas conversion devices must be extremely accurate in measuring delivered gas. Customers want to pay for the exact amount of gas they've received. This means that requirements for the calibrators are especially high."

Heikki Karhe is a measurement technician at the tyre manufacturer Nokian Tyres. As he puts it: "Calibration is of great importance, especially from the viewpoint of production safety and quality of the final product. Preparation of the right rubber mixture is precision work, and a sample is taken from each rubber mixture to ensure quality. Measuring instruments that yield wrong values could easily ruin the final product. The factory is also full of pressure instruments, and so it is also important for the safety of the workers that those instruments show the right values."

Today, controlling emissions is another critical factor for many process manufacturers. The latest Government regulations relating to carbon emissions may also require that companies calibrate specific instruments on a regular basis, including sensors used for measuring CO2 and NOX emissions. Calibrating instruments can also help to make combustion more efficient in industrial ovens and furnaces.

Neglecting to calibrate process instruments can also affect a company's bottom-line profits, particularly in the oil and gas, petrochemicals and chemicals sectors. The impact of even a small measurement error can be tremendous in terms of lost revenue. Indeed, according to recent research by Nielsen Research/ATS Studies, poor quality calibration is on average costing manufacturers more than 1.7 million US dollars every year. When only large companies with revenues of more than 1 billion US dollars are considered, this figure rises dramatically to more than 4 million US dollars per year. Worse still, manufacturers of food and beverage or pharmaceutical products could put their customers' lives at risk by neglecting to calibrate their process instruments.

As Ed de Jong, Instrument Maintenance Engineer at Shell (Netherlands), explains: "Until recently, calibration was mainly driven by economic motives: even the smallest of errors in delivery quantities are unacceptable in Shell's operation due to the vast sums of money involved for both customers and governments [fiscal metering]. Nowadays, calibration has an important role especially for the license to operate."

Common misconceptions
There are some common misconceptions when it comes to instrument calibration. For example, some manufacturers claim that they do not need to calibrate their fieldbus instruments because they are digital and so are always accurate and correct. This is simply not true. Although fieldbus transmitters have been improved in terms of their measurement accuracy when compared to analogue transmitters, this does not eliminate the need for calibration. The main difference between fieldbus and conventional transmitters is that the output signal is a fully digital fieldbus signal, and changing the output signal does not change the need for periodic calibration.

Another common misunderstanding is that new instruments do not require calibration. Again, this is not true. Just because a sensor is newly installed does not mean that it will perform within the required specifications. By calibrating an instrument before installation, a company is able to enter all the necessary instrument data into its calibration database or calibration management software, as well as begin to monitor the stability or drift of the instrument over time.

When to calibrate
Due to drift, all instruments require calibrating at set intervals. How often they are calibrated depends on a number of factors. First, the manufacturer of the instrument will provide a recommended calibration interval. This interval may be decreased if the instrument is being used in a critical process or application. Government regulations demand that specific instruments must be calibrated, for example instruments related to CO2 and NOX emissions. Quality standards may also dictate how often a pressure or temperature sensor needs calibrating. The most effective method of determining when an instrument requires calibrating is to use some sort of history trend analysis. In this way, highly stable sensors are not calibrated as often as those sensors that are more susceptible to drift. The optimal calibration interval for different instruments can only be determined with software-based history trend analysis.
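History trend analysis can be sketched as fitting the observed "As Found" drift over successive calibrations and projecting when the instrument would exceed its error limit. The data below is invented, and real history-trend software does considerably more, but the core idea is a simple trend fit:

```python
# Estimate drift rate from calibration history and project when the
# error limit would be reached (illustrative data only).

# (months since installation, worst As Found error in % of span)
history = [(0, 0.02), (6, 0.05), (12, 0.09), (18, 0.12)]
ERROR_LIMIT = 0.25  # % of span

# Least-squares slope: drift in % of span per month.
n = len(history)
mx = sum(t for t, _ in history) / n
my = sum(e for _, e in history) / n
slope = (sum((t - mx) * (e - my) for t, e in history)
         / sum((t - mx) ** 2 for t, _ in history))

last_t, last_e = history[-1]
months_left = (ERROR_LIMIT - last_e) / slope
print(f"drift = {slope:.4f} %/month, limit reached in ~{months_left:.0f} months")
```

With this (invented) drift rate the instrument would not reach its limit for roughly two more years, suggesting its interval could safely be extended; a faster-drifting sensor would instead shorten the interval.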


Why use software for calibration management?

Every manufacturing plant has some sort of system in place for managing instrument calibration operations and data. However, different companies from a diverse range of industry sectors use very different methods of managing these calibrations. These methods differ greatly in terms of cost, efficiency, quality, accuracy of data and their level of automation. Plant instrumentation devices – such as temperature sensors, pressure transducers and weighing instruments – require regular calibration to ensure they are performing and measuring to specified tolerances. Calibration software is one such tool that can be used to support and guide calibration management activities.

But in order to understand how software can help process plants better manage their instrument calibrations, it is important to consider the typical calibration management tasks that companies have to undertake. There are five main areas here, comprising planning and decision-making, organisation, execution, documentation, and analysis.

Careful planning and decision-making is important, with documentation being a critical part of this. First, the company must identify the current calibration status for every instrument across the plant. All plant instruments and measurement devices need to be listed, then classified into 'critical' and 'non-critical' devices. Next, the calibration range and required tolerances need to be identified, and decisions then need to be made regarding the calibration interval for each instrument. Once this has been agreed, the creation and approval of standard operating procedures (SOPs) for each device is required, followed by the selection of suitable calibration methods and tools for execution of these methods.
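Once an interval has been agreed per instrument, tracking what is due reduces to a date calculation. A minimal sketch with an invented instrument list:

```python
from datetime import date, timedelta

# Which instruments are due for calibration within the next 30 days?
# (tag, last calibration date, interval in days) - invented data.
instruments = [
    ("PT-101", date(2013, 1, 15), 180),
    ("TT-205", date(2013, 6, 1), 365),
    ("FT-017", date(2012, 11, 20), 180),
]

def due_soon(instruments, today, horizon_days=30):
    """Instruments whose next calibration falls before the horizon."""
    horizon = today + timedelta(days=horizon_days)
    due = []
    for tag, last_cal, interval in instruments:
        next_cal = last_cal + timedelta(days=interval)
        if next_cal <= horizon:
            due.append((tag, next_cal))
    return sorted(due, key=lambda item: item[1])

for tag, when in due_soon(instruments, today=date(2013, 7, 1)):
    print(f"{tag} due {when}")
```

Calibration management software performs exactly this kind of calculation continuously, raising alerts for the overdue and soon-due instruments instead of requiring anyone to leaf through archived sheets.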

The next stage, organisation, involves training the company's calibration staff – typically maintenance technicians, service engineers, process and quality engineers and managers – in using the chosen tools and how to follow the approved SOPs. The next calibration tasks then have to be scheduled, and resources have to be organised and assigned to actually carry out the scheduled calibration tasks.

The execution stage involves supervising the assigned calibration tasks, and documenting and archiving calibration data. Staff carrying out these activities must follow the appropriate instructions before calibrating the device, including any associated safety procedures. The calibration is then executed according to the plan, although further instructions may need to be followed after calibration. The documentation and storage of calibration results typically involves signing and approving all calibration records that are generated. Based on the calibration results, calibration labels need to be created and pasted, and the created documents copied and archived.

Finally, companies have to analyse the data to see if any corrective action needs to be taken. The effectiveness of calibration needs to be reviewed and calibration intervals checked. These intervals may need to be adjusted based on archived calibration history. If, for example, a sensor drifts out of its specification range, the consequences could be disastrous for the plant, resulting in costly production downtime, a safety problem or batches of inferior quality goods being produced, which may then have to be scrapped.

Documentation
Documentation is a very important part of a calibration management process. ISO 9001:2008 and the FDA both state that calibration records must be maintained and that calibration must be carried out according to written, approved procedures. This paperwork typically involves preparing calibration instructions to help field engineers, making notes of calibration results in the field, and documenting and archiving calibration data. This means an instrument engineer can spend as much as 50 per cent of his or her time on documentation and paperwork – time that could be better spent on other value-added activities. Imagine how long and difficult a task this is if the plant has thousands of instruments that require calibrating on at least a six-monthly basis. The amount of manual documentation increases almost exponentially!

Only a quarter of companies use calibration software
In Beamex's own Calibration Study carried out recently, just under a third of companies (with 500+ employees) typically have more than 5,000 instruments that require calibrating, and 42 per cent of companies perform more than 2,000 calibrations each year. However, different industry sectors have different requirements and regulations. In the highly regulated pharmaceuticals sector, a massive 75 per cent of companies carry out more than 2,000 calibrations every year. In the Power & Energy sector, 55 per cent of companies perform more than 2,000 calibrations per year, and the figure for Oil, Gas & Petrochemicals is similarly high. The percentage is still quite high in the food & beverage sector, where 21 per cent of firms said they calibrated their instruments more than 2,000 times every year.

This equates to a huge amount of paperwork for any process plant, and the figures outlined appear to suggest that companies really do require some sort of software tool to help them manage their instrument calibration processes and all associated documentation. However, the picture in reality can be very different: many of these firms said they were doing it without any sort of calibration software to assist them. A mere 25 per cent of companies with 500+ employees (across the industry sectors mentioned above) said that they did use specialist calibration management software, whilst others used a calibration module within an existing Computerised Maintenance Management System (CMMS). Many other companies said that they relied on generic spreadsheets and/or databases for this, and a significant proportion (almost 20 per cent) of those surveyed said they used a manual, paper-based system. Noting down calibration results by hand in the field and then transferring these results into a spreadsheet back at the office may seem archaic, but many firms still do this. Any type of paper-based calibration system will be prone to human error, and analysis of paper-based systems and spreadsheets can be almost impossible, let alone time consuming.

In a recent survey conducted by Control Magazine, 40 per cent of companies surveyed said that they calculated calibration intervals by using historical trend analysis – which is encouraging. The other 60 per cent of companies determined instrument calibration intervals based on either the manufacturer's own recommendation, or they used a uniform interval across the plant for all instruments. Neither method is ideal in practice.

Regardless of industry sector, there seem to be some general challenges that companies face when it comes to calibration management. The number of instruments and the total number of periodic calibrations that these devices require can be several thousand per year, which means that planning and scheduling – how to plan and keep track of each instrument's calibration procedures – is important. Just as important, every instrument calibration has to be documented and these documents need to be easily accessible for audit purposes.

Using software for calibration management enables faster, easier and more accurate analysis of calibration records and identifying historical trends. Rather than rely on the manufacturer's recommendation for calibration intervals, the plant may be able to extend these intervals by looking closely at historical trends provided by calibration management software. For example, by analysing the calibration history of a flow meter that is located in a 'non-critical' area of the plant, the company may be able to decrease the frequency of calibration, saving time and resources. Similarly, for improved safety, a process plant may find it necessary to increase the calibration frequency of some sensors that are located in a hazardous, potentially explosive area of the manufacturing plant. Instrument 'drift' can be monitored closely over a period of time and decisions then taken confidently with respect to amending the calibration interval. Plants can therefore reduce costs and optimise calibration intervals by reducing calibration frequency when this is possible, or by increasing the frequency where necessary. Companies could save so much time and reduce costs by using calibration management software to analyse historical trends and calibration results.

Paper-based systems
These systems typically involve hand-written documents. Typically, this might include engineers using pens and paper to record calibration results while out in the field. On returning to the office, these notes are then tidied up or transferred to another paper document, after which they are archived as paper documents. While using a manual, paper-based system requires little or no investment, it is very labour-intensive and means that historical trend analysis becomes very difficult to carry out.

In-house legacy systems (spreadsheets, databases, etc.)
Using an in-house legacy system to manage calibrations has its drawbacks. In these systems, calibration data is typically entered manually into a spreadsheet or database. The data is stored in electronic format, but the recording of calibration information is still time-consuming and typing errors are common. Although instrument data can be stored and managed efficiently in the plant's database, the calibration process itself cannot be automated. In addition, automatic alarms cannot be set up on instruments that are due for calibration. Although certainly a step in the right direction, the level of automation is still low: the system is time consuming, soaks up a lot of resources, and typing errors are commonplace. Dual effort and re-keying of calibration data are also significant costs here.

Calibration module of a CMMS
Many plants have already invested in a Computerised Maintenance Management (CMM) system and so continue to use this for calibration management. Plant hierarchy and works orders can be stored in the CMM system, but the calibration cannot be automated because the system is not able to communicate with 'smart' calibrators. CMM systems are not designed to manage calibrations and so often only provide the minimum calibration functionality, such as the scheduling of tasks and entry of calibration results. Also, the calibration data is not easily accessible. Furthermore, the CMM system may not meet the regulatory requirements (e.g. FDA) for managing calibration records.

Calibration software
With specialist calibration management software, users are provided with an easy-to-use Windows Explorer-like interface. The software manages and stores all instrument and calibration data. This includes the planning and scheduling of calibration work, analysis and optimisation of calibration frequency, production of reports, certificates and labels, and communication with smart calibrators.

Benefits of using calibration software
With software-based calibration management, planning and decision-making are improved. Procedures and calibration strategies can be planned, and all calibration assets are managed by the software. Position, device and calibrator databases are maintained, while automatic alerts for scheduled calibrations can be set up.

Organisation also improves. Calibration instructions are created using the software to guide engineers through the calibration process. These instructions can also be downloaded to a technician's handheld documenting calibrator while they are in the field.

Execution is more efficient and errors are eliminated. Using software-based calibration management systems in conjunction with documenting calibrators means that calibration results can be stored in the calibrator's memory, then automatically uploaded back to the calibration software. The system no longer requires pens and paper, and there is no re-keying of calibration results from a notebook to a database or spreadsheet. The result is a streamlined, automated calibration process, which improves quality, plant productivity and efficiency. Human error is minimised and engineers are freed up to perform more strategic analysis or other important activities.

Documentation is also improved. The software generates reports automatically, and all calibration data is stored in one database rather than multiple disparate systems. Calibration certificates, reports and labels can all be printed out on paper or sent in electronic format. Specialist calibration software also offers easy integration with CMM systems such as SAP and Maximo.

Analysis becomes easier too. Using software for calibration management enables faster, easier and more accurate analysis of calibration records and identification of historical trends, enabling engineers to optimise calibration intervals using the software's History Trend function.

Regulatory organisations and standards such as FDA and ISO place demanding requirements on the recording of calibration data. Calibration software has many functions that help in meeting these requirements, such as Change Management, Audit Trail and Electronic Signature functions. The Change Management feature in Beamex's CMX software, for example, complies with FDA requirements. Also, when a plant is being audited, calibration software can facilitate both the preparation and the audit itself. Locating records and verifying that the system works is effortless when compared to traditional calibration record keeping.

Business benefits
For the business, implementing software-based calibration management means overall costs will be reduced. These savings come from the now-paperless calibration process, with no manual documentation procedures, which is particularly beneficial if the company is replacing a lot of labour-intensive calibration activities. Plant efficiencies should also improve, as the entire calibration process is now streamlined and automated. Manual procedures are replaced with automated, validated processes, and costly production downtime will also be reduced. Engineers can analyse calibration results to see whether the calibration intervals on plant instruments can be altered. For example, those instruments that perform better than expected may well justify a reduction in their calibration frequency.

Even if a plant has already implemented a CMM system, calibration management software can be easily integrated with this system. If the plant instruments are already defined in a database, the calibration management software can utilise the records available in the CMM system database. The integration will save time, reduce costs and increase productivity by preventing unnecessary double effort and re-keying of works orders in multiple systems. Integration also enables the plant to automate its calibration management with smart calibrators, which simply is not possible with a standalone CMM system.

Benefits for all process plants
Beamex's suite of calibration management software can benefit all sizes of process plant. For relatively small plants, where calibration data is needed for only one location, only a few instruments require calibrating and regulatory compliance is minimal, Beamex CMX Light is the most appropriate software. For medium-to-large sized companies that have multiple users dealing with a large number of instruments and a lot of calibration work, as well as strict regulatory compliance, Beamex CMX Professional is ideal. Beamex's high-end solution, CMX Enterprise, is suitable for process manufacturers with multiple global sites, multilingual users and a very large number of instruments that require calibration. Here, a central calibration management database is often implemented that is used by multiple plants across the world.

CHECKLIST
Choosing the right calibration software
•  Is it easy to use?
•  What are the specific requirements in terms of functionality?
•  Are there any IT requirements or restrictions for choosing the software?
•  Does the calibration software need to be integrated with the plant's existing systems?
•  Is communication with smart calibrators a requirement?
•  Does the supplier offer training, implementation, support and upgrades?
•  Does the calibration software need to be scalable?
•  Can data be imported to the software from the plant's current systems?
•  Does the software offer regulatory compliance?
•  What are the supplier's references and experience as a software developer?

SUMMARY
Calibration software improves calibration management tasks in all these areas
•  Planning & decision-making
•  Organisation
•  Execution
•  Documentation
•  Analysis

Beamex users
Beamex recently conducted a survey of its customers. The results showed that 82% of CMX calibration software customers said that using Beamex products had resulted in cost savings in some part of their operations. 94% of CMX users stated that using Beamex products had improved the efficiency of their calibration processes, whilst 92% said that using CMX had improved the quality of their calibration system.

Summary
Every type of process plant, across all industry sectors, can benefit from implementing specialist calibration management software. Despite these benefits, only one quarter of companies who need to manage instrument calibrations actually use software designed for that purpose. Compared to traditional, paper-based systems, in-house built legacy calibration systems or calibration modules within CMM systems, using dedicated calibration management software results in improved quality, increased productivity and reduced costs of the entire calibration process.

The business benefits of using software for calibration management
•  Cost reduction
•  Quality improvements
•  Increase in efficiency



How often should instruments be calibrated
Plants can improve their efficiency and reduce costs by performing calibration history trend analysis.

Adjusting calibration intervals based on history trend analysis
Manufacturing plants need to be absolutely confident that their instrumentation products – temperature sensors, pressure transducers, flow meters and the like – are performing and measuring to specified tolerances. If sensors drift out of their specification range, the consequences can be disastrous for a plant, resulting in costly production downtime, safety issues or possibly batches of inferior quality goods being produced, which then have to be scrapped.

Most process manufacturing plants will have some sort of maintenance plan or schedule in place, which ensures that all instruments used across the site are calibrated at the appropriate times. However, with increasing demands and cost pressures being placed on manufacturers these days, the time and resources required to carry out these calibration checks are often scarce. This can sometimes lead to instruments being prioritised for calibration, with those deemed critical enough receiving the required regular checks, but with other sensors that are deemed less critical to production being calibrated less frequently or not at all.

Plants can improve their efficiencies and reduce costs by using calibration 'history trend analysis', a function available within Beamex® CMX calibration software. By doing this, a plant is able to define which instruments can be calibrated less frequently and which should be calibrated more frequently. Calibration history trend analysis is only possible with calibration software that provides this functionality.

Current practices in process plants
But in reality, how often do process plants actually calibrate their instruments, and how does a maintenance manager or engineer know how often to calibrate a particular sensor? In March 2010, Beamex conducted a survey that asked process manufacturing companies how many instruments in their plant required calibrating and the frequency with which these instruments had to be calibrated. The survey covered all industry sectors, including pharmaceuticals, food and beverage, oil and gas, chemicals, paper and pulp, power and energy, service and manufacturing.

The survey showed that, across all industry sectors, 56% of the respondents calibrated their instruments no more than once a year. Perhaps unsurprisingly, given that it is a highly regulated industry, in the pharmaceuticals sector 59% said they calibrated once a year and 30% said they calibrated twice a year. Interestingly, the study also showed that the pharmaceuticals sector typically possesses a significantly higher number of instruments per plant that require calibrating. In addition, these plants also calibrate their instruments more frequently than other industry sectors. This type of practice is common in companies that employ an effective 'Preventive Maintenance' regime, ensuring that instruments are checked and corrected before they drift out of tolerance.

But plants can improve their efficiencies and reduce costs by using calibration 'history trend analysis', a function available within Beamex® CMX calibration software. With this function, the plant can analyze whether it should increase or decrease the calibration frequency for all its instruments. The analysis of historical trends – how a pressure sensor, for example, drifts in and out of tolerance over a given time period – is only possible with calibration software that provides this type of functionality. Sensors that are found to be highly stable do not need to be re-calibrated as often as sensors that tend to drift.

Cost savings can be achieved in several ways: first, by calibrating less frequently where instruments appear to be highly stable according to their calibration history; and second, by calibrating instruments more often when they are located in critical areas of the plant.

The benefits of analyzing calibration history trends
Regardless of the industry sector, by analysing an instrument's drift over time (i.e. the historical trend), companies can reduce costs and improve their efficiencies, without of course sacrificing the quality of the product or process or the safety of the plant and its employees.

Pertti Mäki is Area Sales Manager at Beamex. He specialises in selling the Beamex® CMX to different customers across all industry sectors. He comments: "The largest savings from using the History Trend Option are in the pharmaceuticals sector, but all industry sectors can benefit from using the software tool. The function enables users to plan the optimal calibration intervals for their instruments."

The trick, says Mäki, is determining which sensors should be recalibrated after a few days, weeks, or even years of operation and which can be left for longer periods. Sensors that are found to be highly stable do not need to be re-calibrated as often as sensors that tend to drift.

Calibration software such as CMX can also help with the planning of calibration operations. Calibration schedules take into account the accuracy required for a particular sensor and the length of time during which it has previously been able to maintain that degree of accuracy. Doing this, he says, enables maintenance staff to concentrate their efforts only where they are needed, therefore eliminating unnecessary calibration effort and time.

But that's not all. There are other, perhaps less obvious benefits of looking at the historical drift over time of a particular sensor or set of measuring instruments. With CMX's History Trend Option, the engineer can now verify that the sensor he or she has purchased actually performed within the specified tolerance over a certain time period. As Mäki explains: "When an engineer buys a particular sensor, the supplier provides a technical specification that includes details on what the maximum drift of that sensor should be over a given time period." If it hasn't performed to specification, the engineer now has data to present to the supplier to support his findings.

The History Trend function also means that a plant can now compare the quality or performance of different sensors from multiple manufacturers in a given location or set of process conditions. This makes it an invaluable tool for maintenance or quality personnel who, in setting up a new process line for example, can use the functionality to compare different sensor types to see which one best suits the new process.

The 'History Trend' window enables users to view key figures from several calibration events simultaneously, allowing maintenance personnel to evaluate the calibrations of a position or a device over a longer time period compared to the normal calibration result view. For example, the user can get an overview of how a particular device drifts between calibrations and whether the drift increases with time. The engineer can also analyze how different devices are suited for use in a particular area of the plant or process. Based on this information, it is then possible to draw conclusions and make decisions regarding the optimal calibration interval and the quality of the instruments with respect to measurement performance.

History Trend displays the instrument's drift over a given period both numerically and graphically. Reporting is straightforward, and the user can even tailor the reports to suit his or her individual needs using the 'Report Design' tool option.
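The kind of drift analysis described above can be illustrated with a rough sketch. This is not CMX's actual algorithm, and the calibration history used here is invented example data; the sketch simply shows the idea of comparing as-found errors across calibration events to estimate a drift rate and the remaining tolerance headroom.

```python
from datetime import date

# Hypothetical calibration history for one transmitter:
# (calibration date, as-found maximum error in % of span).
history = [
    (date(2019, 3, 1), 0.12),
    (date(2020, 3, 1), 0.21),
    (date(2021, 3, 1), 0.33),
    (date(2022, 3, 1), 0.46),
]
TOLERANCE = 1.0  # acceptance limit, % of span (assumed value)

def drift_per_year(events):
    """Average change in as-found error per year, first to last event."""
    (d0, e0), (dn, en) = events[0], events[-1]
    years = (dn - d0).days / 365.25
    return (en - e0) / years

rate = drift_per_year(history)                 # ~0.11 %/year here
headroom = TOLERANCE - history[-1][1]          # unused part of the tolerance
years_to_limit = headroom / rate if rate > 0 else float("inf")
print(f"drift: {rate:.2f} %/year, approx. {years_to_limit:.1f} years to limit")
```

An instrument with years of headroom left might be a candidate for a longer interval; one whose projected drift reaches the tolerance before the next due date needs a shorter one.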

CALIBRATION HISTORY TREND ANALYSIS
Calibration history trend analysis allows you to analyze the instrument's drift over a certain time period.
•  The Beamex® CMX stores every calibration event in the database. Once implemented, the history trend is produced automatically without any extra manual work.
•  The graphical display of the history trend helps in visualizing and optimizing the calibration interval for the instruments.
•  The Beamex® CMX also indicates when new devices have been installed and calibrated. This helps in comparing differences between devices.

HISTORY TREND REPORT

HISTORY TREND USER-INTERFACE
The graphical display of the history trend helps in visualizing and optimizing the calibration interval for the instruments.

SUMMARY
The benefits of calibration history trend analysis:
•  Analyzing and determining the optimal calibration interval for instruments
•  Conclusions can be made regarding the quality of a particular measuring instrument
•  Time savings: faster analysis is possible when compared to traditional, manual methods
•  Enables engineers to check that the instruments they have purchased for the plant are performing to their technical specifications and are not drifting out of tolerance regularly
•  Supplier evaluation: the performance and quality of different sensors from different manufacturers can be compared quickly and easily

When calibration frequency can be decreased:
•  If the instrument has performed to specification and the drift has been insignificant compared to its specified tolerance
•  If the instrument is deemed to be non-critical or in a low priority location

When calibration frequency should be increased:
•  If the sensor has drifted outside of its specified tolerances during a given time period
•  If the sensor is located in a critical process or area of the plant and has drifted significantly compared to its specified tolerance over a given time period
•  When measuring a sensor that is located in an area of the plant that has high economic importance for the plant
•  Where costly production downtime may occur as a result of a 'faulty' sensor
•  Where a false measurement from a sensor could lead to inferior quality batches or a safety issue
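The decision rules above can be sketched as a small function. The shape of the logic follows the lists above, but the 25% drift threshold is an illustrative assumption for demonstration only, not a value taken from CMX or from any standard; each plant must define its own criteria.

```python
def recommend_interval_change(drift_ratio, critical):
    """Suggest a calibration-interval change from simple drift rules.

    drift_ratio: latest observed drift divided by the specified tolerance
                 (1.0 means the instrument reached its tolerance limit).
    critical:    True for instruments in critical processes or areas.
    The 0.25 threshold below is illustrative, not from any standard.
    """
    if drift_ratio >= 1.0:
        return "increase frequency"   # drifted out of tolerance
    if critical and drift_ratio >= 0.25:
        return "increase frequency"   # significant drift in a critical spot
    if not critical and drift_ratio < 0.25:
        return "decrease frequency"   # stable and non-critical
    return "keep current interval"

print(recommend_interval_change(0.10, critical=False))  # decrease frequency
print(recommend_interval_change(0.40, critical=True))   # increase frequency
```

Note that a stable instrument in a critical location still keeps its current interval rather than having it extended, matching the cautious tone of the lists above.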

ISO 9001:2008 quality management requirements
7.6 Control of monitoring and measuring devices
The organization shall determine the monitoring and measurement to be undertaken and the monitoring and measuring devices needed to provide evidence of conformity of product to determined requirements. The organization shall establish processes to ensure that monitoring and measurement can be carried out and are carried out in a manner that is consistent with the monitoring and measurement requirements.

Where necessary to ensure valid results, measuring equipment shall
a)  be calibrated or verified at specified intervals, or prior to use, against measurement standards traceable to international or national measurement standards; where no such standards exist, the basis used for calibration or verification shall be recorded;
b)  be adjusted or re-adjusted as necessary;
c)  be identified to enable the calibration status to be determined;
d)  be safeguarded from adjustments that would invalidate the measurement result;
e)  be protected from damage and deterioration during handling, maintenance and storage.

In addition, the organization shall assess and record the validity of the previous measuring results when the equipment is found not to conform to requirements. The organization shall take appropriate action on the equipment and any product affected. Records of the results of calibration and verification shall be maintained (see 4.2.4).

When used in the monitoring and measurement of specified requirements, the ability of computer software to satisfy the intended application shall be confirmed. This shall be undertaken prior to initial use and reconfirmed as necessary.



How often should calibrators be calibrated

As a general rule for Beamex's documenting MC calibrators, starting with a 1-year calibration period is recommended, because the calibrators have a 1-year uncertainty specification. The calibration period can be changed in the future, once you begin to accumulate stability history, which is then compared to the uncertainty requirements. In any case, there are many issues to be considered when deciding a calibrator's calibration period, or the calibration period for any type of measuring device. This article discusses some of the things to consider when determining the calibration period, and provides some general guidelines for making this decision. The guidelines that apply to a calibrator also apply to other measuring equipment in the traceability chain. These guidelines can also be used for process instrumentation.

An important aspect to consider when maintaining a traceable calibration system is determining how often the calibration equipment should be recalibrated. International standards (such as ISO 9000, ISO 10012, ISO 17025, CFRs by the FDA, GMP, etc.) require the use of documented calibration programs. This means that measuring equipment should be calibrated traceably at appropriate intervals and that the basis for the calibration intervals should be evaluated and documented. When determining an appropriate calibration period for any measuring equipment, there are several things to be considered. They are discussed below.




Uncertainty need
One of the first things to evaluate is the uncertainty need of the customer for their particular measurement device. Indeed, the initial selection of the measurement device should also be made based on this evaluation. Uncertainty need is one of the most important things to consider when determining the calibration period.

Stability history


When the customer has evaluated his/her needs and purchased suitable measuring equipment, (s)he should monitor the stability history of the measuring equipment. The stability history is an important criterion when deciding upon any changes in the calibration period. Comparing the stability history of measuring equipment to the specified limits and uncertainty needs provides a feasible tool for evaluating the calibration period. Naturally, calibration management software with a history analysis option is a great help in making this type of analysis.

The cost of recalibration vs. the consequences of an out-of-tolerance situation
Optimizing between recalibration costs and the consequences of an out-of-tolerance situation is important. In critical applications, the costs of an out-of-tolerance situation can be extremely high (e.g. pharmaceutical applications) and therefore calibrating the equipment more often is safer. However, in some non-critical applications, where the out-of-tolerance consequences are not serious, calibration can be performed less frequently. Therefore, evaluating the consequences of an out-of-tolerance situation is something to be considered, and the corrective actions for such a case should also be written into an operating procedure. Some measurements in a factory typically have more effect on product quality than others; these measurements are more critical and should therefore be calibrated more often.

Initial calibration period
When you purchase calibration equipment with which you are not familiar, you still need to decide the initial calibration period. In this

98

how often should calibrators be calibrated

situation, abiding by the manufacturer's recommendation is best. For more critical applications, using a shorter calibration period right from the beginning is recommended.

Other things to be considered
There are also other issues to be considered when determining the calibration period, such as the workload of the equipment, the conditions in which the equipment will be used, the amount of transportation it is subjected to, and whether the equipment looks damaged. In some cases, cross-checking with other similar measuring equipment is also a feasible way of detecting the need for calibration. Cross-checking may be carried out before every measurement in some critical applications. Naturally, only appropriate, metrologically responsible personnel in the company may make changes to the calibration equipment's calibration period.
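As an illustration of how accumulated stability history can be compared against the uncertainty requirement, the sketch below estimates a calibration period from an observed drift rate. The linear drift model, the factor-of-two safety margin and the bounds on how far the period may move are simplifying assumptions for demonstration, not a prescribed method, and any change would still need approval by the responsible personnel.

```python
def suggested_period_years(drift_per_year, uncertainty_need,
                           reference_uncertainty, current_period=1.0):
    """Estimate how long a device can go before accumulated drift
    threatens the measurement's uncertainty need.

    All arguments share one unit (e.g. % of reading). Assumes linear
    drift, keeps a factor-of-2 safety margin, and bounds the result to
    0.5x-4x the current period so adjustments stay gradual.
    """
    headroom = uncertainty_need - reference_uncertainty
    if headroom <= 0 or drift_per_year <= 0:
        return current_period          # no stability basis to extend
    raw = headroom / drift_per_year / 2
    return min(max(raw, current_period / 2), current_period * 4)

# Stable device: 0.02 %/yr drift, need 0.2 %, reference adds 0.05 %
print(f"{suggested_period_years(0.02, 0.2, 0.05):.2f}")  # 3.75
```

A very stable calibrator earns a longer period; a drifting one, or one whose reference calibration already consumes the uncertainty budget, stays at (or below) the current period.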


SUMMARY

The main issues to be considered when determining the calibration period for measuring equipment should include at least the following:
•  The uncertainty needs of the measurements to be made.
•  The stability history of the measuring equipment.
•  The equipment manufacturer's recommendations.
•  The risk and consequences of an out-of-tolerance situation.
•  The criticality of the measurements.


Paperless calibration improves quality and cuts costs
Paper is part of our everyday lives – whether in the workplace or at home. Take a minute to look around the room you are in and you'll notice how many objects are made from paper: books, magazines, printer paper, perhaps even a poster on the wall. Global consumption of paper has grown 400% in the last 40 years. Today, almost 4 billion trees, or 35% of the total trees cut down across the world, are used in paper industries on every continent (source: www.ecology.com).

Other than helping to save our planet and reducing the number of trees cut down each year, there are other, significant benefits in minimising the use of paper. As businesses, our consumption of paper is far higher than it needs to be, especially given that there are technologies, software and electronic devices readily available today which render the use of paper in the workplace unnecessary. So let's not add to this already heavy burden on our forests and the environment.

Take the calibration of plant instrumentation devices such as temperature sensors, weighing instruments and pressure transducers. For manufacturing companies, calibrating instruments is an enormous task that consumes vast amounts of paperwork. Far too many of these companies, amongst the process manufacturing industries, still use paper-based calibration systems, which means they are missing out on the benefits of moving towards a paperless calibration system.

Traditional paper-based calibration systems
Typically, a paper-based calibration system involves the use of handwritten documents. Whilst out in the field, a maintenance or service

engineer will typically use a pen and paper to record instrument calibration results. On returning to the office, these notes are then tidied up and/or transferred to another paper document, after which they are archived as paper documents. Paper systems are time consuming, they soak up lots of company resources and manual (typing) errors are commonplace. Dual effort and the re-keying of calibration data into multiple databases become significant costs to the business.

While using a manual, paper-based system requires little or no investment in new technology or IT systems, it is extremely labour-intensive and means that historical trend analysis of calibration results becomes very difficult. Furthermore, accessing calibration data quickly is not easy. These same companies that use paper-based calibration systems are together generating hundreds of thousands (millions?) of paper calibration certificates each year. However, by utilising the latest software-based calibration management systems from companies like Beamex, these organisations can significantly reduce their paper consumption, whilst also improving quality and workflow and making other significant cost savings for the business.

Practical benefits of using less paper
Aside from the financial benefits of moving towards a paperless calibration system, there are practical reasons why firms should go paperless. In industrial environments, it is not practicable to store or carry lots of paperwork. In addition, important paper records could potentially be lost or damaged in an accident or fire. After all, every square foot of the business has an associated cost. So why would these companies generate and store separate paper copies of important records such as works orders, blank calibration certificates and standard operating procedures (SOPs), when these records can all be combined into a single electronic record?

Improved workflow
With paper-based systems, paper records that need approval have to be routed to several individuals, which is time-consuming. With paperless systems, workflow improves dramatically. There will be less waiting time, as those individuals who need to sign off records or calibration

documents can share or access electronic records simultaneously from a central database. The cost and time associated with printing copies of paper documents is also eliminated, as well as the cost of filing and storing those paper records.

Data integrity
The integrity of paper-based calibration systems cannot be relied upon. Paper records may not always reflect the truth, particularly calibration results. Sometimes users may inappropriately modify the results data due to work pressures or lack of time/resources. Illegible handwritten notes are also a problem. In addition, manual errors such as misreadings can occur, particularly when using weighscales or other instruments that are open to an individual's own interpretation of the data, and especially if these paper records need to be typed or transcribed to a computer system or database. Transcription errors such as these can lead to all sorts of problems for a business and can take months to rectify or to identify the rogue data.

Business benefits
For those more enlightened companies that use software-based calibration systems, the business benefits are significant. Paperless calibration systems improve plant efficiencies because the entire calibration process is now streamlined and automated. The whole calibration process – from initial recording of calibration data through to historical trend analysis – will take less time and be faster and more reliable, whilst mistakes and manual errors will be virtually eliminated.

Just as important, electronic records enable easier analysis of data, and historical trending becomes easier. For example, those instruments that are performing better than expected may well justify a reduction in their calibration frequency. Calibration intervals can be optimised, which again has cost reduction benefits to the business. Costly production downtime due to unforeseen instrument failures will also be reduced.

When a plant is being audited, calibration software facilitates both the preparation and the audit itself. Locating records and verifying that the system works becomes effortless when compared to traditional paper-based record keeping. In turn, this means that operators, engineers and management will have more confidence in

the data. In addition, this greater confidence in calibration data leads to a better understanding and analysis of business performance and KPIs (particularly if the calibration software is integrated with other business IT systems such as a CMMS), leading to improved processes, increased efficiencies and reduced plant downtime.

How paperless should you go?
Of course, in reality, many companies are neither completely paperless nor reliant solely on paper-based systems – the process is sometimes a hybrid of the two. Companies can go even further and use electronic records for works orders, business management systems, data historians and control systems, resulting in completely paperless, end-to-end workflows. In other words, the calibration data is shared with other business IT systems electronically.

Commissioning
At plant commissioning times, electronic records simplify the handover of plant and equipment. Although handover by commissioning teams that use paper records is straightforward and of universal format, electronic records are easy to manipulate and can be re-used in different IT systems, without needing to collect all the plant data again. Electronic data also provides an excellent foundation for ongoing plant operation and maintenance, particularly when it comes to plant audits.

A key part of paperless calibration records is the capture of data at the point of work, often in difficult industrial environments that would make the use of portable office computers impractical, where the manual entry of calibration results into unintelligent calibration forms on portable industrial computers is prone to eye-to-hand data mis-reads and repetitive-strain-induced error. One way to overcome these error-prone data capture methods is to use portable documenting calibrators to measure what can be measured, and to provide intelligent, technician-friendly interfaces on industrialised PDA or tablet-based hardware when manual data entry cannot be avoided. The un-editable electronic data stored on high-performance multifunction calibrators can then be uploaded to calibration management software for safe storage and asset management.

Suitable hardware

Rather than rely on engineers in the field accurately keying calibration results into suitably robust laptops or PDAs, it is better to source the data electronically using documenting calibrators that are specifically designed for this task.

Validation, training & education

Paperless systems also need validating in the user's own environment. Education and training for users is critical, as this will help companies to overcome the natural resistance to change amongst a workforce that may be used to dealing with traditional, paper-based systems. Beamex provides comprehensive validation, education and training services for customers.

Case study

Beamex is helping many organisations to implement paperless calibration management systems, including pharmaceuticals, chemicals, power & energy, and oil, gas & petrochemicals companies. Amongst these customers is UK firm Croda Chemicals Europe. Based in East Yorkshire near Goole, the Croda plant uses pressurised vessels to purify lanolin for healthcare and beauty products. Each vessel needs to be certified at least once every two years in order to demonstrate that it is safe and structurally sound. This includes a functionality check on all of the pressure instrumentation, as well as the sensors that monitor the incoming chemical additives and the outgoing effluent. Senior Instrument Technician David Wright recalls what it was like to perform all of those calibration operations with paper and pencil during the company's regularly scheduled maintenance shutdowns: "It took us one week to perform the calibrations and a month to put together the necessary paperwork." Today, Croda uses the CMX calibration management software system from Beamex, which coordinates data collection tasks and archives the results. "It's faster, easier and more accurate than our old paper-based procedures," says Wright. "It's saving us around 80 man-hours per maintenance period and should pay for itself in less than three years."


Intelligent commissioning

Calibration plays a vital role in process plant commissioning and when installing new instruments. This article explains process instrument commissioning and the benefits of calibration during the commissioning phase.

What is process instrument commissioning?

Successful commissioning of process instrumentation is an essential requirement for ideal plant performance and must be considered within the context of the overall commissioning program. In general, commissioning activities are those associated with preparing or operating the plant, or any part of the plant, prior to the initial start-up, and they are frequently undertaken by the owner or a joint owner/contractor team. Plant commissioning involves activities such as checking to ensure plant construction is complete and complies with the documented design or acceptable (authorized and recorded) design changes. Energizing power systems, calibration of instrumentation, operational testing of plant equipment, testing of the control systems and verification of the operation of all interlocks and other safety systems are also typical commissioning tasks. These activities are usually described as 'cold commissioning'. Commissioning may also involve mock operations: commissioning activities conducted to allow operational testing of the equipment as well as operator training and familiarization. At the completion of commissioning, the plant will be fully ready for production operation.

A plant, or any defined part of a plant, is ready for commissioning when it has achieved mechanical completion. Pre-commissioning activities are those which have to be undertaken prior to operating equipment, such as adjustments and checks on machinery performed by the construction contractor, without which the installation cannot be said to be mechanically complete. Mechanical completion of a plant, or any part of a plant, occurs when it has been completed in accordance with the drawings and specifications and the pre-commissioning activities have been completed to the extent where the owner approves the plant and commissioning activities can begin.

Management, personnel and cost of commissioning

Since commissioning takes place toward the end of the project, there is a risk that the work may be under-resourced because the funds have been allotted to cover budget overruns. In fact, it may cost more, as the plant owner's commitments in terms of product marketing and operational costs are likely to be higher. An extra day taken for commissioning means the same to the plant owner as an extra day taken during design or construction. It is therefore essential to comprehend the scope and length of commissioning activities and include them in the initial project plan and budget allocations. Detailed planning of commissioning and plant handover is as essential an element of the overall project plan and schedule as any other grouping of activities. Each of the commissioning activities must be broken down into a number of manageable tasks, and a schedule needs to be established for each task, including benchmarks for monitoring purposes. Some companies employ specialized commissioning engineers. This can prove to be a worthwhile investment for large plants, because it allows for dedicated responsibility and focus in operations and significant improvements to schedules. Commissioning requires a team of people with a background in plant design, plant operation and plant maintenance; management, teamwork and training are all essential.

There are many reasons why instruments should be calibrated during the commissioning phase before start-up; among other benefits, adverse incidents at the start-up phase can be avoided. The cost of process instrument commissioning is typically affected by the following issues: learning about and familiarizing with the field device; physically installing the field device; connecting to and identifying the field device; configuring the required parameters; and testing the configuration and the interface to other systems. Basically, these steps must be repeated with every field device that will be installed at the plant.

The rate of commissioning is measurable (e.g. the number of loops or sequence of steps tested per day), thereby enabling progress to be reviewed regularly. Good planning, documentation, communications and coordination are essential. The commissioning team consists of a mixture of specialists, such as instrument and process engineers, and the size and composition of the team depends on the nature and scope of the system.

[Figure: Sequence of activities leading to commissioning and acceptance of a plant – Construction, Pre-commissioning, Mechanical completion, Commissioning, Trial operation, Initial start-up, Examine product specification, Examine production performance, Acceptance of plant.]

Calibration and the commissioning of field instrumentation

New process instrumentation is typically configured and calibrated by the manufacturer prior to installation. However, instruments are often recalibrated upon arrival at the site, especially if there has been obvious damage in transit or storage.

However, the fact that an instrument or transmitter is new does not automatically mean that it is within the required specifications. There are also many other reasons why instruments should be calibrated during the commissioning phase before start-up.

Assuring transmitter quality

First of all, calibrating a new instrument before installing or using it is a quality assurance task. A new, uninstalled instrument or transmitter does not necessarily have the correct, specified settings. By calibrating the instrument, you can check its settings, check its overall quality to see whether it is defective, and ensure that it has the correct, specified settings.

Reconfiguring a transmitter

When the initially planned specifications have been changed, it is possible that the originally planned settings are no longer valid and need to be changed. In that case, it is possible to reconfigure the transmitter, and calibration is a key element in the process of reconfiguring an uninstalled transmitter.

Monitoring the quality and stability of a transmitter

When calibration procedures are performed on an uninstalled instrument, the calibration also serves future purposes. By calibrating the transmitter before installation, and on a regular basis thereafter, it is possible to monitor the stability of the transmitter. Transmitters that are found to be highly stable need not be recalibrated as often as transmitters that tend to drift. The trick is determining which sensors should be recalibrated after a few hours, weeks, or years of operation, and which can be left as is for longer periods without sacrificing quality or safety.

Entering the necessary transmitter data into a calibration database

By calibrating an instrument before installation, it is possible to enter all the necessary instrument data into a calibration database, as well as to monitor the instrument's stability. The transmitter information is critical in defining the quality of the instrument and for planning its optimal calibration interval. The calibration database can be calibration software designed specifically for managing calibration assets and information, such as the Beamex® CMX Calibration Software.
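The interval decision described here (highly stable transmitters can earn a longer interval, drifting ones need a shorter one) can be sketched in a few lines. The thresholds, the adjustment factors and the function name below are illustrative assumptions only, not the actual logic of Beamex CMX or any other calibration software:

```python
# Hypothetical sketch: suggest a calibration interval based on drift history.
# The 25 % stability threshold and the 0.5x / 1.5x adjustments are examples.

def suggest_interval(found_errors_pct, tolerance_pct, current_interval_days):
    """found_errors_pct: as-found errors (% of span) from past calibrations."""
    worst = max(abs(e) for e in found_errors_pct)
    if worst > tolerance_pct:
        # Out of tolerance at least once: calibrate more often.
        return int(current_interval_days * 0.5)
    if worst < 0.25 * tolerance_pct:
        # Consistently well within tolerance: the interval can be extended.
        return int(current_interval_days * 1.5)
    return current_interval_days  # performing as expected; keep the interval

print(suggest_interval([0.05, -0.08, 0.06], tolerance_pct=0.5, current_interval_days=365))
print(suggest_interval([0.2, 0.6], tolerance_pct=0.5, current_interval_days=365))
```

In practice such a rule would be applied per instrument from the stored calibration history, which is exactly the data a calibration database accumulates.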

Beamex® CMX Calibration Software ensures that calibration procedures are carried out at the correct time and that calibration tasks do not get forgotten, overlooked or become overdue. Entering the instrument data into a calibration management system is therefore part of the calibration procedures performed on an instrument before it is installed and in use. CMX can be used for planning and scheduling calibrations and for managing and storing all calibration data, as well as for analyzing and optimizing the calibration interval, thereby eliminating unnecessary calibration work. This allows maintenance personnel to concentrate their efforts only where needed. Using CMX always gives a clear status of the transmitters: are they installed and ready for calibration, is anyone performing the calibration (check in/out function), and what is the instrument/position status (pass/fail)?

Integrated calibration solution by Beamex

The Beamex® Integrated Calibration Solution, consisting of calibration software and documenting calibration equipment, improves the quality, productivity and cost-effectiveness of a plant's entire calibration process through faster, smarter and more accurate management of all calibration assets and procedures. Having a fully integrated calibration management system – using documenting calibrators and calibration management software – is important. The Beamex® MC series documenting calibrators can be used for calibrating pressure, temperature, electrical and frequency signals. They are all-in-one calibrators, which means that they can be used to replace several individual measurement devices. The Beamex calibrators support various transmitter protocols, such as analog, HART, Foundation Fieldbus and Profibus, and intrinsically safe calibrators for potentially explosive environments are also available.

By using a documenting calibrator, the calibration results are stored automatically in the calibrator's memory during the calibration process. Engineers performing calibrations no longer have to write down any results on paper, making the entire process much quicker and reducing costs. The quality and accuracy of calibration results also improve, as there are fewer mistakes due to human error. The calibration results are transferred automatically from the calibrator's memory to the computer/database, which means that engineers do not spend their time transferring results from a notepad to final storage on a computer, saving time and money. Major time-savings can also be achieved by using the documenting MC calibrators' HART and/or fieldbus functionality to enter transmitter data into the calibrators' memory, from where the data can be populated to the CMX Calibration Software instead of being typed manually into the calibration database. All calibration documentation is therefore produced automatically when using the Beamex® Integrated Calibration Solution.

SUMMARY

Calibration is beneficial during process plant commissioning for various reasons:
•  Transmitter quality assurance
•  Reconfiguring a transmitter
•  Monitoring the quality and stability of a transmitter
•  Entering the necessary transmitter data into a calibration database and defining the optimal calibration interval



Successfully executing a system integration project

For process manufacturers today, having a reliable, seamlessly integrated set of IT systems across the plant, or across multiple sites, is critical to business efficiency, profitability and growth. Maintenance management has become an issue which deserves enterprise-wide, and perhaps multi-site, attention, especially if the company is part of an asset-intensive industry, including pharmaceuticals, chemicals, oil and gas, metal processing, paper and nuclear, where equipment and plant infrastructure is large, complex and expensive. Maintaining plant assets – whether that includes production line equipment, boilers, furnaces, special purpose machines, conveyor systems or hydraulic pumps – is equally critical for these companies. If stoppages to production lines due to equipment breakdowns are costly, implementing the latest computerized maintenance management system (CMMS) might save precious time and money.

In the process industries, a small but critical part of a company's asset management strategy should be the calibration of process instrumentation. Manufacturing plants need to be sure that their instrumentation products – temperature sensors, pressure transducers, flow meters and the like – are performing and measuring to specified tolerances. If sensors drift out of their specification range, the consequences can be disastrous, perhaps resulting in costly production downtime, safety issues or batches of inferior-quality goods being produced, which then have to be scrapped. Beamex's calibration management software, Beamex® CMX, has proved itself time and time again across many industry sectors.

Seamless communication

Today, most process manufacturers use some sort of computerized maintenance management system (CMMS) that sits alongside their calibration management system. Beamex® CMX Professional or Beamex® CMX Enterprise software can easily be integrated with CMM systems, whether it is a Maximo, SAP or Datastream CMM system or even a company's own in-house software for maintenance management. Integrating CMX with a CMM system means that the plant hierarchy and all work orders for process instruments can be generated and maintained in the customer's CMM system. Calibration work orders can easily be transferred to the CMX Calibration Software. Then, once the calibration work order has been executed, CMX sends an acknowledgement of this work back to the customer's CMM system. All detailed calibration results are stored and available in the CMX database.

As well as retrieving and storing calibration data, CMX stores the complete calibration history of process instruments and produces fully traceable calibration records, helping companies document, plan, schedule, analyze and optimize their calibration work. Seamless communication between CMX and 'smart' calibrators means that companies have the ability to automate predefined calibration procedures. CMX can also download detailed instructions for operation before and after calibrating, like procedures, reminders and safety-related information. Seamless communication with calibrators also provides many practical benefits, such as a reduction in paperwork, the elimination of human error associated with manual recording, and the ability to speed up the calibration task.

Integration project

A customer may have a large CMM system and a considerable amount of data keying to perform before integration is complete. A data exchange module or interface that sits between the two systems is required. The integration project involves three main parties: Beamex, the customer and the CMM system software partner.
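The work-order round trip (the CMMS generates the order, the calibration software executes it and sends an acknowledgement back) can be illustrated with a minimal sketch. The record fields, statuses and function names are hypothetical, invented for illustration, and are not the schema of an actual CMX/CMMS interface module:

```python
# Hypothetical sketch of a CMMS -> calibration software -> CMMS round trip.
# Field names and status values are illustrative, not a real interface schema.
from dataclasses import dataclass

@dataclass
class WorkOrder:
    order_id: str
    tag: str            # instrument position tag from the plant hierarchy
    status: str = "OPEN"

def execute_calibration(order: WorkOrder, passed: bool) -> dict:
    """Execute the order and build the acknowledgement sent back to the CMMS."""
    order.status = "DONE"
    return {"order_id": order.order_id, "tag": order.tag,
            "result": "PASS" if passed else "FAIL"}

ack = execute_calibration(WorkOrder("WO-1001", "TT-204"), passed=True)
print(ack)
```

The point of the sketch is the direction of data flow: work orders travel one way, acknowledgements and results travel back, and neither side re-keys the other's data.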

Project organization and resourcing

In order to have a successful integration, a project organization should be established that includes members from both the supplier's and the customer's organization, as a successful project requires input from both parties. The role of each member should be defined and project managers appointed; the project manager is usually responsible for the operative management of the project. In addition, a project steering group may need to be established. The project steering group is responsible for making key decisions during the project, and its role, tasks, authority and decision-making procedures must be defined. It is also essential that the main roles and responsibilities of the parties are specified before the project evolves, and it is important that the right people and decision-makers are involved and participate right from the beginning of the project.

Project phases

The integration project is divided into four main phases:
1. Scope of Work
2. Development and Implementation
3. Testing
4. Installation, Verification and Training

The four main phases are often divided further into sub-phases. A schedule is usually defined for the completion of the entire project as well as for the completion of each project phase. Each project phase should be approved according to the acceptance procedures defined in the offer, agreement, project plan or other document annexed to the offer/agreement.

Scope of work

To ensure successful integration with a satisfied customer, defining the correct scope of work (SOW) is crucial, as the resourcing, scheduling and costs of the project depend greatly on it. The SOW is often developed through pre-studies and workshops. The scope of work should include a brief project description, the services provided, the main roles, partner responsibilities and the desired outcome. The scope of work is important to make sure that both the supplier and the customer have understood the project in question and have similar expectations of it. Defining what is not included in the scope of work is just as important as defining what is included in it; establishing a framework and limitations for the project is therefore also very important.

If the scope of work is not defined carefully, questions or problems may appear later in the project, which will direct the project back to phase one, where a review of the scope is necessary. This is an urgent but time-consuming matter and can be avoided if the right people and decision-makers participate in the first project phase. However, as changes to the original scope of work may be necessary even in projects where the SOW phase has been done carefully, it is important that the supplier and the customer agree on change management procedures as early as the starting phase of the project.

Development and implementation

When the scope of work has been defined and approved by both parties, the integration can enter the next phase: the actual development and implementation of the project deliverables.

Testing

Testing occurs both during the project, after each partial delivery in order to be able to continue the development work to the next phase, and at the final stage of the project. The testing and approval procedures and their timelines should be defined when agreeing on the project.

Installation, verification and training

The final stage in the integration process is the installation and testing at the customer's facility and taking the system into production use. The project manager at the buyer's facility now plays a major role in the success of the integration process. The supplier will, if required and agreed, assist with informing, training and providing training materials. When the integration is finished, the customer has a system that saves time, reduces costs and increases productivity by preventing unnecessary double effort and the re-keying of procedures in separate systems.

INTEGRATION PROJECT PHASES
•  Scope of work (SOW): purpose/needs, target, supplier's responsibilities, customer's responsibilities, project management and project steering group (specifications documentation)
•  Development and implementation: change management (implementation documentation)
•  Testing: testing and acceptance procedures (testing documentation)
•  Installation, verification and training: final approval by customer (instructional documentation)
•  Follow-up and closure of integration project

Integrating a CMM system with calibration management software is an important step in the right direction when it comes to EAM, Enterprise Asset Management. EAM is more than just maintenance management software: it is about companies taking a business-wide view of all their plant equipment and coordinating maintenance activities and resources with other departments and sites. Savings from EAM are reasonably well-documented and come in various guises, the most common benefits being: fewer equipment breakdowns (leading to a reduction in overall plant downtime); a corresponding increase in asset utilization or plant uptime; more efficient use of maintenance staff; better management of spare parts and equipment stocks; and optimized scheduling of maintenance tasks and resources.

A CMMS integration will enable the customer company to automate its calibration management with smart calibrators. But the key to success is really the quality of the information you put into the software, particularly with production teams: the data has to be as close to 100% accurate as possible to get maximum benefit from the system. When there is no need to manually re-key the data, typing errors are eliminated, which improves the quality of the entire system.



Calibration in Industrial Applications


The benefits of using a documenting calibrator

For process manufacturers, regular calibration of instruments throughout a manufacturing plant is common practice. In plant areas where instrument accuracy is critical to ensure product quality, safety or custody transfer, calibration every six months – or even more frequently – is not unusual. The purpose of calibration itself is to determine how accurate an instrument or sensor is. Although most instruments are very accurate these days, regulatory bodies often need to know just how inaccurate a particular instrument is and whether it drifts in and out of a specified tolerance over time. Indeed, many process plants are under pressure to calibrate instruments quickly but accurately and to ensure that the results are then documented for quality assurance purposes and to provide full traceability. However, the key final step in any calibration process – documentation – is often neglected or overlooked because of a lack of resources, time constraints or the pressure of everyday activities.

What is a documenting calibrator?

A documenting calibrator is a handheld electronic communication device that is capable of calibrating many different process signals, such as pressure, temperature and electrical signals, including frequency and pulses, and then automatically documenting the calibration results by transferring them to fully integrated calibration management software. Some calibrators can read the HART, Foundation Fieldbus or Profibus output of transmitters and can even be used for configuring 'smart' sensors.

Heikki Laurila, Product Manager at Beamex in Finland, comments: "I would define a documenting calibrator as a device that has the dual functionality of being able to save and store calibration results in its memory, but which also integrates with and automatically transfers this information to some sort of calibration management software." A non-documenting calibrator, by contrast, is a device that does not store data, or that stores calibration data from instruments but is not integrated with a calibration management system; its calibration results have to be keyed manually into a separate database, spreadsheet or paper filing system.

Why use a documenting calibrator?

With a documenting calibrator, the calibration results are stored automatically in the calibrator's memory during the calibration process. The engineer does not have to write any results down on paper, which makes the entire process much faster and consequently reduces costs. The quality and accuracy of the calibration results will also improve, as there will be fewer mistakes due to human error. The calibration results are then automatically transferred from the calibrator's memory to the computer/database, saving time and money: the engineer does not have to spend time transferring the results from his notepad to final storage on a computer. Engineers out in the field performing instrument calibrations also receive instant pass or fail messages with a documenting calibrator. Having an easy-to-use documenting calibrator is definitely the way forward, especially if calibration is one of many tasks that the user has to carry out in his daily maintenance routine.

With instrument calibration, the calibration procedure itself is critical, and performing it in the same way each time is important for the consistency of results. With a multifunction documenting calibrator, the tolerances and limits for a sensor, as well as detailed instructions on how to calibrate the transmitter, are entered once into the calibration management software and then downloaded to the calibrator. The calibration procedure can thus be automatically transferred from the computer to the handheld calibrator before going out into the field. This means calibrations are carried out in the same way every time, because the calibrator tells the engineer which test point he needs to measure next.

Multifunction documenting calibrators such as the Beamex® MC5 or MC6 can also be used to calibrate, configure and trim HART, Foundation Fieldbus H1 or Profibus PA transmitters. Laurila continues: "With a documenting calibrator, such as the MC5 or the MC6, the user can download calibration instructions for hundreds of different instruments into the device's memory before going out into the field. The corresponding calibration results for these instruments can be saved in the device without the user having to return to his PC in the office to download/upload data. This means the user can work in the field for several days. Also, the user doesn't need to carry as much equipment while out in the field."

Benefits in practice

Conventional calibration work relies on manual, paper-based systems for documenting. Oftentimes, the field engineer calibrates the instrument, typically with five-point checks on each instrument, handwrites the results onto a paper form and then re-enters this information into a database when he returns to the office. Manual calibration takes more time and is more prone to error: unintentional errors often occur, and the whole process is time-consuming. When you have to calibrate instruments throughout a site, speed and accuracy are critical.

Using the Beamex® CMX Calibration Software and the documenting Beamex® MC6 or MC5 Multifunction Calibrators provides full control of the entire calibration process and reduces costs by up to 50%.* Why? Because the devices provide higher accuracy, the calibration process is much faster, and the system provides full traceability. The calibration software ensures that calibration procedures are carried out at the correct time and that calibration tasks are not forgotten, overlooked or overdue. Using the MC6 or MC5 with the CMX software means that the calibration instructions for an instrument and the calibration orders are downloaded to the calibrators, ready to guide the engineer in the field with the correct calibration procedures. Having a fully integrated calibration management system – using documenting calibrators and calibration management software – is important.

___________________
* Reported to the Industrial Instrumentation and Controls Technology Alliance and presented at the TAMU ISA Symposium, January 2004.
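A five-point check of the kind mentioned above can be expressed as a short sketch. The 4-20 mA output range, the 0/25/50/75/100 % test points and the 0.5 % of span tolerance are example values only, not a prescribed procedure:

```python
# Hypothetical five-point as-found check for a 4-20 mA transmitter output.
# Test points and the 0.5 % of span tolerance are example values.

def five_point_check(readings_ma, span_pct_tol=0.5, lo=4.0, hi=20.0):
    """readings_ma: measured outputs at 0/25/50/75/100 % of input span."""
    span = hi - lo
    results = []
    for point, measured in zip((0, 25, 50, 75, 100), readings_ma):
        ideal = lo + span * point / 100.0
        error_pct = (measured - ideal) / span * 100.0  # error as % of span
        results.append((point, round(error_pct, 3), abs(error_pct) <= span_pct_tol))
    return results

for point, err, ok in five_point_check([4.01, 8.02, 11.98, 16.05, 19.99]):
    print(f"{point:>3} %  error {err:+.3f} %  {'PASS' if ok else 'FAIL'}")
```

A documenting calibrator performs exactly this kind of point-by-point evaluation internally, which is how it can give the engineer an instant pass or fail verdict against the downloaded tolerance.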

After completing instrument calibrations, the system provides a full quality assurance report of all instruments calibrated along with a required calibration certificate. This not only ensures full traceability but also reflects full and traceable documentation of the completed work, saving time and money and simplifying the process.

SUMMARY
The benefits of using a documenting calibrator:
•  Improved accuracy, consistency and quality of calibration results.
•  Less paperwork and fewer manual errors.
•  Reduced costs from a faster and more efficient calibration process.
•  No manual printing or reading of calibration instructions is required.
•  The calibration procedure itself is guided by the calibrator, which uploads detailed instructions from the computer or calibration management software.
•  Calibration results are automatically stored in the calibrator’s on-board memory during the calibration procedure.
•  Calibration results are automatically transferred from the calibrator’s memory to a computer or fully integrated calibration management system.
•  A fully traceable calibration system for the entire plant.



Calibration of weighing instruments, Part 1

From the point of view of the owner, weighing instruments, usually called scales or balances, should provide the correct weighing results. How the weighing instrument is used and how reliable the weighing results are can be very different. In any case, it is the owner or the user of the instrument that carries the final responsibility of measurement capability and who is also responsible for the processes involved. He or she must select the weighing instrument and the maintenance procedure to be used to reach the required measurement capability. If a weighing instrument is used in a quality system, the user must define its measurement capability, at least at the stage when the weighing instrument is being introduced into use, and the user must have the knowledge to apply the information achieved through calibration. Calibration is a means for the user to obtain evidence of the quality of weighing results.

Calibration and legal verification
Weighing instruments may also possess special features. One of these features includes making measurements for which legal verification is required, for example when invoicing is based on the weight of a solid material. Weighing instruments used for legal purposes must have legal verification. From a regulatory point of view, the quality of a weighing instrument is already defined in OIML regulations, at least in Europe. The features may vary slightly from country to country, but in the EU they are the same. Verification and calibration abide by a different philosophy: calibration depicts the deviation between indication and reference (standard) including its tolerance, whereas verification depicts the maximum permissible errors of the indication.

The terminology and practices used previously for verifying measurement capability, for weighing instruments and for other measuring devices, are based on these practices of calibrating and verifying. The practical work for both methods is very similar and both methods can be used to confirm measurement capability, as long as legal verification is not needed. Comparing the indication of a weighing instrument with a set standard gives the deviation or error; in practice, this operation is known as calibration (or verification).

The most important aspect of a calibration program is that it allows the user to select the calibration method that corresponds to the required level of measuring tolerance. The method of calibration should be selected such that it provides sufficient information for evaluating the required measuring tolerance. The method should also be precise enough to achieve comparable results during all calibrations. One calibration provides information on a temporary basis, and a series of calibrations provides time-dependent information.

Confirmation is the collecting of information
Confirming the capability of a weighing instrument should happen by estimating the quality of the measuring device in the place where it will be used. In practice, this means investigating the efficiency of the weighing instrument. To be able to define the measuring tolerance, we need more information about the weighing instrument, such as repeatability, eccentric load, hysteresis, etc. We must remember that the quality of the evaluation of measuring tolerance depends on the information collected through calibration.

Using a calibration program which goes through the same steps for every calibration – calculating the deviation and measuring tolerance and, if necessary, producing a calibration certificate – is the best way to achieve reliable information to use in comparisons. This type of program is able to store the whole history of calibrated weighing instruments, and it displays the history of calibrations and in this way provides the user with comprehensive information concerning measuring capability. It is also handy for monitoring measuring systems.
This is a feasible practice for all weighing, even if it is a question of general (non-legal) weighing.

The purpose of calibration and complete confirmation
Calibration is a process where the user is able to confirm the correct function of the weighing instrument based on selected information. Calibration itself, however, is a short-term process; the idea is that the weighing instrument remains in good working condition until the next calibration. Therefore, the user must determine all of the external factors which may influence the proper functioning of the weighing instrument. The factors in question may include the effect of the environment where the weighing instrument is used and how often the instrument needs to be cleaned.

Today, the function of weighing instruments, as well as of many other instruments, is based on microprocessors. They offer several possibilities for adjusting parameters in measuring procedures. It is very important that the users of the weighing instruments, as well as calibration personnel, are familiar with these parameters and use them according to protocol. Since there are several parameters in use, it is important to always have the manual of the weighing instrument easily available to the user. Calibration should be carried out using settings based on the parameters for normal use.

The user must define the limits for permitted deviation from a true value and the required measuring tolerance. Regular monitoring of the zero point and of the indication with a constant mass helps to confirm proper operation; if the permitted values are exceeded, an adjustment or maintenance is necessary. Performing calibrations based on the measuring tolerance is better than doing routine measuring.

The content of the calibration certificate
Very often the calibration certificate is put on file as evidence of a performed calibration to await the auditing of the quality system. However, a quality system is usually concerned with the traceability of measurements and the known measuring tolerance of the measurements made. The calibration certificate of a single measuring device is used as a tool for evaluating the measuring tolerance of the process and for displaying the traceability of the device in question. The user must evaluate the measuring tolerance of the process and compare this value with the required measuring tolerance of the process.

SUMMARY
Calibration (or verification) is a fundamental tool for maintaining a measuring system. It also assists the user in obtaining the required quality of measurements in a process. The following must be taken into consideration:
•  the type of procedure to be applied in confirming measuring tolerance
•  the interpretation of information while abiding by the calibration certificate
•  changing procedures based on received information
Quality calibration methods and data handling systems offer state-of-the-art possibilities to any company.



Calibration of weighing instruments, Part 2

Weighing is a common form of measurement in commerce, industry and households. Weighing instruments are often highly accurate, but users, their customers and/or regulatory bodies often need to know just how inaccurate a particular scale may be. Originally, this information was obtained by classifying and verifying the equipment for type approval; subsequently, the equipment was tested or calibrated on a regular basis.

Typical calibration procedures
Calibrating scales involves several different procedures, depending on national and/or industry-specific guidelines or regulations. One clear and thorough guide is EA-10/18, Guidelines on the Calibration of Non-automatic Weighing Instruments, which was prepared by the European Co-operation for Accreditation and published by the European Collaboration in Measurement and Standards (EUROMET). Typical scale calibration involves weighing various standard weights in three separate tests:
•  repeatability test
•  eccentricity test
•  weighing test (test for errors of indication)
In the pharmaceutical industry in the United States, tests for determining minimum weighing capability are also performed.

Repeated weighing measurements provide different indications
Usually, the object being weighed is placed on the load receptor and the weighing result is read only once.

If you weigh the object repeatedly, you will notice slight, random variation in the indications. The repeatability test involves weighing an object several times to determine the repeatability of the scale used.

Center of gravity matters
The eccentricity test involves placing the object being weighed in the middle of the load receptor as accurately as possible. This is sometimes difficult due to the shape or construction of the object being weighed.

Test for errors in indication
The weighing test examines the error of the indication of the scale for several predefined loads. This enables you to correct the errors and to characterize non-linearity and hysteresis. If the scale’s maximum load limit is extremely large, it may be impractical to use standard weights for calibrating the entire range. In such a case, a suitable substitution mass is used instead. Substitution mass should also be used if the construction of the scale does not allow the use of standard weights.

A truck scale is unsuitable for weighing letters
The purpose of the minimum weight test is to determine the minimum weight which can be assuredly and accurately measured using the scale in question. This condition is met if the measurement error is less than 0.1% of the weight, with a probability of 99.73%.

Combined standard uncertainty of the error u(E)
Knowing the error of the scale indication at each point of calibration is not sufficient. You must also know how certain you can be about the error found at each point of calibration. There are several sources of uncertainty of the error, e.g.:
You can determine how much the eccentricity of the load will affect the indication of the scale by weighing the same weight at the corners of the load receptor.
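As an illustration of how the repeatability and eccentricity tests above can be evaluated, the sketch below computes repeatability as the sample standard deviation of repeated indications, and the eccentricity error as the largest corner deviation from the center indication. The indication values are hypothetical, and these evaluation formulas are one common choice, not the only one:

```python
import statistics

def repeatability(indications):
    """Repeatability as the sample standard deviation of repeated
    indications of the same load (all values in grams)."""
    return statistics.stdev(indications)

def eccentricity_error(center, corners):
    """Largest deviation of a corner indication from the center
    indication for the same test weight (in grams)."""
    return max(abs(c - center) for c in corners)

# Hypothetical indications from a 10 kg test weight, in grams:
repeated = [10000.2, 10000.4, 10000.1, 10000.3, 10000.2, 10000.4]
print(round(repeatability(repeated), 3))                # 0.121

corners = [10000.6, 10000.1, 9999.8, 10000.5]
print(round(eccentricity_error(10000.2, corners), 1))   # 0.4
```

Both figures then enter the uncertainty budget of the error of indication as standard-uncertainty components.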

•  The masses of the standard weights are only known with a certain uncertainty.
•  There are random variations in the indications, as can be seen in the repeatability test.
•  Air buoyancy around the weights varies according to barometric pressure, air temperature and humidity.
•  The weights are not in the exact middle of the load receptor.
•  Digital scale indications are rounded to the resolution in use.
•  Analogous scales have limited readability.
•  Air convection causes extra force on the load receptor.
•  A substitute load may have been used in calibrating the scale.

The values of uncertainty determined at each point of calibration are expressed as standard uncertainties (coverage probability 68.27%), which correspond to one standard deviation of a normally distributed variable. The combined standard uncertainty of the error at a certain point of calibration therefore has a coverage probability of 68.27% as well.

Example: The calibration error and its uncertainty at the calibration point of 10 kg may be expressed, e.g., as E = 2.5 g and u(E) = ±0.7 g, which means that the calculated error in the indication is 2.5 g and the actual error is between 1.8 g and 3.2 g, with a coverage probability of 68.27%.

  Uncertainty              Coverage probability   Actual error lies between
  ±u(E)          = 0.7 g   68.27 %                1.8 g and 3.2 g
  U(E) = 2 u(E)  = 1.4 g   95.45 %                1.1 g and 3.9 g
  U(E) = 3 u(E)  = 2.1 g   99.73 %                0.4 g and 4.6 g
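The combined standard uncertainty is obtained by combining the individual standard-uncertainty components with the root-sum-of-squares method, and the expanded uncertainty by multiplying with the coverage factor k. The sketch below reproduces the 10 kg example figures; the component names and magnitudes are hypothetical assumptions chosen so that they combine to u(E) ≈ 0.7 g, not values taken from the text:

```python
import math

# Hypothetical standard-uncertainty components at the 10 kg point (grams).
components = {
    "reference weights": 0.40,
    "repeatability": 0.45,
    "rounding of digital indication": 0.29,
    "air buoyancy": 0.20,
}

# Combined standard uncertainty: root-sum-of-squares of the components.
u = math.sqrt(sum(c**2 for c in components.values()))

E = 2.5  # measured error of indication at 10 kg, in grams
for k, p in [(1, 68.27), (2, 95.45), (3, 99.73)]:
    U = k * u
    print(f"k={k}: U(E) = ±{U:.1f} g, actual error in "
          f"[{E - U:.1f} g, {E + U:.1f} g] with {p} % coverage")
```

Running this prints the same three intervals as the table above (1.8 g to 3.2 g, 1.1 g to 3.9 g, 0.4 g to 4.6 g).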

Expanded uncertainty in calibration U(E)
In practice, a coverage probability of 68.27% is insufficient. Normally, it is extended to a level of 95.45% by multiplying the standard uncertainty by the coverage factor k = 2. Using the k = 2 coverage factor, the error and its expanded uncertainty at the point of calibration are E = 2.5 g and U(E) = ±1.4 g. This means that the calculated error of the indication is 2.5 g and the actual error is between 1.1 g and 3.9 g, with a coverage probability of 95.45%. In other words, if you repeat the calibration several times, the indication for an object of exactly 10 kilograms will be between 10.0011 kg and 10.0039 kg, 95.45% of the time. If the distribution of the indicated error cannot be considered normal, or the reliability of the standard uncertainty value is insufficient, then a larger value should be used for the k-factor.

Uncertainty of a weighing result
The purpose of calibration is to determine how accurate a weighing instrument is. Defining the uncertainty of weighing results is highly recommended, at least once, for all typical applications and always for critical applications. However, the uncertainty of the results of later routine weighings is usually larger than the uncertainty determined in calibration. Typical reasons for this are:
•  Routine weighing measurements involve random loads, while calibration is made at certain calibration points.
•  Routine weighing measurements are not repeated, whereas indications received through calibrations may be averages of repeated weighing measurements.
•  A load may be situated eccentrically in routine weighing.
•  A tare balancing device may be used in routine weighing.
•  The temperature, barometric pressure and relative humidity of the air may vary.
•  The adjustment of the weighing instrument may have changed.
•  Loading/unloading cycles in calibration and routine weighing may be different.
•  Finer resolution is often used in calibration.
Standard and expanded uncertainties of weighing results are calculated using technical data of the weighing instrument, its calibration results, knowledge of its typical behaviour and knowledge of the conditions of the location where the instrument is used.
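The relationship between the calibration error, its expanded uncertainty and the expected indication can be sketched as follows; the function name is invented for illustration, while the 10 kg figures follow the worked example in the text:

```python
def indication_interval(true_mass_g, error_g, expanded_uncertainty_g):
    """Interval in which the indication for a load of true_mass_g is
    expected to lie, at the coverage probability of the expanded
    uncertainty. Inputs in grams, result returned in kilograms."""
    low = true_mass_g + error_g - expanded_uncertainty_g
    high = true_mass_g + error_g + expanded_uncertainty_g
    return low / 1000.0, high / 1000.0

# Example from the text: E = 2.5 g, U(E) = 1.4 g (k = 2) at the 10 kg point.
low, high = indication_interval(10_000.0, 2.5, 1.4)
print(f"{low:.4f} kg to {high:.4f} kg")  # 10.0011 kg to 10.0039 kg
```

The interval holds with the same 95.45 % coverage probability as U(E).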

Calibrating and testing weighing instruments using CMX
CMX’s scale calibration enables you to uniquely configure calibration and testing for each weighing instrument. Correspondingly, copying configurations from one scale to another is easy. Error limits can be set according to OIML or Handbook 44, and wide variation in user-specific limits is also possible. CMX calculates combined standard uncertainty and expanded uncertainty for the calibration of the weighing instrument, and it allows you to enter additional, user-defined uncertainty components in addition to the supported uncertainty components. However, determining the uncertainty of weighing results is not part of calibration. Calculating the uncertainty of weighing results assists you in deciding whether or not the accuracy of the weighing instrument is sufficient and how often it should be calibrated. CMX’s versatile calibration certificate and the possibility to define a user-specific certificate assure that you can fulfill the requirements set for your calibration certificates.


Calibrating temperature instruments

The most commonly and most frequently measured variable in industry is temperature. Temperature greatly influences many physical properties of matter, and its influence on e.g. quality, energy consumption and environmental emissions is significant. Temperature, being a state of equilibrium, is different from other quantities: every temperature measurement is different, and for each measurement a model that includes all influencing factors must be created. Metrology provides mathematical formulas for calculating uncertainty. A temperature measurement also involves several time constants, and it is crucial to wait until thermal equilibrium is reached before measuring, which makes the temperature calibration process slow and expensive.

It is important to keep in mind an old saying: all meters, including sensors, show incorrectly; calibration will prove by how much. While standards determine the accuracy to which manufacturers must comply, they nevertheless do not determine the permanency of accuracy. Therefore, the user must be sure to verify the permanency of accuracy. If temperature is a significant measured variable from the point of view of the process, it is necessary to calibrate the instrument and the temperature sensor.

Temperature sensors
The most commonly used devices for measuring temperature in industry are temperature sensors. They either convert temperature into resistance (Resistance Temperature Detectors, RTD) or convert temperature into a low voltage (thermocouples, T/C). The corresponding polynomials are specified in the ITS-90 (International Temperature Scale of 1990) tables. RTDs are based on the fact that resistance changes with temperature.

Pt100 is a common RTD type made of platinum; its resistance at 0 °C (32 °F) is 100 Ω.

A thermocouple consists of two different metal wires connected together. If the connections (hot junction and cold junction) are at different temperatures, a small temperature-dependent voltage difference/current can be detected. This means that the thermocouple is not measuring the temperature itself, but the difference in temperature. The most common T/C type is the K-type (NiCr/NiAl). Despite their lower sensitivity (low Seebeck coefficient), the noble thermo-elements of S-, R- or B-type (PtRh/Pt, PtRh/PtRh) are used especially at high temperatures for better accuracy and stability.

Temperature transmitters
The signal from the temperature sensor cannot be transmitted over the long distances found in a plant. Therefore, temperature transmitters were developed to convert the sensor signal into a format that can be transmitted more easily. Most commonly, the transmitter converts the signal from the temperature sensor into a standard current signal ranging between 4 and 20 mA. Nowadays, transmitters with a digital output signal, such as fieldbus transmitters, are also being adopted. The sensor measures the temperature, while the transmitter converts the sensor signal; this conversion also has an impact on the total accuracy, and therefore the transmitter must be calibrated on a regular basis. A temperature transmitter can be calibrated using a temperature calibrator.

Calibrating temperature sensors
To calibrate a temperature sensor, it must be inserted into a known temperature. To make comparisons, we compare the sensor to be calibrated and the reference sensor; the most important criterion in the calibration of temperature sensors is how accurately the sensors are at the same temperature. Sensors are calibrated either by using temperature dry blocks (industrial field use) or liquid baths (laboratory use). The heat source may also have an internal temperature measurement that can be used as reference, but to achieve better accuracy and reliability, an external reference temperature sensor is recommended. The uncertainty of calibration is not the same as the accuracy of the device: many factors influence the total uncertainty, and the way the calibration is performed is not the least of them.
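As a sketch of the resistance-to-temperature relationship mentioned above, the snippet below uses the standard IEC 60751 (Callendar-Van Dusen) coefficients for a Pt100 above 0 °C. This is a minimal illustration only: below 0 °C an additional C-coefficient term is needed, and higher-accuracy work uses the ITS-90 reference functions with sensor-specific calibration coefficients.

```python
import math

# IEC 60751 Callendar-Van Dusen coefficients for a standard Pt100,
# valid for 0 °C <= t (below 0 °C an extra C-term is required).
R0 = 100.0        # resistance at 0 °C, in ohms
A = 3.9083e-3
B = -5.775e-7

def pt100_resistance(t_c):
    """Resistance of a Pt100 at temperature t_c (degrees C, t_c >= 0)."""
    return R0 * (1 + A * t_c + B * t_c**2)

def pt100_temperature(r_ohm):
    """Invert the quadratic to get temperature (degrees C) from resistance."""
    return (-A + math.sqrt(A**2 - 4 * B * (1 - r_ohm / R0))) / (2 * B)

print(f"{pt100_resistance(100.0):.4f} ohm")    # 138.5055 ohm
print(f"{pt100_temperature(138.5055):.2f} C")  # 100.00 C
```

Comparing such a nominal curve with the measured resistance of the sensor under test is, in essence, what the calibration deviation expresses.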

All heat sources show measurement errors due to their mechanical design and thermodynamic properties. These effects can be quantified to determine the heat source’s contribution to the measurement uncertainty. The major sources of measurement uncertainty are axial homogeneity, radial homogeneity, loading effect, stability and immersion depth. Guidelines for minimizing measurement uncertainty should be applied according to Euramet/cg-13/v.01 (former EA-10/13).

Axial homogeneity
Axial homogeneity is the temperature distribution in the measurement zone along the boring (axial temperature distribution).

Radial homogeneity
Radial homogeneity can be explained as the difference in temperature occurring between the borings.

Loading effect
When several sensors are placed in the borings of the heat source, they will affect accuracy. This phenomenon is called the loading effect.

Stability
Stability means variation of the temperature in the measurement zone over time, when the system has reached equilibrium. A thirty-minute period is commonly used.

Immersion depth
To achieve a more stable calibration, the immersion depth for a probe should be sufficient for the sensor being calibrated. Stem conduction, i.e. heat flux along the length of the thermometer stem, affects both the reference sensor and the unit being tested.

The calibration of instruments and sensors must be performed periodically. The ISO quality control system presupposes the quality control of calibration: the calibration of instruments affecting production, regular calibration of sensors, traceable calibration and calibration documentation. Calibration of instruments and sensors can be carried out either on site or in a laboratory. The level of performance a calibration device needs to have depends on the accuracy requirements determined by each company. However, the calibration device must always be more accurate than the instrument or sensor being calibrated.

The instrument to be calibrated is connected to a calibrator controlled by a computer, and the computer controls the calibration event. The instrument’s calibration information is saved in the calibrator, and the calibration results can then be uploaded from the documenting calibrators to the Beamex® CMX Calibration Software. History Trend reports, both in numeric and in graphic form, help the client to follow the condition of the instrument, which is useful in making decisions about purchasing new instruments, determining service in advance and recalibration. With the CMX Software, you can print out a calibration report as well as a traceable, accredited calibration certificate.

Integrated calibration solution – a smarter way to calibrate temperature
Beamex has introduced a smarter, more efficient and accurate solution for calibrating temperature. The Beamex® FB and MB dry blocks are part of the Beamex® Integrated Calibration Solution, and they communicate with the Beamex documenting multifunction calibrators, enabling fully automated temperature calibration and documentation.

“The temperature products and services we are now introducing form an integral part of the Beamex® Integrated Calibration Solution, a complete calibration solution that enables faster, more accurate and efficient management of all calibration assets and procedures,” says Raimo Ahola, CEO of Beamex Group.
It is a complete solution for temperature calibration with various products and services, such as a series of high-quality dry blocks for field and laboratory use, smart reference probes and temperature calibration laboratory services. The Beamex® Integrated Calibration Solution concept is the combination of calibrator, calibration software and PC for online calibration.

Mr Ahola adds: “Our integrated calibration solution concept saves valuable time, eliminates any errors related to manual entry and assures repeatable calibration procedures.”


Calculating total uncertainty of temperature calibration with a dry block

This article discusses the various uncertainty components related to temperature calibration using a temperature dry block. It will also present how to calculate the total uncertainty of a calibration performed with a dry block.

What is a temperature dry block?
A temperature dry block consists of a heatable and/or coolable metallic block, a controller, an internal control sensor and an optional readout for an external reference sensor. This article focuses on models that use interchangeable metallic multi-hole inserts. There are fast and lightweight dry blocks for industrial field use as well as models that deliver near bath-level stability in laboratory use.

Dry blocks are almost without exception meant to be used dry. Heat transfer fluids or pastes are sometimes used around or inside the insert, but they don’t necessarily improve performance; they may actually even impede the dry block’s performance and damage its internal components.

There are also some work safety issues that favor dry blocks in preference to liquid baths. For example, at temperatures above 200 °C liquids can produce undesirable fumes, or there may be fire safety issues. If a drop of water gets into hot silicone oil, it could even cause a small steam explosion which may splash hot oil on the user.

EURAMET
The EURAMET guideline (EURAMET/cg-13/v.01, July 2007, previously EA-10/13) defines a normative way to calibrate dry blocks.

As most of the manufacturers nowadays publish their product specifications including the main topics in the EURAMET guide, the products are easier to compare. Main topics in the EURAMET guideline include:
– Display accuracy
– Axial uniformity
– Radial uniformity
– Loading
– Stability over time
– Hysteresis
– Sufficient immersion (15 x diameter)
– Stem loss for 6 mm or greater probes
– Probe clearance (<= 0.5 mm at –80…660 °C; <= 1.0 mm at +660…1 300 °C)

Related uncertainty components
Uncertainty components that are related to temperature calibration are relevant to all manufacturers’ dry blocks. Some manufacturers specify these components and some do not. It is possible to use a dry block with the block’s internal measurement as the reference (true value), or to use an external reference temperature probe inserted in the block as a reference measurement.

Internal measurement as reference
With dry blocks, the internal control sensor is typically located inside the actual block, whereas the probes to be calibrated are immersed in the insert. There is always thermal resistance between the internal sensor and the probes inside the insert, and other sources of uncertainty need to be considered as well. It is important to remember that all thermometers based on thermal contact measure their own temperature. When using a dry block’s internal measurement as reference, the following uncertainty components should be taken into account:

•  Display accuracy (accuracy of the internal measurement)

[Figure: main parts of the dry block: stem conductance, sensor to be calibrated, reference sensor, internal sensor, axial uniformity, radial uniformity.]

•  Axial uniformity
Axial uniformity refers to the variation in temperature along the vertical length of the insert. The EURAMET calibration guide states that “dry wells should have a zone of sufficient temperature homogeneity of at least 40 mm in length” at the bottom of the insert. The purpose of this homogeneous measurement zone is to cover various sensor constructions: a thermocouple typically has its “hot junction” close to the tip of the probe, whereas a PRT sensing element may be 30 to 50 mm long. With this in mind, a homogeneous zone of at least 60 mm is recommended.

•  Radial uniformity
Radial uniformity refers to the variation in temperature between the holes of the insert. Related uncertainty is caused, for example, by the placement of the heaters, the thermal properties of materials and the alignment of the insert holes. Non-symmetrical loading or probes with significantly different thermal conductivity (for example, large-diameter probes) may cause additional temperature variation.

•  Loading effect
Every probe in the insert conducts heat either from or into the insert. The more the load, the more the ambient temperature will affect the measurements. The loading effect is not visible in the control sensor indication and the controller cannot completely compensate for this shift. Sufficient immersion depth and dual-zone control help to reduce load-related uncertainties.

•  Hysteresis
Hysteresis causes the internal sensor reading to be dependent on its previous exposure. This means that the temperature of the dry block may be a bit different depending on the direction from which the set point is approached. The hysteresis is greatest at the mid-point and is proportional to the temperature range.

•  Stability over time
Stability describes how well the temperature remains the same during a given time, when the system has reached equilibrium. The EURAMET calibration guide defines stability as the temperature variation over a 30-minute period.

•  Immersion
Sufficient immersion is important in any temperature measurement. The EURAMET calibration guide states that the immersion depth should be at least 15 x the probe’s outer diameter, plus the length of the sensing element. As probe constructions vary greatly (sheath material, wall thickness, lead-wire thermal conductivity, etc.), it is recommended, as a rule of thumb, to use an immersion depth of 20 x the diameter to minimize the stem conduction error. If the recommended immersion cannot be reached, the uncertainty caused by the insufficient immersion should be estimated or evaluated, or a test should be made for each individual probe type to be calibrated.

The specifications for the above uncertainty components should be found in the block’s specifications. If some component has not been specified, it should be estimated or evaluated.
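The immersion-depth rule above is simple arithmetic; a small helper makes it explicit. The probe dimensions used here are hypothetical:

```python
def min_immersion_mm(probe_diameter_mm, element_length_mm, factor=15):
    """Minimum immersion depth: factor x probe outer diameter plus the
    length of the sensing element (factor 15 per the EURAMET guide,
    20 as the rule of thumb for minimizing stem conduction)."""
    return factor * probe_diameter_mm + element_length_mm

# A 6 mm probe with a 30 mm sensing element:
print(min_immersion_mm(6.0, 30.0))             # EURAMET minimum: 120.0 mm
print(min_immersion_mm(6.0, 30.0, factor=20))  # rule of thumb:   150.0 mm
```

If the insert is shallower than this, the resulting stem-conduction error should be estimated and added to the uncertainty budget.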

Using an external reference sensor

Unlike the dry block's internal sensor, an external reference sensor is inside the insert together with the probes to be calibrated, i.e. within the same calibration volume. The internal sensor has to deal with quick temperature changes, vibration and possible mechanical shocks, so it has to be mechanically quite robust. Unfortunately, mechanical robustness is usually inversely proportional to good performance in terms of stability, hysteresis, etc. Using an external reference sensor therefore enables more accurate measurement of the temperature of the probes to be calibrated, and a smaller total uncertainty of the system; the internal sensor is used just to adjust the temperature close to the desired calibration point and to keep it stable. A separate reference sensor not only helps to minimize the calibration uncertainty but also provides reliability in the measurements.

When using an external reference sensor, the following uncertainty components should be taken into account:

• Axial uniformity
Axial uniformity-related uncertainty can be minimized by aligning the centers of the sensing elements. If the reference sensor and the sensor to be calibrated are sufficiently similar in diameter and thermal conductivity, the user can reduce the axial uniformity well below the specification. If the probe to be calibrated is short and will not reach the measurement zone at the bottom of the insert, the reference probe can be drawn out to match the immersion; in that case, the stem conductance has to be taken into account.

• Radial uniformity
Radial uniformity is still present when using an external reference probe and should be taken into account as specified.

• Loading effect
Since the internal sensor cannot completely compensate for the load-related temperature shift inside the insert, the external reference enables more accurate measurement of the temperature of the probes to be calibrated. The loading effect is usually much less significant with an external reference sensor and, depending on, for instance, the different loads, it may often be smaller than the specification, so the user may obtain good results.

• Stability over time
The external reference sensor can be used to measure the actual temperature deviation inside the insert, and it also helps the user to see when the unit has truly stabilized. Dry blocks usually have a stability indicator, but there may still be some difference between the block and the insert temperatures when the indicator shows that the unit has stabilized.

• External reference sensor
The external reference sensor (PRT) is typically much more capable of producing accurate measurements than the internal sensor. However, the external reference sensor needs a unit that measures it; this can be the dry block itself or an external readout device. Uncertainty related to the reference probe includes the probe's calibration uncertainty, drift, hysteresis, stem conduction, and the readout device's uncertainty.

Of course, using an external reference does not automatically mean better results: all of the previously mentioned uncertainty factors need to be carefully considered.

CALCULATION EXAMPLES

Here are two examples of total uncertainty calculations: one using the internal temperature measurement and the other using an external reference probe. In both cases the MB155R is used as the dry block, and the temperature in both examples is 0 °C. The various uncertainty components used in the examples can be found in the specifications in the product brochures. All specifications have a rectangular probability distribution, which is why they are divided by the square root of three to get the standard uncertainty. The standard uncertainties are combined as the root sum of the squares, and finally the combined uncertainty is multiplied by two (coverage factor k = 2) to get the expanded uncertainty.

MB155R with internal measurement @ 0 °C

Component           Specification (°C)   Standard Uncertainty (°C)
Display Accuracy    0.10                 0.058
Hysteresis          0.05                 0.029
Axial Uniformity    0.02                 0.012
Radial Uniformity   0.01                 0.006
Stability           0.01                 0.006
Loading Effect      0.02                 0.012
Combined Uncertainty: 0.067
Expanded Uncertainty: 0.135

Reference Sensor (Beamex RPRT-420)

Component                  Specification (°C)   Standard Uncertainty (°C)
Short-term repeatability   0.005                0.003
Drift                      0.02                 0.012
Hysteresis                 0.005                0.003
Calibration uncertainty    0.012                0.006
Combined Uncertainty: 0.014
Expanded Uncertainty: 0.028

MB155R with external measurement @ 0 °C (MB155R and RPRT-420)

Component                Specification (°C)   Standard Uncertainty (°C)
Axial Uniformity         0.01                 0.006
Radial Uniformity        0.01                 0.006
Stability                0.005                0.003
Loading Effect           0.005                0.003
Ref sensor measurement   0.014                0.014
Combined Uncertainty: 0.017
Expanded Uncertainty: 0.034

As can be seen in the examples, the total expanded uncertainty using the internal reference sensor is 135 mK (0.135 °C). When using an external reference sensor, the total expanded uncertainty is 34 mK (0.034 °C).
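The calculation procedure described above — dividing each rectangular specification by √3, combining the standard uncertainties as the root sum of squares, and multiplying by a coverage factor of two — can be sketched in a few lines of Python. The function name and the example component values below are illustrative, not manufacturer data:

```python
import math

def expanded_uncertainty(specs_rectangular, k=2.0):
    """Combine rectangular-distribution specifications (°C):
    divide each by sqrt(3) to get a standard uncertainty,
    combine as the root sum of squares, multiply by coverage factor k."""
    stds = [s / math.sqrt(3) for s in specs_rectangular]
    combined = math.sqrt(sum(u * u for u in stds))
    return k * combined

# Illustrative dry-block components at 0 °C: display accuracy, hysteresis,
# axial/radial uniformity, stability, loading effect.
specs = [0.10, 0.05, 0.02, 0.01, 0.01, 0.02]
print(round(expanded_uncertainty(specs), 3))  # 0.134
```

With these illustrative values the expanded uncertainty comes out at about 0.134 °C, i.e. roughly the 135 mK figure quoted for the internal-measurement example, the small difference being rounding of the intermediate standard uncertainties.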


Fieldbus transmitters must also be calibrated

Fieldbus is becoming more and more common in today's instrumentation. But what is fieldbus, and how does it differ from conventional instrumentation? Fieldbus transmitters must be calibrated as well – but how can it be done?

Conventional transmitters can deliver only one parameter at a time, one way. Each transmitter needs a dedicated pair of cables, and I/O subsystems are required to convert the analog mA signal to a digital format for a control system. Fieldbus transmitters, in contrast, are able to deliver a huge amount of information via the quick two-way bus. Several transmitters can be connected to the same pair of wires, and conventional I/O systems are no longer needed because segment controllers connect the instrument segments to the quicker, higher-level fieldbus backbone. Being an open standard, instruments from any manufacturer can be connected to the same fieldbus plug-and-play, making things much easier.

History of fieldbus

Back in the 1940s, instrumentation utilized mainly pneumatic signals to transfer information from transmitters. During the 1960s, the mA signal was introduced. In the 1970s, computerized control systems began to make their arrival, at first using proprietary protocols. The first digital, smart transmitter was introduced in the 1980s. The first fieldbus was introduced in 1988, and throughout the 1990s a number of various fieldbuses were developed as manufacturers battled to see whose fieldbus would be the one most commonly used. A standard was finally set in the year 2000, when the IEC 61158 standard was approved.

Foundation Fieldbus H1 and Profibus PA, both used in process instrumentation, were chosen as standards.

Fieldbus benefits for industry

Obviously, process plants would not start utilizing fieldbus if it did not offer them benefits compared to alternative systems. A large portion of new projects is currently being carried out using fieldbus, and for many reasons. One important reason is the better return on investment: although fieldbus hardware may cost the same as conventional hardware, or even a little bit more, the total installation costs for a fieldbus factory are far lower than for a conventional one. There are also other advantages compared to conventional instrumentation, such as a reduction in field wiring, lower planning/drawing costs, lower installation labour costs, and no need for conventional I/O subsystems. Another big advantage is the on-line self-diagnostics, which helps in predictive maintenance, offers maintenance savings and eventually reduces downtime. Remote configuration also helps to support reduced downtime. The improved system performance is an important criterion for some plants. Critical applications and hazardous areas have also begun to adopt fieldbus.

Future of fieldbus

Currently, a large number of fieldbus installations already exist and the number is increasing at a huge rate. Foundation Fieldbus and Profibus have begun to clearly dominate the fieldbus market. For the most part, one can say that Foundation Fieldbus dominates the North American market while Profibus is the market leader in Europe. Other areas are more divided, and there are also certain applications that prefer a certain fieldbus regardless of the geographical location. Both Foundation Fieldbus and Profibus have reached such a large market share that both buses will most likely remain in the future. The development of new fieldbuses has slowed down, and it is unlikely that new fieldbus standards will appear in the near future to challenge the position of Foundation Fieldbus or Profibus. Recent co-operation between Foundation Fieldbus and Profibus suppliers will further strengthen the position of these two standards.

Calibrating fieldbus transmitters

The main difference between a fieldbus transmitter for pressure or temperature and a conventional or HART transmitter is that the output signal is a fully digital fieldbus signal. The other parts of a fieldbus transmitter are mainly comparable to conventional or HART transmitters. Changing the output signal does not change the need for periodic calibration. Although modern fieldbus transmitters have been improved compared to older transmitter models, this does not eliminate the need for calibration. There are also many other reasons, such as quality systems and regulations, that make periodic calibrations compulsory.

The word "calibration" is often misused in fieldbus terminology when compared to the meaning of the word in metrology. In fieldbus terminology, "calibration" is often used to mean the configuration of a transmitter. In terminology pertaining to metrology, "calibration" means that you compare the transmitter to a traceable measurement standard and document the results. So it is not possible to calibrate a fieldbus transmitter using only a configurator or configuration software. Nor is it possible to calibrate a fieldbus transmitter remotely.

Fieldbus transmitters are calibrated in very much the same way as conventional transmitters – you need to place a physical input into the transmitter and simultaneously read the transmitter output to see that it is measuring correctly. The input is measured with a traceable calibrator, but you also need to have a way to read the output of the fieldbus transmitter. Reading the digital output is not always an easy thing to do. When the fieldbus is up and running, you can have one person in the field to provide and measure the transmitter input while another person is in the control room reading the output. Naturally these two people need to communicate with each other in order to perform and document the calibration. While your fieldbus and process automation systems are idle, you need to find other ways to read the transmitter's output. In some cases you can use a portable fieldbus communicator or a laptop computer with dedicated software and hardware.
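The comparison itself is the same as for any transmitter: apply a known input from a traceable calibrator, read the (digital) output, and compute the error against an allowed tolerance. A minimal sketch of that bookkeeping follows; the readings, the five-point scheme and the 0.5 % of span tolerance are hypothetical examples, not values from the text.

```python
# Sketch of a calibration comparison for a fieldbus temperature
# transmitter. The input values come from a traceable calibrator; the
# output values are read over the bus. All numbers here are hypothetical.

def check_points(points, span, tolerance_pct_of_span):
    """For each (applied, read) pair, compute the error as % of span
    and flag whether it is within tolerance."""
    results = []
    for applied, read in points:
        error_pct = (read - applied) / span * 100
        results.append((applied, read, error_pct,
                        abs(error_pct) <= tolerance_pct_of_span))
    return results

# Hypothetical 0–100 °C transmitter, 5-point up test, 0.5 % of span tolerance.
points = [(0.0, 0.1), (25.0, 25.2), (50.0, 50.1), (75.0, 75.3), (100.0, 100.2)]
for applied, read, err, ok in check_points(points, 100.0, 0.5):
    print(f"{applied:6.1f} °C -> {read:6.1f} °C  error {err:+.2f} %  {'PASS' if ok else 'FAIL'}")
```

A documenting calibrator automates exactly this record-keeping: it stores the applied and read values per point and the pass/fail verdict against the tolerance.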

Fieldbus instruments are increasing in popularity, and their calibration can in many cases be cumbersome, time-consuming and demanding in resources. The Beamex® MC6 helps to overcome these challenges by combining a full field communicator and an extremely accurate multifunctional process calibrator. The Beamex® MC6 can be used as a communicator for the configuration as well as a calibrator for the calibration of smart instruments with the supported protocols, so there is no need for an additional communicator. The calibration results can be automatically stored into the memory of the MC6 or uploaded to calibration software.



Configuring and calibrating smart instruments

So-called "smart" instruments are ever more popular in the process industry, and the vast majority of instruments delivered today are smart instruments. These new smart instruments bring new challenges to the calibration and configuration processes. But what are these smart instruments, and what is the best way to configure and calibrate them? Beamex has recently introduced a new revolutionary tool, the Beamex® MC6 Advanced Field Communicator and Calibrator, that helps to overcome these challenges.

What is a "smart" transmitter?

A process transmitter is a device that senses a physical parameter (pressure, temperature, etc.) and generates an output signal proportional to the measured input. The term "smart" is more of a marketing term than a technical definition; there is no standardized technical definition for what smart really means in practice. Generally, in order for a transmitter to be called smart, it will utilize a microprocessor and should also have a digital communication protocol that can be used for reading the transmitter's measurement values and for configuring various settings in the transmitter. A microprocessor-based smart transmitter has a memory and can perform calculations, produce diagnostics, etc. Furthermore, a modern smart transmitter typically outperforms an older type of conventional transmitter regarding measurement accuracy and stability. Thinking of the opposite of a smart transmitter, a non-smart transmitter would be one with a purely analog (or even pneumatic) output signal. This article will discuss "smart" transmitters.

In any case, for the engineers who need to configure and calibrate the transmitter, the digital communication protocol is the biggest difference compared to conventional transmitters. Engineers can no longer simply measure the analog output signal; they need to have the possibility to communicate with the transmitter and read the digital signal. That brings a whole new challenge – how can the digital output be read?

Smart transmitter protocols

There are various digital protocols among transmitters considered smart. Some are proprietary protocols of a certain manufacturer, but these seem to be fading out in popularity, and favor is being given to protocols based on open standards because of the interoperability they enable; most of today's protocols, including HART, WirelessHART, Foundation Fieldbus and Profibus PA, are based on open standards. The most common transmitter protocol today is the HART (Highway Addressable Remote Transducer) protocol. A HART transmitter contains both a conventional analog mA signal and a digital signal superimposed on top of the analog signal. Since it also has the analog signal, it is compatible with conventional installations. Recently, the HART protocol has been getting a further boost from the newest WirelessHART protocol. The fieldbuses, such as Foundation Fieldbus and Profibus, contain only a digital output, i.e. no analog signal. Foundation Fieldbus and Profibus are gaining a larger foothold in the process transmitter markets.

Configuration

One important feature of a smart transmitter is that it can be configured via the digital protocol. Configuration of a smart transmitter refers to the setting of the transmitter's parameters, which may include the engineering unit, sensor type, etc. The configuration needs to be done via the communication protocol, so in order to do the configuration you will need to use some form of configuration device – typically called a communicator – that supports the selected protocol. It is crucial to remember that although a communicator can be used for configuration, it is not a reference standard and therefore cannot be used for metrological calibration.

Calibration of a smart transmitter

According to international standards, calibration is a comparison of the device under test against a traceable reference instrument (calibrator), with documentation of the comparison. Although the calibration formally does not include any adjustments, potential adjustments are often included when the calibration process is performed. Configuring the parameters of a smart transmitter with a communicator is not in itself a metrological calibration (although it may be part of an adjustment/trim task), and it does not assure accuracy. For a real metrological calibration, a traceable reference standard (calibrator) is by definition always needed.

To calibrate a conventional, analog transmitter, you can generate or measure the transmitter input and at the same time measure the transmitter output. You need a dual-function calibrator able to process the transmitter input and output at the same time, or alternatively two separate single-function calibrators.

But how can a smart transmitter, whose output is a digital protocol signal, be calibrated? Obviously the transmitter input still needs to be generated/measured the same way as with a conventional transmitter, i.e. by using a calibrator. But in order to see what the transmitter output is, you will need some device or software able to read and interpret the digital protocol. Sometimes it is very difficult or even impossible to find a suitable device, especially a mobile one. In practice, several types of devices may be needed, and several people to do the job. The calibration may, therefore, be a very challenging task.

Wired HART (as opposed to WirelessHART) is a hybrid protocol that includes digital communication superimposed on a conventional analog 4–20 mA output signal. The 4–20 mA output signal of a wired HART transmitter is calibrated the same way as a conventional transmitter, so in this case calibration is quite easy and straightforward. However, to do any configuration or trimming, or to read the digital output signal (if it is used), a HART communicator is needed. If the calibration is done with a documenting calibrator, it will automatically document the calibration results.
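For the analog output of a wired HART transmitter, the expected 4–20 mA reading at each input point follows directly from the configured range, and the error is normally expressed as a percentage of the 16 mA span. A small sketch of that arithmetic; the transmitter range and the readings are hypothetical examples:

```python
# Expected 4–20 mA output of a linear transmitter and the measurement
# error in % of span. Range and readings are hypothetical examples.

def expected_ma(input_value, lrv, urv):
    """Linear 4–20 mA output for an input between the lower (LRV)
    and upper (URV) range values."""
    return 4.0 + 16.0 * (input_value - lrv) / (urv - lrv)

# Hypothetical pressure transmitter ranged 0–200 kPa. The calibrator
# applies 100 kPa and the output is measured as 12.05 mA.
ideal = expected_ma(100.0, lrv=0.0, urv=200.0)   # 12.0 mA at mid-range
error_pct_span = (12.05 - ideal) / 16.0 * 100    # error as % of 16 mA span
print(ideal, error_pct_span)
```

Here the mid-range point should ideally read 12.0 mA, so a 12.05 mA reading corresponds to an error of about 0.31 % of span, which would then be compared against the transmitter's tolerance.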

The solution

The new Beamex® MC6 is a device combining a full field communicator and an extremely accurate multifunctional process calibrator. The Beamex® MC6 can be used both as a communicator for the configuration and as a calibrator for the calibration of smart instruments with the supported protocols. With the Beamex® MC6, the smart transmitter's input can be generated/measured at the same time as the digital output is read. Any additional communicator is therefore not needed, and you can carry less equipment in the field. The MC6 includes a full field communicator for the HART, WirelessHART, Foundation Fieldbus H1 and Profibus PA protocols. All required electronics are built-in, including the power supply and the required impedances for the protocols. The MC6 supports all of the protocol commands according to the transmitter's device description file. The results can be automatically stored into the memory of the MC6 or uploaded to calibration software.

When it comes to the configuration of smart transmitters, there are some other "smart" process calibrators on the market, but they typically support only one protocol (mostly HART) and offer very limited support, so in practice a separate communicator is needed in any case.

About Beamex® MC6

Beamex® MC6 is an advanced, high-accuracy field calibrator and communicator. It offers calibration capabilities for pressure, temperature and various electrical signals. The MC6 also contains a full fieldbus communicator for HART, Foundation Fieldbus and Profibus PA instruments. Usability and ease-of-use are among the main features of the MC6. It has a large 5.7" color touch-screen with a multilingual user interface. The robust IP65-rated dust- and water-proof casing, ergonomic design and light weight make it an ideal measurement device for field use in various industries, such as the pharmaceutical, energy, oil and gas, food and beverage, and service industries, as well as the petrochemical and chemical industries. The MC6 is one device with five different operational modes, which means that it is fast and easy to use. The operation modes are: meter, calibrator, documenting calibrator, data logger and fieldbus communicator.

In addition, the MC6 communicates with the Beamex® CMX Calibration Software, enabling fully automated and paperless calibration and documentation. In conclusion, the MC6 is more than a calibrator.

Why calibrate?

A modern transmitter is advertised as being smart and extremely accurate, and sometimes sales people will tell you that such transmitters don't need to be calibrated at all because they are so "smart". So why would you calibrate them? First of all, the output protocol of a transmitter does not change the fundamental need for calibration. There are numerous reasons to calibrate instruments initially and periodically. A short summary of the main reasons includes:
•  Even the best instruments and sensors drift over time, especially when used in demanding process conditions.
•  Regulatory requirements, such as quality systems, safety systems, environmental systems, standards, etc.
•  Economical reasons – any measurement having a direct economical effect.
•  Safety reasons – employee safety as well as customer/patient safety.
•  Environmental reasons.
•  To achieve high and consistent product quality and to optimize processes.


Calibration in hazardous environments

Striking a match in an environment that contains combustible gas is nothing short of dangerous – personal injury and property damage are likely consequences. Improperly calibrating an instrument in such a hazardous environment can be almost as dangerous.

Hydrocarbons and other flammable fluids are not limited to the petroleum and chemical industries. Combustible fuels, such as natural gas, are used in many locations and in all industries, including agriculture, food, pharmaceuticals, power generation, pulp/paper, retail, universities, water/wastewater, and in the home. The materials and fluids used in some processes can be hazardous in the sense that they can ignite or explode. For example, hydrocarbons in mines, oil refineries, and chemical plants are flammable and are typically contained within vessels and pipes; if containment were perfect, an external flame would not ignite the hydrocarbons. However, leaks, abnormal conditions, and fluid accumulation may allow hydrocarbons to be present such that a flame could ignite them with disastrous results. In addition, many materials and fluids used in seemingly "safe" industries are themselves flammable. For example, certain areas of food plants, such as reactors that hydrogenate oils, may pose hazards as well. Even seemingly safe water treatment systems use combustible materials such as chlorine in their processes, which means that certain areas of a water treatment plant may well be considered hazardous. Therefore, it is important for plants to examine their processes and identify hazardous locations so that the proper instruments are selected, installed, and maintained in accordance with practices that are appropriate for the hazard.

Equipment requirements in hazardous locations

Protection requirements for hazardous locations vary according to the type of material present, the frequency of the hazard, and the protection concept applied. The intensity with which various vapors can combust is generally different. The groupings (IEC 60079-10) in order of decreasing ignition energy, with an example of a gas in each group, are:

Group IIC     Acetylene
Group IIB+H2  Hydrogen
Group IIB     Ethylene
Group IIA     Propane

The hazardous area classifications (IEC 60079-10) in order of decreasing frequency are:

Zone 0  Flammable material present continuously
Zone 1  Flammable material present intermittently
Zone 2  Flammable material present abnormally

In addition, a hot surface temperature on a device can cause ignition. Temperature classes limit the maximum surface temperature to between 450 ˚C (T1) and 85 ˚C (T6).

Intrinsic Safety (IS) is the most common protection concept applied to calibrators used in hazardous locations. In general, the IS concept is to design the calibrator such that it limits the amount of energy available so that it cannot ignite a combustible gas mixture. Adding the applicability of IS designs to the hazards in the previous table yields:

Zone 0  ia      Flammable material present continuously
Zone 1  ia, ib  Flammable material present intermittently
Zone 2  ia, ib  Flammable material present abnormally
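The selection logic implied by the tables above can be sketched as a simple two-part check: the device's protection level must be permitted for the zone, and its temperature class must keep the maximum surface temperature below the gas's ignition temperature. The function name and the example gas data below are illustrative; the T2–T5 limits follow the standard IEC temperature-class series.

```python
# Sketch of hazardous-area equipment selection per the tables above.
# Function name and example gas data are illustrative.

# Maximum surface temperature (°C) per IEC temperature class.
T_CLASS_MAX_C = {"T1": 450, "T2": 300, "T3": 200, "T4": 135, "T5": 100, "T6": 85}

# IS protection levels permitted in each zone (from the table above).
PERMITTED_IS_LEVELS = {"Zone 0": {"ia"}, "Zone 1": {"ia", "ib"}, "Zone 2": {"ia", "ib"}}

def is_suitable(zone, is_level, t_class, gas_ignition_temp_c):
    """A device is acceptable if its IS level is permitted in the zone and
    its maximum surface temperature stays below the gas ignition temperature."""
    return (is_level in PERMITTED_IS_LEVELS[zone]
            and T_CLASS_MAX_C[t_class] < gas_ignition_temp_c)

# Example: an Ex ia T4 calibrator (max surface 135 °C) in Zone 1,
# with a hypothetical gas whose ignition temperature is 200 °C.
print(is_suitable("Zone 1", "ia", "T4", 200.0))  # True
```

Note that an "ib" device would fail this check in Zone 0, since only "ia" is permitted where flammable material is present continuously.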

Calibration solutions for hazardous locations

Instruments designed to measure flow, level, pressure, temperature, and other variables in hazardous locations are generally used to monitor and control the process. In some applications, it is practical to remove these instruments and calibrate them in the workshop with a calibration test bench. This is usually not the case, however, which means that many instruments are calibrated in the field. Fortunately, there are calibrators that are specifically designed to operate safely in rugged environments and hazardous locations.

Beamex calibrators for hazardous locations are designed and certified for Ex ia IIC T4 hazards per the ATEX Directive and are applicable to all vapor hazards where a temperature class of 135 ˚C in a 50 ˚C ambient is acceptable. As such, they can be used for the overwhelming majority of applications where a vapor hazard is present. The Beamex multifunction IS calibrators are portable and intrinsically safe, with modules that can accommodate wide ranges and many types of pressure, RTD, thermocouple, voltage, current, pulse, and frequency measurements.

The Beamex modular calibration system is a test bench and calibration system for workshops and laboratories that incorporates the functionality of the MC5 multifunction calibrator and can measure/generate additional parameters such as precision pressures. The ergonomic design and modular construction allow the user to select the necessary functions in a cost-effective manner.

The Beamex® CMX software integrates calibration management by allowing efficient planning and scheduling of calibration work. It not only alerts you when to calibrate, but also automatically takes data, creates documentation, and tracks calibration history. This software generally makes calibration work faster and easier, adheres to GMP regulations (21 CFR 11), and is designed to integrate into management systems such as SAP and Maximo.

A few points to remember

•  Improper actions in hazardous locations can result in property damage and bodily injury.
•  Hazardous locations can exist in virtually all industries, stores, and in the home.
•  Instruments should be specified, installed, operated, and maintained in accordance with the requirements for the hazardous location.
•  Portable Beamex calibrators for hazardous locations are designed to be used in virtually all vapor hazards.



The safest way to calibrate fieldbus instruments

Fieldbus transmitters must be calibrated just like conventional instruments. There are also industrial environments where the calibration of fieldbus instruments should be done not only accurately and efficiently, but also safely. When safety becomes a top-priority issue in calibration, intrinsically safe fieldbus calibrators enter the picture.

By definition, intrinsic safety (IS) is a protection technique for safely operating electronic equipment in explosive environments. The concept has been developed for safely operating process control instrumentation in hazardous areas. The idea behind intrinsic safety is to make sure that the available electrical and thermal energy in a system is always low enough that ignition of the hazardous atmosphere cannot occur. A hazardous atmosphere is an area that contains the elements that may cause an explosion: a source of ignition, a flammable substance (such as gases, mists, vapors or combustible dust) and oxygen. An intrinsically safe calibrator is therefore designed to be incapable of causing ignition in a surrounding environment with flammable materials. Intrinsically safe calibrators are also often referred to as "Ex calibrators", "calibrators for Ex Areas" or "IS calibrators". An Ex Area refers to an explosive environment, and an Ex calibrator is a device designed for use in that type of environment.

Hazardous area classifications in IEC/European countries are:
Zone 0: an explosive gas & air mixture is continuously present or present for a long time.
Zone 1: an explosive gas & air mixture is likely to occur in normal operation.
Zone 2: an explosive gas & air mixture is not likely to occur in normal operation, and if it occurs it will exist only for a short time.

Where is intrinsically safe calibration required?

Many industries require intrinsically safe calibration equipment, such as oil refineries, rigs and processing plants, pipelines and distribution centres, petrochemical and chemical plants, as well as pharmaceutical plants. Basically, any potentially explosive industrial environment can benefit from using intrinsically safe calibrators.

What are the benefits of using intrinsically safe calibrators?

There are clear benefits in using intrinsically safe calibration equipment.

Safest possible technique. First of all, it is the safest possible technique: intrinsically safe calibrators are safe for employees, as they can be safely used in environments where the risk of an explosion exists. Secondly, intrinsically safe calibrators are the only technique permitted for Zone 0 environments (where an explosive gas and air mixture is continuously present or present for a long time).

Performance and functionality. Multifunctional intrinsically safe calibrators provide the functionality and performance of regular industrial calibration devices, but in a safe way. They can be used for the calibration of pressure, temperature and electrical signals. A documenting intrinsically safe calibrator, such as the Beamex® MC5-IS, provides additional efficiency improvements with its seamless communication with calibration software. This eliminates the need for manual recording of calibration data and improves the quality and productivity of the entire calibration process.

In view of safety, there are also some guidelines and constraints on how to use intrinsically safe calibrators in hazardous areas. Every intrinsically safe calibrator is delivered with a product safety note, which should be read carefully before using the device. The product safety note lists all the "do's and don'ts" for safe calibration.

Are intrinsically safe calibrators technically different from regular industrial calibrators?

Intrinsically safe calibrators are different from other industrial calibrators in both design and technical features.

The differences in design and technical features were made with one purpose in mind – to ensure that the device is safe to use and is unable to cause an ignition. The surface of the device is made of conductive material. There are also usually small differences with electrical ranges compared to regular industrial calibrators (e.g. the maximum is lower). Many times intrinsically safe equipment operate only with dry batteries, but the Beamex intrinsically safe calibrators operate with chargeable batteries. The battery of an intrinsically safe calibrator is usually slower to charge and it discharges quicker. When charging the battery, it must be done in a non-Ex area. External pressure modules can be used with IS-calibrators, but they must also be intrinsically safe.

Making a calibrator safe and unable to cause ignition – typical technical differences:
• Surface made of conductive material
• Constraints in using the device (listed in Product Safety Note)
• Small differences with electrical ranges (e.g. maximum is lower)
• Battery slower to charge, quicker to discharge
• Battery must be charged in a non-Ex area
• When using external pressure modules, they must be IS-versions

What are ATEX and IECEx?

ATEX ("ATmosphères EXplosibles", explosive atmospheres in French) is a standard set in the European Union for explosion protection in the industry. The ATEX 95 equipment directive 94/9/EC concerns equipment intended for use in potentially explosive areas. The ATEX rules are obligatory for electronic and electrical equipment that will be used in potentially explosive atmospheres sold in the EU as of July 1, 2003. Companies in the EU where the risk of explosion is evident must also use the ATEX guidelines for protecting their employees.

IEC (International Electrotechnical Commission) is a nonprofit international standards organization that prepares and publishes international standards for electrical technologies. The IEC TC/31 technical committee deals with the standards related to equipment for explosive atmospheres. IECEx is an international scheme for certifying procedures for equipment designed for use in explosive atmospheres. The objective of the IECEx Scheme is to facilitate international trade in equipment and services for use in explosive atmospheres, while maintaining the required level of safety.

Is service different for intrinsically safe calibrators?

There are certain aspects that need special attention when doing service or repair on an intrinsically safe calibrator. The most important thing to remember is that an intrinsically safe calibrator must maintain its intrinsic safety after the service or repair. The best way to do this is to send it to the manufacturer or to an authorized service company for repair. Recalibration can be done by calibration laboratories (still preferably with ISO/IEC 17025 accreditation).

Safe fieldbus calibration with the Beamex® MC5-IS Intrinsically Safe Multifunction Calibrator

The Beamex® MC5-IS Intrinsically Safe Multifunction Calibrator is a high-accuracy, all-in-one calibrator for extreme environments. The MC5-IS has calibration capabilities for pressure, temperature, electrical and frequency signals. Being an all-in-one calibrator, the MC5-IS replaces many individual measurement devices and calibrators. The MC5-IS also has HART communication, and it can also be used for calibrating Foundation Fieldbus H1 or Profibus PA transmitters. It is a documenting calibrator, which means that it communicates seamlessly with calibration software. Using documenting calibrators with calibration software can remarkably improve the efficiency and quality of the entire calibration process. As the Beamex® MC5-IS is certified according to ATEX and the IECEx Scheme, this ensures the calibrator is fit for its intended purpose and that sufficient information is supplied with it to ensure that it can be used safely.


Calibration terminology A to Z 1

This glossary is a quick reference to the meaning of common terms. In technical, scientific and engineering work (such as metrology) it is important to correctly use words that have a technical meaning. Definitions of these words are in relevant national, international and industry standards, as well as publications of relevant technical and professional organizations, journals, and other publications. Those documents give the intended meaning of the word, so everyone in the business knows what it is.

The technical definitions may be different from the definitions published in common grammar dictionaries. However, the purpose of common dictionaries is to record the ways that people actually use words, not to standardize the way the words should be used. If a word is defined in a technical standard, its definition from a common grammar dictionary should never be used in work where the technical standard can apply. In technical work, only the technical definitions should be used.

Many of these definitions are adapted from the references. In some cases several may be merged to better clarify the meaning or adapt the wording to common metrology usage. This glossary is a supplement to the VIM, GUM, NCSL Glossary, and the information in the other references listed at the end.

______________
1. Bucher, Jay L. The Metrology Handbook. Milwaukee: ASQ Quality Press, 2004.

Terms that are not in this glossary may be found in one of these primary references:

1. ISO. International vocabulary of basic and general terms in metrology (called the VIM). 2nd ed. BIPM, IEC, IFCC, ISO, IUPAC, IUPAP, and OIML. Geneva: ISO, 1993.
2. ANSI/NCSL. U.S. Guide to the expression of uncertainty in measurement (called the GUM). ANSI/NCSL Z540-2-1997. Boulder, CO: NCSL International, 1997.
3. NCSL. NCSL Glossary of metrology-related terms. 2nd ed. Boulder, CO: NCSL International, 1999.

(It is assumed that a calibration or metrology activity owns copies of these as part of its basic reference material.) Some terms may be listed in this glossary in order to expand on the definition; this glossary should be considered an addition to the references listed above, not a replacement of them.

Glossary

Accreditation (of a laboratory) – Formal recognition by an accreditation body that a calibration or testing laboratory is able to competently perform the calibrations or tests listed in the accreditation scope document. Accreditation includes evaluation of both the quality management system and the competence to perform the measurements listed in the scope.

Accreditation body – An organization that conducts laboratory accreditation evaluations in conformance to ISO Guide 58.

Accreditation certificate – Document issued by an accreditation body to a laboratory that has met the conditions and criteria for accreditation. The certificate, with the documented measurement parameters and their best uncertainties, serves as proof of accredited status for the time period listed. An accreditation certificate without the documented parameters is incomplete.

Accreditation criteria – Set of requirements used by an accrediting body that a laboratory must meet in order to be accredited.

Accuracy (of a measurement) – Accuracy is a qualitative indication of how closely the result of a measurement agrees with the true value of the parameter being measured. (VIM, 3.5) Because the true value is always unknown, accuracy of a measurement is always an estimate. An accuracy statement by itself has no meaning other than as an indicator of quality; it has quantitative value only when accompanied by information about the uncertainty of the measuring system. Contrast with: accuracy (of a measuring instrument)

Accuracy (of a measuring instrument) – Accuracy is a qualitative indication of the ability of a measuring instrument to give responses close to the true value of the parameter being measured. (VIM, 5.18) Accuracy is a design specification and may be verified during calibration. Contrast with: accuracy (of a measurement)

Application – Software installed on a defined platform/hardware providing specific functionality.

Assessment – Examination typically performed on-site of a testing or calibration laboratory to evaluate its conformance to conditions and criteria for accreditation.

Bespoke/Customized computerised system – A computerised system individually designed to suit a specific business process.

Best measurement capability – For an accredited laboratory, the best measurement capability for a particular quantity is "the smallest uncertainty of measurement a laboratory can achieve within its scope of accreditation when performing more or less routine calibrations of nearly ideal measurement standards intended to define, realize, conserve, or reproduce a unit of that quantity or one or more of its values, or when performing more-or-less routine calibrations of nearly ideal measuring instruments designed for the measurement of that quantity." (EA-4/02) The best measurement capability is based on evaluations of actual measurements using generally accepted methods of evaluating measurement uncertainty.

Bias – Bias is the known systematic error of a measuring instrument. (VIM, 5.25) The value and direction of the bias is determined by calibration and/or gage R&R studies. Adding a correction, which is always the negative of the bias, compensates for the bias. See also: correction, systematic error

Calibration – (1) Calibration is a term that has many different – but similar – definitions. It is the process of verifying the capability and performance of an item of measuring and test equipment by comparison to traceable measurement standards. (See VIM 6.11 and NCSL pages 4–5 for primary and secondary definitions.)

Calibration is performed with the item being calibrated in its normal operating configuration – as the normal operator would use it. Calibration is performed according to a specified documented calibration procedure, under a set of specified and controlled measurement conditions, and with a specified and controlled measurement system.

The calibration process consists of comparing an IM&TE unit with specified tolerances, but of unverified accuracy, to a measurement system or device of specified capability and known uncertainty in order to detect, report, or minimize by adjustment any deviations from the tolerance limits or any other variation in the accuracy of the instrument being compared. The calibration process uses traceable external stimuli, measurement standards, or artifacts as needed to verify the performance. The result of a calibration is a determination of the performance quality of the instrument with respect to the desired specifications. This may be in the form of a pass/fail decision, determining or assigning one or more values, or the determination of one or more corrections. Calibration provides assurance that the instrument is capable of making measurements to its performance specification when it is correctly used.

Notes:
•  A requirement for calibration does not imply that the item being calibrated can or should be adjusted.
•  The calibration process may include, if necessary, calculation of correction factors or adjustment of the instrument being compared to reduce the magnitude of the inaccuracy.
•  In some cases, minor repair such as replacement of batteries, fuses, or lamps, or minor adjustment such as zero and span, may be included as part of the calibration.
•  Calibration does not include any maintenance or repair actions except as just noted.

See also: performance test, calibration procedure
Contrast with: calibration (2) and repair

Calibration – (2) (A) Many manufacturers incorrectly use the term calibration to name the process of alignment or adjustment of an item that is either newly manufactured, is known to be out of tolerance, or is otherwise in an indeterminate state. Many calibration procedures in manufacturers' manuals are actually factory alignment procedures that only need to be performed if a UUC is in such an indeterminate state – because it is being manufactured, is known to be out of tolerance, or after it is repaired.

When calibration is used this way, calibration means the same as alignment or adjustment. These are repair activities and excluded from the metrological definition of calibration.

(B) In many cases, IM&TE instruction manuals may use calibration to describe tasks normally performed by the operator of a measurement system. Examples include performing a self-test as part of normal operation or performing a self-calibration (normalizing) of a measurement system before use. When calibration is used to refer to tasks like this, the intent is that they are part of the normal work done by a trained user of the system. These and similar tasks are excluded from the metrological definition of calibration.

Contrast with: calibration (1)
See also: normalization, self-calibration, standardization

Calibration activity or provider – A laboratory or facility – including personnel – that perform calibrations in an established location or at customer location(s). It may be external or internal, including subsidiary operations of a larger entity. It may be called a calibration laboratory, shop, a metrology laboratory or department, or an industry-specific name, or any combination or variation of these.

Calibration certificate – (1) A calibration certificate is generally a document that states that a specific item was calibrated by an organization. The certificate identifies the item calibrated, the organization presenting the certificate, and the effective date. A calibration certificate should provide other information to allow the user to judge the adequacy and quality of the calibration. (2) In a laboratory database program, a certificate often refers to the permanent record of the final result of a calibration. A laboratory database certificate is a record that cannot be changed; if it is amended later, a new certificate is created.

See also: calibration report

Calibration procedure – A calibration procedure is a controlled document that provides a validated method for evaluating and verifying the essential performance characteristics, specifications, or tolerances for a model of measuring or testing equipment. A calibration procedure documents one method of verifying the actual performance of the item being calibrated against its performance specifications. It provides a list of recommended calibration standards to use for the calibration, a means to record quantitative performance data both before and after adjustments, and information sufficient to determine if the unit being calibrated is operating within the necessary performance specifications.

Note: A calibration procedure does not include any maintenance or repair actions. A calibration procedure always starts with the assumption that the unit under test is in good working order and only needs to have its performance verified.

Calibration program – A calibration program is a process of the quality management system that includes management of the use and control of calibrated inspection, and test and measuring equipment (IM&TE), and the process of calibrating IM&TE used to determine conformance to requirements or used in supporting activities. A calibration program may also be called a measurement management system (ISO 10012:2003).

Calibration report – A calibration report is a document that provides details of the calibration of an item. In addition to the basic items of a calibration certificate, a calibration report includes details of the methods and standards used, the parameters checked, and the actual measurement results and uncertainty.

See also: calibration certificate

Calibration seal – A calibration seal is a device, placard, or label that, when removed or tampered with, and by virtue of its design and material, clearly indicates tampering. A calibration seal is usually imprinted with a legend similar to "Calibration Void if Broken or Removed" or "Calibration Seal – Do Not Break or Remove." The purpose of a calibration seal is to ensure the integrity of the calibration. A calibration seal provides a means of deterring the user from tampering with any adjustment point that can affect the calibration of an instrument and detecting an attempt to access controls that can affect the calibration of an instrument.

Note: A calibration seal may also be referred to as a tamper seal.

Calibration standard – (See VIM, 6.1 through 6.9, 6.13, 6.14, and NCSL pages 36–38.) A calibration standard is an IM&TE item, artifact, standard reference material, or measurement transfer standard that is designated as being used only to perform calibrations of other IM&TE items. As calibration standards are used to calibrate other IM&TE items, they are more closely controlled and characterized than the workload items they are used for. Calibration standards generally have lower uncertainty and better resolution than general-purpose items. Designation as a

calibration standard is based on the use of the specific instrument, not on any other consideration. For example, in a group of identical instruments, one might be designated as a calibration standard while the others are all general purpose IM&TE items. Calibration standards are often called measurement standards.

See also: standard (measurement)

Combined standard uncertainty – The standard uncertainty of the result of a measurement, when that result is obtained from the values of a number of other quantities. It is equal to the positive square root of a sum of terms. The terms are the variances or covariances of these other quantities, weighted according to how the measurement result varies with changes in those quantities. (GUM, 2.3.4)

See also: expanded uncertainty

Commercial off the shelf software – Software commercially available, whose fitness for use is demonstrated by a broad spectrum of users.

Competence – For a laboratory, the demonstrated ability to perform the tests or calibrations within the accreditation scope and to meet other criteria established by the accreditation body. For a person, the demonstrated ability to apply knowledge and skills. Note: The word qualification is sometimes used in the personal sense, since it is a synonym and has more accepted usage in the United States.

Confidence interval – A range of values that is expected to contain the true value of the parameter being evaluated with a specified level of confidence. The confidence interval is calculated from sample statistics. Confidence intervals can be calculated for points, lines, slopes, standard deviations, and so on. For an infinite (or very large compared to the sample) population, the confidence interval for a mean or a proportion is

    CI = x̄ ± t·s ⁄ √n   or   CI = p ± t·√(p(1 − p) ⁄ n)

where CI is the confidence interval, x̄ is the sample mean, s is the sample standard deviation, p is the proportion of items of a given type in the population, n is the number of items in the sample, and t is the Student's t value for α ⁄ 2 and (n – 1) degrees of freedom (α is the level of significance).
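As an illustration of the confidence interval for a mean, the sketch below evaluates CI = x̄ ± t·s ⁄ √n for a set of repeated readings. The readings and the tabulated Student's t value are made-up illustration values, not from this book; Python is used here only as a convenient notation.

```python
import math
import statistics

def confidence_interval(readings, t_value):
    # CI = x_bar +/- t * s / sqrt(n), per the definition above
    n = len(readings)
    x_bar = statistics.mean(readings)
    s = statistics.stdev(readings)  # sample standard deviation
    half_width = t_value * s / math.sqrt(n)
    return x_bar - half_width, x_bar + half_width

# Ten repeated readings of a nominal 10 V source (hypothetical data)
readings = [10.02, 9.98, 10.01, 9.99, 10.03, 10.00, 9.97, 10.02, 10.01, 9.97]
# Student's t for alpha/2 = 0.025 and n - 1 = 9 degrees of freedom (from a t table)
low, high = confidence_interval(readings, t_value=2.262)
```

For these hypothetical readings the interval works out to roughly 9.985 V to 10.015 V; at the 95 percent level of confidence, the mean of the population of readings is expected to lie within it.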

See also: bias, systematic error

Correction (of error) – A correction is the value that is added to the raw result of a measurement to compensate for known or estimated systematic error or bias. The correction value is equal to the negative of the bias. An example is the value calculated to compensate for the calibration difference of a reference thermometer or for the calibrated offset voltage of a thermocouple reference junction.

See also: bias, error, systematic error

Corrective action – Corrective action is something done to correct a nonconformance when it arises, including actions taken to prevent reoccurrence of the nonconformance.

Compare with: preventive action

Coverage factor – A numerical factor used as a multiplier of the combined standard uncertainty in order to obtain an expanded uncertainty. (GUM, 2.3.6) The coverage factor is identified by the symbol k. It is usually given the value 2, which approximately corresponds to a probability of 95 percent for degrees of freedom > 10.

Deficiency – Nonfulfillment of conditions and/or criteria for accreditation, sometimes referred to as a nonconformance.

Departure value – A term used by a few calibration laboratories to refer to bias, error or systematic error. The exact meaning can usually be determined from examination of the calibration certificate.

See also: bias, correction (of error), error, systematic error

Equivalence – (A) Acceptance of the competence of other national metrology institutes (NMI), accreditation bodies, and/or accredited organizations in other countries as being essentially equal to the NMI, accreditation body, and/or accredited organizations within the host country. (B) A formal, documented determination that a specific instrument or type of instrument is suitable for use in place of the one originally listed, for a particular application.

Error (of measurement) – (See VIM, 3.10, 3.12–3.14, and NCSL pages 11–13.) In metrology, error (or measurement error) is an estimate of the difference between the measured value and the probable true value of the object of the measurement. The error can never be known exactly; it is always an estimate. Error may be systematic and/or random. Systematic error (also known as bias) may be corrected. (VIM, 3.15) Any residual amount is treated as random error.

See also: bias, correction, random error, systematic error
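The relationship between the combined standard uncertainty, the coverage factor k, and the resulting expanded uncertainty can be sketched as follows. The component values are hypothetical, and the root-sum-square combination assumes uncorrelated input quantities with unit sensitivity coefficients; Python is used only as notation.

```python
import math

def combined_standard_uncertainty(components):
    # Positive square root of the sum of squared standard uncertainties,
    # assuming uncorrelated input quantities (GUM method)
    return math.sqrt(sum(u ** 2 for u in components))

# Hypothetical component standard uncertainties, all in the same unit
components = [0.30, 0.40]
u_c = combined_standard_uncertainty(components)  # 0.50 in the same unit
k = 2          # coverage factor: ~95 percent for degrees of freedom > 10
U = k * u_c    # expanded uncertainty, the value usually reported on a certificate
```

The design choice behind k = 2 is convenience: rather than tabulating a t value for every calibration, a fixed multiplier gives an interval with approximately a 95 percent level of confidence whenever the degrees of freedom are reasonably large.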

Gage R&R – Gage repeatability and reproducibility study, which (typically) employs numerous instruments, personnel, and measurements over a period of time to capture quantitative observations. The data captured are analyzed statistically to obtain best measurement capability, which is expressed as an uncertainty with a coverage factor of k = 2 to approximate 95 percent. The number of instruments, personnel, measurements, and length of time are established to be statistically valid consistent with the size and level of activity of the organization.

GUM – An acronym commonly used to identify the ISO Guide to the Expression of Uncertainty in Measurement. In the United States, the equivalent document is ANSI/NCSL Z540-2-1997, U.S. Guide to the Expression of Uncertainty in Measurement.

IM&TE – The acronym IM&TE refers to inspection, measuring, and test equipment. This term includes all items that fall under a calibration or measurement management program. IM&TE items are typically used in applications where the measurement results are used to determine conformance to technical or quality requirements before, during, or after a process. Some organizations do not include instruments used solely to check for the presence or absence of a condition (such as voltage, pressure, and so on) where a tolerance is not specified and the indication is not critical to safety.

Note: Organizations may refer to IM&TE items as MTE (measuring and testing equipment), TMDE (test, measurement, and diagnostic equipment), GPETE (general purpose electronic test equipment), PME (precision measuring equipment), PMET (precision measuring equipment and tooling), or SPETE (special purpose electronic test equipment).

Interlaboratory comparison – Organization, performance, and evaluation of tests or calibrations on the same or similar items or materials by two or more laboratories in accordance with predetermined conditions.

Internal audit – A systematic and documented process for obtaining audit evidence and evaluating it objectively to verify that a laboratory's operations comply with the requirements of its quality system. An internal audit is done by or on behalf of the laboratory itself, so it is a first-party audit.

International Organization for Standardization (ISO) – An international nongovernmental organization chartered by the United Nations in 1947, with headquarters in Geneva, Switzerland. The mission of ISO is "to promote the development of standardization and

related activities in the world with a view to facilitating the international exchange of goods and services, and to developing cooperation in the spheres of intellectual, scientific, technological and economic activity." The scope of ISO's work covers all fields of business, industry and commerce except electrical and electronic engineering. The members of ISO are the designated national standards bodies of each country. (The United States is represented by ANSI.)

See also: ISO

International System of Units (SI) – A defined and coherent system of units adopted and used by international treaties. (The acronym SI is from the French Système International.) SI is the international system of measurement for all physical quantities. (Mass, length, time, thermodynamic temperature, electric current, amount of substance, and luminous intensity.) SI units are defined and maintained by the International Bureau of Weights and Measures (BIPM) in Paris, France. The SI system is popularly known as the metric system.

ISO – Iso is a Greek word root meaning equal. ISO is not an acronym. The International Organization for Standardization chose the word as the short form of its name, so it will be a constant in all languages. (If an acronym based on the full name were used, it would be different in each language.) The name also symbolizes the mission of the organization – to equalize standards worldwide.

IT Infrastructure – The hardware and software, such as networking software and operation systems, which makes it possible for the application to function.

Level of confidence – Defines an interval about the measurement result that encompasses a large fraction p of the probability distribution characterized by that result and its combined standard uncertainty; p is the coverage probability or level of confidence of the interval. Effectively, the coverage level expressed as a percent.

Life cycle – All phases in the life of the system from initial requirements until retirement, including design, specification, programming, testing, installation, operation, and maintenance.

Management review – The planned, formal, periodic, and scheduled examination of the status and adequacy of the quality management system in relation to its quality policy and objectives by the organization's top management.

Measurement – A set of operations performed for the purpose of determining the value of a quantity. (VIM, 2.1)

Measurement system – A measurement system is the set of equipment, conditions, people, methods, and other quantifiable factors that combine to determine the success of a measurement process. The measurement system includes at least the test and measuring instruments and devices, associated materials and accessories, the personnel, the procedures used, and the physical environment.

Metrology – Metrology is the science and practice of measurement. (VIM, 2.2)

Mobile operations – Operations that are independent of an established calibration laboratory facility. Mobile operations may include work from an office space, home, vehicle, or the use of a virtual office.

Natural (physical) constant – A natural constant is a fundamental value that is accepted by the scientific community as valid. Natural constants are used in the basic theoretical descriptions of the universe. Examples of natural physical constants important in metrology are the speed of light in a vacuum (c), the triple point of water (273.16 K), the quantum charge ratio (h/e), the gravitational constant (G), the ratio of a circle's circumference to its diameter (π), and the base of natural logarithms (e).

NCSL international – Formerly known as the National Conference of Standards Laboratories (NCSL). NCSL was formed in 1961 to "promote cooperative efforts for solving the common problems faced by measurement laboratories." NCSL is a nonprofit organization, whose membership is open to any organization with an interest in the science of measurement and its application in research, development, education, or commerce. NCSL promotes technical and managerial excellence in the field of metrology, measurement standards, instrument calibration, and test and measurement. NCSL has member organizations from academic, scientific, industrial, commercial and government facilities around the world.

Normalization, Normalize – See: self-calibration

Offset – Offset is the difference between a nominal value (for an artifact) or a target value (for a process) and the actual measured value. For example, if the thermocouple alloy leads of a reference junction probe are formed into a measurement junction and placed in an ice point cell, and the reference junction itself is also in the ice point, then the theoretical thermoelectric emf measured at the copper wires should be zero. Any value other than zero is an offset created by inhomogeneity of the thermocouple wires combined

with other uncertainties.

On-site operations – Operations that are based in or directly supported by an established calibration laboratory facility, but actually perform the calibration actions at customer locations. This includes climate-controlled mobile laboratories.

Performance test – A performance test (or performance verification) is the activity of verifying the performance of an item of measuring and test equipment to provide assurance that the instrument is capable of making correct measurements when it is properly used. A performance test is done with the item in its normal operating configuration. A performance test is the same as a calibration (1).

See also: calibration (1)

Policy – A policy defines and sets out the basic objectives, goals, vision, or general management position on a specific topic. A policy describes what management intends to have done regarding a given portion of business activity. Policy statements relevant to the quality management system are generally stated in the quality manual. Policies can also be in the organization's policy/procedure manual.

See also: procedure

Precision – Precision is a property of a measuring system or instrument. Precision is a measure of the repeatability of a measuring system – how much agreement there is within a group of repeated measurements of the same quantity under the same conditions. (NCSL, page 26) Precision is not the same as accuracy.

Compare with: bias, error

Preventive action – Preventive action is something done to prevent the possible future occurrence of a nonconformance, even though such an event has not yet happened. Preventive action helps improve the system.

Contrast with: corrective action

Procedure – A procedure describes a specific process for implementing all or a portion of a policy. There may be more than one procedure for a given policy. Some policies may be implemented by fairly detailed procedures, while others may only have a few general guidelines. A procedure has more detail than a policy but less detail than a work instruction. The level of detail needed should correlate with the level of education and training of the people with the usual qualifications to do the work and the amount of judgment normally allowed to them by management. Calibration: see calibration procedure.

See also: policy

Process owner – The person responsible for the business process.

Proficiency testing – Determination of laboratory testing performance by means of interlaboratory comparisons.

Quality manual – The quality manual is the document that describes the quality management policy of an organization with respect to a specified conformance standard. The quality manual briefly defines the general policies as they apply to the specified conformance standard and affirms the commitment of the organization's top management to the policy. It does not usually contain any detailed policies and never contains any procedures, work instructions, or proprietary information. In addition to its regular use by the organization, auditors use the quality manual when they audit the quality management system. The quality manual is generally provided to customers on request.

Random error – Random error is the result of a single measurement of a value, minus the mean of a large number of measurements of the same value. (VIM, 3.13) Random error causes scatter in the results of a sequence of readings and, therefore, is a measure of dispersion. Random error is usually evaluated by Type A methods, but Type B methods are also used in some situations. Note: Contrary to popular belief, the GUM specifically does not replace random error with either Type A or Type B methods of evaluation. See also: error. Compare with: systematic error

Repair – Repair is the process of returning an unserviceable or nonconforming item to serviceable condition. Repair includes adjustment or alignment of the item as well as component-level repair. (Some minor adjustment such as zero and span may be included as part of the calibration.) The need for repair may be indicated by the results of a calibration. For calibratable items, repair is always followed by calibration of the item. Passing the calibration test indicates success of the repair. Contrast with: calibration (1), repair (minor)

Repair (minor) – Minor repair is the process of quickly and economically returning an unserviceable item to serviceable condition by doing simple work using parts that are in stock in the calibration lab. Examples include replacement of batteries, fuses, or lamps; repairing a broken wire; replacing one or two in-stock components; or minor cleaning of switch contacts. Minor repairs are defined as repairs that take no longer than a short time as defined by laboratory management, where substantial disassembly of the instrument is not required, and where no parts have to be ordered from external suppliers. The instrument is opened, or has covers removed, or is removed from its case and may be disassembled to some degree. The need for repair may be indicated by the results of a calibration. For calibratable items, minor repair is always followed by calibration of the item. Passing the calibration test indicates success of the repair. Contrast with: calibration (1), repair
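The random-error definition above can be illustrated with a minimal sketch. The readings are made-up values, and Python's statistics module is used for the Type A (statistical) part:

```python
from statistics import mean, stdev

# Repeated readings of the same quantity under the same conditions
# (hypothetical data, for illustration only).
readings = [10.02, 9.98, 10.01, 9.99, 10.00, 10.03, 9.97, 10.00]

avg = mean(readings)                         # estimate of the mean value
random_errors = [r - avg for r in readings]  # random error of each reading

# Random error causes scatter; the sample standard deviation is a
# Type A (statistical) estimate of that dispersion.
scatter = stdev(readings)

print(f"mean = {avg:.3f}, scatter (std dev) = {scatter:.3f}")
```

Note that the per-reading random errors sum to (essentially) zero by construction, which is why random error, unlike systematic error, cannot simply be corrected away.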

Reported value – One or more numerical results of a calibration process, with the associated measurement uncertainty, as recorded on a calibration report or certificate. The specific type and format vary according to the type of measurement being made. In general, most reported values will be in one of these formats:
•  Measurement result and uncertainty. The reported value is usually the mean of a number of repeat measurements. The uncertainty is usually expanded uncertainty as defined in the GUM.
•  Deviation from the nominal (or reference) value and uncertainty. The reported value is the difference between the nominal value and the mean of a number of repeat measurements. The uncertainty of the deviation is usually expanded uncertainty as defined in the GUM.
•  Estimated systematic error and uncertainty. The value may be reported this way when it is known that the instrument is part of a measuring system and the systematic error will be used to calculate a correction that will apply to the measurement system results.

Round robin – See: interlaboratory comparison

Scope of accreditation – For an accredited calibration or testing laboratory, the scope is a documented list of calibration or testing fields, parameters, specific measurements, or calibrations and their best measurement uncertainty. Only the calibration or testing areas that the laboratory is accredited for are listed in the scope document, and only the listed areas may be offered as accredited calibrations or tests. The accreditation body usually defines the format and other details. The scope document is an attachment to the certificate of accreditation and the certificate is incomplete without it.

Self-calibration – Self-calibration is a process performed by a user for the purpose of making an IM&TE instrument or system ready for use. The process may be required at intervals such as every power-on sequence, or once per shift, day, or week of continuous operation, or if the ambient temperature changes by a specified amount.
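The "deviation from nominal" reporting format can be sketched as follows. The data are hypothetical, a coverage factor k = 2 (the common GUM convention for roughly 95 % confidence) is assumed, and only the Type A component is used for this toy data set:

```python
from math import sqrt
from statistics import mean, stdev

nominal = 100.0  # nominal (reference) value, hypothetical
repeats = [100.012, 100.008, 100.010, 100.009, 100.011]  # repeat measurements

avg = mean(repeats)
deviation = avg - nominal  # reported value: deviation from nominal

# Expanded uncertainty U = k * u_c. Here u_c is taken as the standard
# deviation of the mean (Type A only), and k = 2 is assumed.
u_c = stdev(repeats) / sqrt(len(repeats))
U = 2 * u_c

print(f"deviation = {deviation:+.4f} with expanded uncertainty {U:.4f}")
```

A real certificate would combine all relevant uncertainty components, not just the repeatability of the readings.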

Once initiated, the process may be performed totally by the instrument or may require user intervention and/or use of external calibrated artifacts. The usual purpose is accuracy enhancement by characterization of errors inherent in the measurement system before the item to be measured is connected. Self-calibration is not equivalent to periodic calibration (performance verification) because it is not performed using a calibration procedure and does not meet the metrological requirements for calibration. Also, if an instrument requires self-calibration before use, then that will also be accomplished at the start of a calibration procedure. Self-calibration may also be called normalization or standardization. Compare with: calibration (2.B) Contrast with: calibration (1)

Specification – In metrology, a specification is a documented statement of the expected performance capabilities of a large group of substantially identical measuring instruments, given in terms of the relevant parameters and including the accuracy or uncertainty. Customers use specifications to determine the suitability of a product for their own applications. A product that performs outside the specification limits when tested (calibrated) is rejected for later adjustment, repair, or scrapping.

Standard (document) – A standard (industry, government, national, or international standard; a norme) is a document that describes the processes and methods that must be performed in order to achieve a specific technical or management objective. An example is ANSI/NCSL Z540-1-1994, a national standard that describes the requirements for the quality management system of a calibration organization and the requirements for calibration and management of the measurement standards used by the organization.

Standard (measurement) – A standard (measurement standard, calibration standard, reference standard, laboratory standard; an étalon) is a system, instrument, artifact, device, or material that is used as a defined basis for making quantitative measurements. The value and uncertainty of the standard define a limit to the measurements that can be made: a laboratory can never have better precision or accuracy than its standards. Measurement standards are generally used in calibration laboratories. Items with similar uses in a production shop are generally regarded as working-level instruments by the calibration program.

Primary standard – Accepted as having the highest metrological qualities and whose value is accepted without reference to other standards of the same quantity.

The highest level standards, found in national and international metrology laboratories, are the realizations or representations of SI units. Examples: triple point of water cell and caesium beam frequency standard.

Secondary standard – The highest accuracy level standards in a particular laboratory, generally used only to calibrate working standards. Also called a reference standard.

Transfer standard – A device used to transfer the value of a measurement quantity (including the associated uncertainty) from a higher level to a lower level standard.

Working standard – A standard that is used for routine calibration of IM&TE. See also: calibration standard

Standard operating procedure (SOP) – A term used by some organizations to identify policies, procedures, or work instructions.

Standard reference material – A standard reference material (SRM) as defined by NIST "is a material or artifact that has had one or more of its property values certified by a technically valid procedure, and is accompanied by, or traceable to, a certificate or other documentation which is issued by NIST… Standard reference materials are…manufactured according to strict specifications and certified by NIST for one or more quantities of interest." SRMs represent one of the primary vehicles for disseminating measurement technology to industry.

Standard uncertainty – The uncertainty of the result of a measurement, expressed as a standard deviation. (GUM, 2.3.1)

Standardization – See: self-calibration

Systematic error – A systematic error is the mean of a large number of measurements of the same value minus the (probable) true value of the measured parameter. (VIM, 3.14) Systematic error causes the average of the readings to be offset from the true value. Systematic error is a measure of magnitude and may be corrected. Systematic error is also called bias when it applies to a measuring instrument. Systematic error may be evaluated by Type A or Type B methods, according to the type of data available. Note: Contrary to popular belief, the GUM specifically does not replace systematic error with either Type A or Type B methods of evaluation. (GUM, 3.2.3, note) See also: bias, error, correction (of error) Compare with: random error
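The distinction between a systematic error and its correction can be sketched in a few lines. The true value and readings are hypothetical:

```python
from statistics import mean

true_value = 50.000  # (probable) true value of the measured parameter
readings = [50.013, 50.011, 50.012, 50.014, 50.010]  # hypothetical readings

# Systematic error: mean of many readings minus the true value.
systematic_error = mean(readings) - true_value  # a positive bias here

# Unlike random error, systematic error has a sign and magnitude and
# may be corrected: the correction is the negative of the error.
correction = -systematic_error
corrected = [r + correction for r in readings]

print(f"systematic error = {systematic_error:+.3f}")
print(f"mean after correction = {mean(corrected):.3f}")
```

The scatter of the individual readings (the random error) is unchanged by the correction; only the offset of the average is removed.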

System owner – The person responsible for the availability and maintenance of a computerised system and for the security of the data residing on that system.

Test accuracy ratio – (1) In a calibration procedure, the test accuracy ratio (TAR) is the ratio of the accuracy tolerance of the unit under calibration to the accuracy tolerance of the calibration standard used. (NCSL, page 2)

TAR = UUT_tolerance / STD_tolerance

The TAR must be calculated using identical parameters and units for the UUC and the calibration standard. If the accuracy tolerances are expressed as decibels, percentage, or another ratio, they must be converted to absolute values of the basic measurement units. (2) In the normal use of IM&TE items, the TAR is the ratio of the tolerance of the parameter being measured to the accuracy tolerance of the IM&TE. Note: TAR may also be referred to as the accuracy ratio or (incorrectly) the uncertainty ratio.

Test uncertainty ratio – In a calibration procedure, the test uncertainty ratio (TUR) is the ratio of the accuracy tolerance of the unit under calibration to the uncertainty of the calibration standard used. (NCSL, page 2)

TUR = UUT_tolerance / STD_uncert

The TUR must be calculated using identical parameters and units for the UUC and the calibration standard. If the accuracy tolerances are expressed as decibels, percentage, or another ratio, they must be converted to absolute values of the basic measurement units. Note: The uncertainty of a measurement standard is not necessarily the same as its accuracy specification.

Third Party – Parties

Tolerance – A tolerance is a design feature that defines limits within which a quality characteristic is supposed to be on individual parts; it represents the maximum allowable deviation from a specified value. Tolerances are applied during design and manufacturing. A tolerance is a property of the item being measured. Compare with: specification, uncertainty
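The two ratio formulas can be sketched directly in code. The pressure values below are hypothetical, and both inputs are assumed to already be in the same absolute unit (tolerances given in dB or percent would first have to be converted):

```python
def tar(uut_tolerance: float, std_tolerance: float) -> float:
    """Test accuracy ratio: UUT accuracy tolerance / standard accuracy tolerance."""
    return uut_tolerance / std_tolerance

def tur(uut_tolerance: float, std_uncertainty: float) -> float:
    """Test uncertainty ratio: UUT accuracy tolerance / standard uncertainty."""
    return uut_tolerance / std_uncertainty

# Hypothetical pressure example, all values in bar. Note that the
# standard's uncertainty is not necessarily the same number as its
# accuracy tolerance, so TAR and TUR usually differ.
uut_tol = 0.050    # UUT accuracy tolerance, +/- 0.050 bar
std_tol = 0.010    # standard's accuracy tolerance, +/- 0.010 bar
std_unc = 0.0125   # standard's expanded uncertainty, +/- 0.0125 bar

print(f"TAR = {tar(uut_tol, std_tol):.1f}:1")   # 5.0:1
print(f"TUR = {tur(uut_tol, std_unc):.1f}:1")   # 4.0:1
```

With identical inputs the two helpers differ only in what the denominator means, which is exactly the distinction the two glossary entries draw.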

Traceable, traceability – Traceability is a property of the result of a measurement, providing the ability to relate the measurement result to stated references, through an unbroken chain of comparisons each having stated uncertainties. (VIM, 6.10) Traceability is a demonstrated or implied property of the result of a measurement to be consistent with an accepted standard within specified limits of uncertainty. (NCSL, pages 42–43) The stated references are normally the base or supplemental SI units as maintained by a national metrology institute; fundamental or physical natural constants that are reproducible and have defined values; ratio type comparisons; certified standard reference materials; or industry or other accepted consensus reference standards. Traceability provides the ability to demonstrate the accuracy of a measurement result in terms of the stated reference. Only the result of a specific measurement can be said to be traceable, provided all of the conditions just listed are met. A calibration laboratory, a measurement system, a calibrated IM&TE, a calibration report, or any other thing is not and cannot be traceable to a national standard. Evidence of traceability includes the calibration report (with values and uncertainty) of calibration standards, but the report alone is not sufficient; the laboratory must also apply and use the data. Measurement assurance methods applied to a calibration system include demonstration of traceability; a calibration system operating under a program of controls only implies traceability. Reference to a NIST test number is specifically not evidence of traceability; that number is merely a catalog number of the specific service provided by NIST to a customer so it can be identified on a purchase order.

Transfer measurement – A transfer measurement is a type of method that enables making a measurement to a higher level of resolution than normally possible with the available equipment. Common transfer methods are differential measurements and ratio measurements.

Transfer standard – A transfer standard is a measurement standard used as an intermediate device when comparing two other standards. (VIM, 6.8) Typical applications of transfer standards are to transfer a measurement parameter from one organization to another, from a primary standard to a secondary standard, or from a secondary standard to a working standard, in order to create or maintain measurement traceability. Examples of typical transfer standards are DC volt sources (standard cells or zener sources) and single-value standard resistors, capacitors, or inductors.
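The "unbroken chain of comparisons each having stated uncertainties" can be given a numerical flavor. The chain and its uncertainty values below are hypothetical, and the components are assumed independent so that they combine by root-sum-of-squares (the usual GUM treatment for a simple sum of independent corrections):

```python
from math import sqrt

# Hypothetical unbroken chain of comparisons, each with a stated
# standard uncertainty, all in the same unit:
chain = {
    "national standard -> reference standard": 0.002,
    "reference standard -> working standard":  0.005,
    "working standard -> process instrument":  0.010,
}

# Root-sum-of-squares combination, assuming independent components.
u_combined = sqrt(sum(u ** 2 for u in chain.values()))

print(f"combined standard uncertainty along the chain: {u_combined:.4f}")
```

The result is dominated by the largest link, which is one reason each level of the chain is expected to have meaningfully better uncertainty than the one below it.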

Type A evaluation (of uncertainty) – Type A evaluation of measurement uncertainty is the statistical analysis of actual measurement results to produce uncertainty values. Both random and systematic error may be evaluated by Type A methods. Uncertainty can only be evaluated by Type A methods if the laboratory actually collects the data. See also: Type B evaluation

Type B evaluation (of uncertainty) – Type B evaluation of measurement uncertainty includes any method except statistical analysis of actual measurement results. (GUM, 3.3 through 3.3.5) Data for evaluation by Type B methods may come from any source believed to be valid. Both random and systematic error may be evaluated by Type B methods. See also: Type A evaluation

Uncertainty – Uncertainty is a property of a measurement result that defines the range of probable values of the measurand, with a specified level of confidence. Uncertainty is an estimate of dispersion; effects that contribute to the dispersion may be random or systematic. Uncertainty is an estimate of the range of values that the true value of the measurement is within. After an item that has a specified tolerance has been calibrated using an instrument with a known accuracy, the result is a value with a calculated uncertainty. Total uncertainty may consist of components that are evaluated by the statistical probability distribution of experimental data or from assumed probability distributions based on other data.

Uncertainty budget – The systematic description of known uncertainties relevant to specific measurements or types of measurements, categorized by type of measurement, range of measurement, and/or other applicable measurement criteria.

UUC, UUT – The unit under calibration or the unit under test – the instrument being calibrated. These are standard generic labels for the IM&TE item that is being calibrated, which are used in the text of the calibration procedure for convenience. Also may be called device under test (DUT) or equipment under test (EUT).

Validation – Substantiation by examination and provision of objective evidence that verified processes, methods, and/or procedures are fit for their intended use.

Verification – Confirmation by examination and provision of objective evidence that specified requirements have been fulfilled.

VIM – An acronym commonly used to identify the ISO International
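The two evaluation types can be sketched side by side. The readings and the specification limit are made up, and a rectangular (uniform) distribution is assumed for the Type B component, which is a common GUM convention when only a specification limit is known:

```python
from math import sqrt
from statistics import mean, stdev

# Type A: statistical analysis of actual repeated measurements
# (hypothetical readings; the standard deviation of the mean is used).
readings = [20.11, 20.09, 20.10, 20.12, 20.08]
u_a = stdev(readings) / sqrt(len(readings))

# Type B: any other method -- here, a rectangular distribution assumed
# from a manufacturer's specification of +/- 0.05 (divisor sqrt(3)).
spec_limit = 0.05
u_b = spec_limit / sqrt(3)

# Combined standard uncertainty: root-sum-of-squares of the components.
u_c = sqrt(u_a ** 2 + u_b ** 2)

print(f"u_A = {u_a:.5f}, u_B = {u_b:.5f}, u_c = {u_c:.5f}")
```

Note how the classification describes how each component was *evaluated* (from measured data versus from other knowledge), not whether the underlying effect is random or systematic, which matches the glossary's caution about the GUM.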

Vocabulary of Basic and General Terms in Metrology. (The acronym comes from the French title.)

Work instruction – In a quality management system, a work instruction defines the detailed steps necessary to carry out a procedure. Work instructions are used only where they are needed to ensure the quality of the product or service. The level of education and training of the people with the usual qualifications to do the work must be considered when writing a work instruction. In a metrology laboratory, a calibration procedure is a type of work instruction.

Reference
1. Bucher, Jay L. The Metrology Handbook. Milwaukee: ASQ Quality Press, 2004.


PORTABLE CALIBRATORS – WORKSTATIONS – CALIBRATION SOFTWARE – PROFESSIONAL SERVICES

About Beamex

•  One of the world's leading providers of calibration solutions, with a global customer base and partner network.
•  Develops and manufactures high-quality calibration equipment, software, systems and services for the calibration and maintenance of process instruments.
•  Comprehensive product range includes portable calibrators, workstations, accessories, calibration software, professional services and industry-specific solutions.
•  Products and services available in more than 60 countries. More than 10,000 companies worldwide utilize Beamex's calibration solutions.
•  Customers from a wide range of industries, such as automotive, aviation, contractor engineering, education, food and beverage, manufacturing, marine, metal and mining, nuclear, oil and gas, petrochemical and chemical, pharmaceutical, power and energy, and pulp and paper.
•  For customers with requirements for accuracy, versatility, efficiency, ease-of-use and reliability.
•  Certified in accordance with the ISO 9001:2008 quality standard.
•  Beamex's Accredited Calibration Laboratory is accredited and approved by FINAS (Finnish Accreditation Service). FINAS is a member of all Multilateral Recognition Agreements / Mutual Recognition Arrangements (MLA/MRA) signed by European and other international organizations, i.e. European co-operation for Accreditation (EA), International Laboratory Accreditation Cooperation (ILAC) and International Accreditation Forum Inc. (IAF).

Why is Beamex better?

Accuracy assured – Accuracy is assured when you decide to purchase a Beamex® calibrator. They are all delivered with a traceable, accredited calibration certificate.

Integrated calibration solutions – Beamex calibrators, workstations, calibration software and professional services form an integrated, automated system.

Industry pioneer with global presence – A forerunner in developing high-quality calibration equipment and software.

High customer satisfaction – Constantly improving understanding of customer needs and developing solutions to meet them.

Support – Installation, training, validation, system integration, database conversion, Help Desk and re-calibration services available.
