Vacuum pressure standards of the orifice-flow type require known gas flows of 10⁻⁶ mol/s (10⁻² atm cm³/s at 0 °C) and less. Known gas flows can also be used to calibrate "standard" leaks by comparing the pressures generated when flows from the leak and the flowmeter are alternately passed through a constant conductance. Two constant-pressure, piston-displacement flowmeters developed at the National Bureau of Standards are described that can generate flows between 10⁻⁶ and 10⁻¹⁰ mol/s with an estimated uncertainty of 0.8% to 2%. Comparisons of the flowmeters with alternative calibration techniques, together with repeated low-range leak and vacuum-gauge calibrations, have been used to confirm the estimated uncertainty and random errors of the flowmeters.
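The equivalence quoted in the abstract between molar flow rate and throughput follows from the ideal gas law, q = ṅRT. A minimal sketch of that conversion (the function name and structure are illustrative, not part of the NBS flowmeter description):

```python
# Convert a molar gas flow rate (mol/s) to throughput (atm·cm³/s),
# assuming ideal gas behavior. Illustrative sketch only.

R = 8.314462618   # J/(mol·K), molar gas constant
T0 = 273.15       # K, 0 °C reference temperature
ATM = 101325.0    # Pa per standard atmosphere

def mol_per_s_to_atm_cc_per_s(n_dot, T=T0):
    """Throughput q = n_dot * R * T, expressed in atm·cm³/s."""
    q_pa_m3 = n_dot * R * T        # Pa·m³/s
    return q_pa_m3 / (ATM * 1e-6)  # 1 atm·cm³ = 0.101325 Pa·m³

# The abstract's example: 1e-6 mol/s at 0 °C
q = mol_per_s_to_atm_cc_per_s(1e-6)
print(f"{q:.3e} atm·cm³/s")  # ≈ 2.24e-02, i.e. roughly 1e-2 atm·cm³/s
```

The result, about 2.24 × 10⁻² atm cm³/s, is consistent with the abstract's rounded figure of 10⁻² atm cm³/s for a 10⁻⁶ mol/s flow.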
This document is the consensus view of the Calibrated Leak Subcommittee of the Recommended Practices Committee of the American Vacuum Society. It is divided into four main sections: Description, Calibration, Proper Usage, and Recommended Documentation of Leaks. Included in Sec. II are discussions of types of leaks, temperature effects, depletion rates, and units of leakage rate measurement. Section III addresses primary and secondary techniques for leak calibration, including uncertainties. Section IV addresses the proper handling and usage of leaks to achieve optimum results, recommendations on standardization of connections, and safety. The documentation to accompany and to be attached to each calibrated leak, recommended in Sec. V, is intended to provide the user with sufficient information about the leak for accurate and safe use. The appendices contain a glossary and a discussion of the use of throughput and flow rate units and conversions.
FIG. 1. Schematic diagram illustrating the major components of a leak. (1) leak element, (2) reservoir, (3) leak valve, (4) fill valve, (5) process connection.
The Guide to the Expression of Uncertainty in Measurement (GUM) provided for the first time an international consensus on how to approach the widespread difficulties associated with conveying information about how reliable the value resulting from a measurement is thought to be. This paper examines the evolution in thinking and its impact on the terminology that accompanied the development of the GUM. Particular emphasis is put on the very clear distinction in the GUM between measurement uncertainty and measurement error, and on the reasons that even though ‘true value’ and ‘error’ are considered in the GUM to be ‘unknowable’ and, sometimes by implication, of little (or even no) use in measurement analysis, they remain as key concepts, especially when considering the objective of measurement. While probability theory in measurement analysis from a frequentist perspective was in widespread use prior to the publication of the GUM, a key underpinning principle of the GUM was to instead consider probability as a ‘degree of belief.’ The terminological changes necessary to make this transition are also covered. Even twenty years after the publication of the GUM, the scientific and metrology literatures sometimes contain uncertainty analyses, or discussions of measurement uncertainty, that are not terminologically consistent with the GUM, leading to the inability of readers to fully understand what has been done and what is intended in the associated measurements. This paper concludes with a discussion of the importance of using proper methodology and terminology for reporting measurement results.
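As context for the GUM approach the abstract discusses, the central mechanical step is combining standard uncertainty components in quadrature and reporting an expanded uncertainty U = k·u_c. A minimal sketch with invented numbers (the values and variable names are illustrative assumptions, not taken from the paper):

```python
import math

# GUM-style budget for a result with two uncertainty components:
#   Type A: evaluated statistically (e.g. std. dev. of the mean)
#   Type B: evaluated by other means (e.g. a calibration certificate)
# Both are treated as standard uncertainties and combined in quadrature.

def combined_standard_uncertainty(components):
    """Root-sum-square of standard uncertainties (unit sensitivity assumed)."""
    return math.sqrt(sum(u**2 for u in components))

u_a = 0.012  # Type A component (arbitrary units, invented)
u_b = 0.005  # Type B component, e.g. a rectangular bound a/sqrt(3)

u_c = combined_standard_uncertainty([u_a, u_b])
U = 2 * u_c  # coverage factor k = 2, ~95 % coverage for a normal distribution
print(f"u_c = {u_c:.3f}, U(k=2) = {U:.3f}")  # u_c = 0.013, U = 0.026
```

Note that, in keeping with the GUM's "degree of belief" view, u_c characterizes the dispersion attributed to the measurand, not an unknowable error.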
There is a growing requirement for an internationally accepted system of recognition of measurement capabilities and relationships within and among countries, to facilitate seamless global commerce and trade. As a result, metrologists worldwide have recently developed increased interest in the concept and definition of traceability. Classically, traceability provides a way of relating the results of a measurement (or value of a standard) to higher level standards. Such standards are usually national or international standards, and the comparisons used to provide the traceability must have well-understood uncertainties. An additional complexity arises because all instruments and standards are subject to change, however slight, over time. This paper develops approaches for dealing with the effects of such time-dependent changes as a part of traceability statements. The use of metrological time-lines provides a means of effectively visualizing these relationships in a statement of traceability. When the rate of change in the measurement process is sufficiently small, the approach proposed here is less important. However, documented measurement assurance procedures are required at all levels so that appropriate uncertainties may be estimated with confidence. When laboratory or national boundaries are crossed in the traceability process, other factors come into play, and the original concept of traceability can become obscure. It is becoming common to hear the term “equivalence” used to describe these more complex measurement relationships.