This paper focuses on the mathematical modelling required to support the development of new primary standard systems for traceable calibration of dynamic pressure sensors. We address two fundamentally different approaches to realising primary standards: the shock tube method and the drop-weight method. Focusing on the shock tube method, the paper presents the first results of system identification and discusses the future experimental work required to improve the mathematical and statistical models. We use simulations to identify differences between the shock tube and drop-weight methods, to investigate sources of uncertainty in the system identification process, and to assist experimentalists in designing the required measuring systems. We demonstrate the identification method on experimental results and draw conclusions.
We report on a system of well-characterized source masses and their precision positioning system for a measurement of the Newtonian gravitational constant G using atoms as probes. The masses are 24 cylinders of 50 mm nominal radius, 150.2 mm nominal height, and mass of about 21.5 kg, sintered starting from a mixture of 95.3% W, 3.2% Ni, and 1.5% Cu. Density homogeneity and cylindrical geometry have been carefully investigated. The positioning system independently moves two groups of 12 cylinders along the vertical direction by tens of centimeters with a reproducibility of a few microns. The whole system is compatible with a resolution ΔG/G < 10−4.
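The quoted dimensions and mass can be checked for internal consistency: a minimal sketch, using only the nominal values from the abstract, confirms that they imply a density typical of a sintered tungsten-nickel-copper alloy.

```python
import math

# Nominal values quoted in the abstract for one source-mass cylinder.
radius_m = 50e-3     # 50 mm nominal radius
height_m = 150.2e-3  # 150.2 mm nominal height
mass_kg = 21.5       # about 21.5 kg

volume_m3 = math.pi * radius_m**2 * height_m
density = mass_kg / volume_m3  # kg/m^3

# ~1.8e4 kg/m^3, as expected for a 95.3% W / 3.2% Ni / 1.5% Cu alloy.
print(f"implied density = {density:.0f} kg/m^3")
```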
An algorithm able to deal with any desired fitting model was developed for regression problems with uncertain and correlated variables. A typical application concerns the determination of calibration curves, especially (i) in those cases in which the uncertainties on the independent variables xi cannot be considered negligible with respect to those associated with the dependent variables yi, and (ii) when correlations exist among xi and yi. In the metrological field, several types of software have already been dedicated to the determination of calibration curves, some focused just on problem (i) and a few others considering also problem (ii), but only for a straight-line fitting model. The proposed algorithm deals with problems (i) and (ii) at the same time, for a generic fitting model. The tool was developed in the MATLAB® environment and validated on several benchmark data sets, fitted with linear and non-linear regression models. A review of the most commonly applied approximations to the parameter uncertainty is also presented, together with a Monte Carlo method proposed for comparison with the results of the uncertainty-evaluation formula implemented in the software.
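The Monte Carlo comparison mentioned above can be sketched in a few lines: perturb the observed data according to the stated covariance of each (xi, yi) pair, refit each draw, and take the spread of the fitted parameters as their uncertainty. This is a generic illustration with hypothetical data and a straight-line model, not the paper's MATLAB implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical calibration data: a true straight line y = 2x + 1.
x_true = np.linspace(0.0, 10.0, 11)
y_true = 2.0 * x_true + 1.0

# Standard uncertainties on both variables, correlated within each
# (x_i, y_i) pair -- the situation the abstract calls problems (i) and (ii).
ux, uy, r = 0.05, 0.10, 0.5
cov = np.array([[ux**2, r * ux * uy],
                [r * ux * uy, uy**2]])

# One noisy realisation plays the role of the observed data.
noise = rng.multivariate_normal([0.0, 0.0], cov, size=x_true.size)
x_obs, y_obs = x_true + noise[:, 0], y_true + noise[:, 1]

# Monte Carlo propagation: redraw correlated perturbations, refit, and
# summarise the distribution of the fitted parameters.
n_draws = 2000
params = np.empty((n_draws, 2))
for k in range(n_draws):
    d = rng.multivariate_normal([0.0, 0.0], cov, size=x_true.size)
    params[k] = np.polyfit(x_obs + d[:, 0], y_obs + d[:, 1], deg=1)

slope, intercept = params.mean(axis=0)
u_slope, u_intercept = params.std(axis=0, ddof=1)
print(f"slope = {slope:.3f} +/- {u_slope:.3f}")
print(f"intercept = {intercept:.3f} +/- {u_intercept:.3f}")
```

The same loop works unchanged for any fitting model: only the call to `np.polyfit` needs to be swapped for the model of interest.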
After a brief description of the different methods employed in periodic calibration of hydrometers, used in most cases to measure the density of liquids in the range between 500 kg m−3 and 2000 kg m−3, particular emphasis is given to the multipoint procedure based on hydrostatic weighing, also known as Cuckow's method. The features of the calibration apparatus and the procedure used at the INRiM (formerly IMGC-CNR) density laboratory have been considered to assess all relevant contributions involved in the calibration of different kinds of hydrometers. The uncertainty depends strongly on the kind of hydrometer; in particular, the results highlight the importance of the density of the reference buoyant liquid, the temperature of calibration and the skill of the operator in reading the scale in the overall assessment of the uncertainty. It is also notable that for high-resolution hydrometers (division of 0.1 kg m−3), the uncertainty contribution of the density of the reference liquid is the main source of the total uncertainty, but its importance falls to about 50% for hydrometers with a division of 0.5 kg m−3 and becomes somewhat negligible for hydrometers with a division of 1 kg m−3, for which the reading uncertainty is the predominant part of the total uncertainty. At present the best INRiM result is obtained with commercially available hydrometers having a scale division of 0.1 kg m−3, for which the relative uncertainty is about 12 × 10−6.
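The shift in dominant contribution with scale division can be illustrated by a root-sum-of-squares combination of the three contributions the abstract names. The numeric values below are hypothetical placeholders chosen only to reproduce the qualitative pattern described, not INRiM's actual uncertainty budget.

```python
import math

# Assumed, illustrative contributions (all in kg/m^3):
u_liquid = 0.05   # reference-liquid density (hypothetical value)
u_temp = 0.005    # calibration temperature (hypothetical value)

shares = {}
for division in (0.1, 0.5, 1.0):   # hydrometer scale division, kg/m^3
    u_read = division / 10         # assumed reading-uncertainty model
    u_total = math.sqrt(u_liquid**2 + u_temp**2 + u_read**2)
    # Variance share of the reference-liquid contribution in the total.
    shares[division] = (u_liquid / u_total) ** 2
    print(f"division {division}: u_total = {u_total:.3f} kg/m^3, "
          f"liquid-density share = {shares[division]:.0%}")
```

With these placeholder inputs the liquid-density term dominates at a 0.1 kg m−3 division, contributes roughly half the variance at 0.5 kg m−3, and is overtaken by the reading uncertainty at 1 kg m−3, matching the trend reported in the abstract.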
The redefinition of the kilogram, along with three other base SI units, is scheduled for 2018. The current definition of the SI unit of mass assigns a mass of exactly one kilogram to the International Prototype of the kilogram, which is maintained in air and from which the unit is disseminated. The new definition, which will be based on the Planck constant, involves the realisation of the mass unit in vacuum by the watt balance or Avogadro experiments. Thus, for the effective dissemination of the mass unit from the primary realisation experiments to end users, traceability of mass standards transferred between vacuum and air needs to be established and the associated uncertainties well understood. This paper describes means of achieving the link between a unit realised in vacuum and standards used in air, and the ways in which their use can be optimised. It also investigates the likely uncertainty contribution introduced by the vacuum-air transfer process.

1 See Section 2 of the draft mise en pratique of the definition of the kilogram http://www.bipm.org/cc/CCM/Allowed/15/02A_MeP_kg_141022_v-9.0_clean.pdf for details of the primary methods to realize the definition of the kilogram.