Effect of System Variables on the Uncertainty of the Mass Point Leak Rate Methodology Using First-Order Regression
Advances in nondestructive testing for leak rate characterisation are rare because the procedures are well accepted; however, for many critical applications (e.g., manned spacecraft and the nuclear power industry), an understanding of the accuracy and reliability of a reported leak rate is as important as the leak rate itself. Although the mass point leak rate method is known to achieve high accuracy, the measurement uncertainty and limitations of the method are less well understood. Using a least-squares regression on a mass–time population and a statistical analysis of the regression uncertainty, this study investigated the influence of (1) differential pressure, (2) steady-state and transient temperature, (3) volume size, (4) gas type, (5) sampling time duration, and (6) sampling interval on the reported mass point leak rate value. The analyses accounted for all significant sources of error. The method was experimentally evaluated and validated on a capillary-type, National Institute of Standards and Technology traceable leak standard and, where appropriate, simultaneous measurements were taken with a helium leak detector for comparison. The mass point technique was shown to provide high-accuracy results for various gases and volume sizes. Across the range of temperatures and pressures, the measured uncertainties of the mass point technique were between ±0.34% and ±0.72%. When the temperature of the test section was varied ±5°C during the experiment, the mass point results deviated 2–4% from those of the standard.
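The core of the mass point method described above is a first-order least-squares regression on a mass–time population, with the leak rate taken as the slope and the regression statistics supplying its uncertainty. The following is a minimal sketch of that regression step only, not the authors' full analysis; the function name and the synthetic data are illustrative, and it assumes the mass values have already been computed upstream (e.g., from pressure, temperature, and volume measurements).

```python
import numpy as np

def mass_point_leak_rate(t, m):
    """First-order least-squares fit m = a + b*t on a mass-time population.

    Returns (leak_rate, se) where leak_rate is the mass loss per unit
    time (the negative of the fitted slope) and se is the standard
    error of the slope estimated from the regression residuals.
    Illustrative sketch only; not the published analysis.
    """
    t = np.asarray(t, dtype=float)
    m = np.asarray(m, dtype=float)
    n = t.size
    t_mean, m_mean = t.mean(), m.mean()
    sxx = ((t - t_mean) ** 2).sum()
    # Least-squares slope and intercept
    b = ((t - t_mean) * (m - m_mean)).sum() / sxx
    a = m_mean - b * t_mean
    # Residual variance with n - 2 degrees of freedom
    resid = m - (a + b * t)
    s2 = (resid ** 2).sum() / (n - 2)
    se = np.sqrt(s2 / sxx)  # standard error of the slope
    return -b, se

# Illustrative synthetic data: 5 g charge leaking at 0.002 g/s
t = np.linspace(0.0, 100.0, 11)
m = 5.0 - 0.002 * t
rate, se = mass_point_leak_rate(t, m)
```

In practice the reported relative uncertainty (e.g., the ±0.34% to ±0.72% range above) would combine this regression uncertainty with the instrument and property uncertainties propagated into each mass value.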
Nondestructive Testing and Evaluation
Daniels, Christopher C. and Garafolo, Nicholas G., "Effect of System Variables on the Uncertainty of the Mass Point Leak Rate Methodology Using First-Order Regression" (2013). Mechanical Engineering Faculty Research. 176.