The Leak Rate Test “Bubble”: Choosing the Right Technology

August 31, 2016 | By CTS Admin
The ol’ “bubble test” is the simplest and, often, the cheapest form of leak rate testing. Seal the test part, attach a shop air hose to it, dunk it in a tank of water for a few seconds, and look for bubbles. We’ve all done it, and chances are good you’ve gotten a good splashing at least once when a major leak erupted underwater. However, there comes a time when giving your operators rubber gloves and a dunk tank is no longer sufficient for your leak testing needs. This is when process improvement decisions must be made, and an investment made in leak testing technology that can test production parts without relying on an operator’s vision and attentiveness.

Leak Rate: “Everything Leaks”

To choose the right technology for your leak testing needs, it’s important to understand the choices available on the market, the cost of each, their complexity, and, most importantly, what type of leak, if any, your part can experience while continuing to function correctly. There’s a saying in the leak testing industry: “Everything leaks, so how much leakage is acceptable?” Acceptable leak rates are identified to set limits for evaluating production parts’ leakage. Bubbles emitted by a dunked part can be correlated to a leak rate as a volumetric measurement of air loss over a set period of time. The volumetric measurement is identified in standard cubic centimeters per second or per minute, depending on the level of allowable leak. This level also dictates which technology should be selected for measuring said leaks.

The technologies available for measuring leaks range from simple to complex. Moving past the simplistic approach of dunk testing and complete operator dependency, we find technology on the moderate side of operator-independence that is used for pressure measurements in pressure decay, vacuum decay, and mass flow testing. From there, there are far more complex, fully operator-independent techniques, such as tracer gas measurement technologies that use sniffing, accumulation, and hard vacuum mass spectrometry.

The key factor in choosing the right technology for repeatable leak test measurement is the acceptable leak rate. Many factors can be considered in establishing an acceptable leak rate, but it will chiefly depend on what material shouldn’t leak into or out of the part. When testing for liquid or gaseous leakage, the viscosity of the material can be taken into account. For example, air will travel through a leak path 80 to 300 times faster than water will travel through the same path. (Flow is dependent upon the length and internal contours of the path.)
Eventually, a leak path will become small enough that water molecules will clog the path, but air will still be able to flow through. Leak rates at which water will not pass are typically in the range of 1-5 SCC/min (standard cubic centimeters per minute) of air flow, and sometimes as high as 10 SCC/min. These values depend on the characteristics of the leak path and the pressure. As the leak path becomes smaller, air flow is reduced and will transition from viscous flow to molecular flow. Another important consideration when selecting the right technology is that the measurement device should supply 100 times greater resolution than the leak rate being measured.
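The 100:1 resolution rule of thumb above can be expressed as a quick check. This is a minimal sketch, not any vendor's sizing method; the function name and values are illustrative assumptions.

```python
# Hypothetical helper illustrating the 100:1 rule of thumb: the
# instrument should resolve leaks 100x smaller than the acceptable
# leak rate being measured.

def resolution_ok(acceptable_leak_scc_min: float,
                  instrument_resolution_scc_min: float,
                  factor: float = 100.0) -> bool:
    """Return True if the instrument resolves at least `factor`
    times finer than the acceptable leak rate."""
    return instrument_resolution_scc_min <= acceptable_leak_scc_min / factor

# Example: a 1 SCC/min limit calls for roughly 0.01 SCC/min resolution.
print(resolution_ok(1.0, 0.005))  # True: 0.005 SCC/min is fine enough
print(resolution_ok(1.0, 0.05))   # False: 0.05 SCC/min is too coarse
```
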

Pressure-Based Technologies

The three primary pressure-based test methods are pressure decay, vacuum decay, and mass flow. Each of these methods uses pressurized air as its medium and produces measurements that are correlated to a volumetric flow of air as the tested part leaks.

Pressure decay and vacuum decay measurements are based on the same basic technique. The test part is pressurized (or evacuated) to the test pressure by a regulator. Once the prescribed test pressure is reached, the regulated air flow stops and the pressure is isolated in the part. A stabilization time allows pressurization characteristics to equalize, and the pressure drop is then measured over a set amount of time. To achieve an accurate volumetric flow measurement, this technology is calibrated against a certified flow rate orifice.

Mass flow measurements are based on pressurizing a part from a regulated source, maintaining pressure on the part, and measuring the amount of air flowing through a mass flow sensor into the part as the part loses air volume.

With these technologies, resolution and repeatability are based on the pressure differential caused by the leak. This measurement dictates the minimum resolution of the technology, and the biggest noise-generating culprit is temperature change, either in the part itself or in the pneumatic measuring circuit. Additional factors that can affect test resolution include sensor repeatability, regulation repeatability, the calibration of the system, and the volume of the part and the pneumatic circuit.
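The correlation from a pressure drop to a volumetric leak rate can be sketched with the ideal gas relationship. This is a simplified illustration under assumed conditions (constant temperature, known total test volume); it is not the calibrated algorithm a commercial instrument uses.

```python
# Simplified sketch: converting a pressure-decay reading into a
# volumetric leak rate, assuming ideal gas behavior, constant
# temperature, and a known total test volume (part + pneumatic circuit).

P_ATM_KPA = 101.325  # standard atmospheric pressure, kPa

def decay_to_leak_rate(test_volume_cc: float,
                       pressure_drop_kpa: float,
                       test_time_s: float) -> float:
    """Return leak rate in SCC/min from a measured pressure drop."""
    # Volume of air (at standard pressure) lost from the test volume:
    lost_std_volume_cc = test_volume_cc * pressure_drop_kpa / P_ATM_KPA
    # Spread that loss over the test time, converted to minutes:
    return lost_std_volume_cc / (test_time_s / 60.0)

# Example: a 250 cc test volume dropping 0.2 kPa over a 10 s test
print(round(decay_to_leak_rate(250.0, 0.2, 10.0), 2))  # → 2.96 SCC/min
```

In practice this correlation is anchored by calibrating against a certified flow rate orifice, as described above, rather than computed from first principles alone.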

Repeatable Flow Rates

With these leak testing technologies, repeatable flow rate measurement depends on test pressure and part characteristics. In general terms, testing capabilities for mass flow technology can be as high as hundreds of SL/min or as low as 0.5 SCC/min, depending on the range of the flow meter used for the test. Test capabilities for pressure and vacuum decay technology range from 0.5 SCC/min up to thousands of SCC/min. Either technology can go as low as 0.1 SCC/min, but both are highly dependent on the temperature stability of the test, test part characteristics, part size (a smaller volume makes it easier to achieve accurate results), the type of test system regulation, and flow rate calibration.

In instances where pressure-based technologies cannot deliver repeatable leak testing, or where the leak rate is below 1.0 SCC/min, tracer gas technologies must be used to achieve low-level leak measurements. To test leak rates well below 1.0 SCC/min, tracer gas technologies use a gas that is detected by the device as it leaks from the part. The most common tracer gas devices are helium mass spectrometers; hydrogen detection devices and residual gas analyzer systems are less common but still frequently used.

Helium is the most commonly used tracer gas. When using helium, it is possible to control the measurement sensitivity by controlling the percentage of helium in the mixture used to pressurize the part, anywhere from five to 100 percent. Hydrogen, when used as a tracer gas, is usually mixed at five percent with nitrogen, due to hydrogen’s flammability at higher percentages. Mixture percentages are important when addressing the sensitivity of tracer gas technology and the leak rate levels that need to be detected. When higher leak rates are being tested, low-percentage tracer gas mixtures can be used to prevent filling the test area with large amounts of tracer gas.
At the other end of the spectrum, when testing for low leak rates, higher percentages of tracer gases allow greater testing sensitivity.
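The relationship between mixture percentage and sensitivity follows from the fact that only the tracer fraction of the escaping gas is detected. The sketch below illustrates that scaling; the function name and detector limit are illustrative assumptions, not specifications of any particular instrument.

```python
# Illustrative sketch: only the tracer fraction of the escaping gas
# mixture reaches the detector, so the smallest detectable total leak
# scales inversely with the tracer percentage in the charge gas.

def effective_sensitivity(detector_limit_scc_s: float,
                          tracer_fraction: float) -> float:
    """Smallest total leak rate (SCC/s) detectable when the part is
    charged with the given tracer fraction (0.0 to 1.0)."""
    return detector_limit_scc_s / tracer_fraction

# A detector limited to 1e-5 SCC/s of tracer gas:
# with 100% helium it can resolve a 1e-5 SCC/s total leak,
# but with a 5% mixture only a 2e-4 SCC/s total leak.
print(effective_sensitivity(1e-5, 1.00))
print(effective_sensitivity(1e-5, 0.05))
```

This is why higher tracer percentages are reserved for low leak rates, while dilute mixtures suffice (and keep tracer gas background down) when the allowable leak is large.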

Tracer Gas Testing: Detecting the Leak Rate

There are two “levels” of tracer gas testing. Level One measures a tracer gas leaking into the ambient atmosphere; Level Two measures a tracer gas leaking into a vacuum atmosphere.

The most basic form of tracer gas testing involves pressurizing the part with a gas mixture and passing a handheld sniffing wand over the tested part. As the part leaks, the gas is pulled into the wand’s sensor. This method is an attribute test used to find a leak’s location. Test sensitivity is based on how fast the operator moves the wand across the leak; typical sensitivity with both helium and hydrogen is in the range of 1 × 10⁻³ SCC/sec. The accuracy of this style of testing is highly dependent on the operator.

For a more repeatable leak test using tracer gases, the part must be placed into a test chamber, sealed, and subjected to a vacuum, which removes atmospheric air. Removing atmospheric air is critical to ensuring all areas of the part receive the tracer gas mixture when the part is charged. With proper evacuation and tracer gas application, a leak anywhere in the part will emit the gas mixture, instead of just atmospheric air. The closed test chamber is held at atmospheric pressure with a baseline tracer gas concentration (in PPM). As the test part leaks, the percentage of tracer gas in the chamber increases over time. The rising tracer gas concentration is measured by the tracer gas detection device and correlated to a leak rate. Typical sensitivity for this type of test is 1 × 10⁻⁵ SCC/sec, dependent upon the size of the free air space in the test chamber, the type of tracer gas used, the percentage of tracer gas in the mixture, and background tracer gas levels. Because helium naturally occurs in Earth’s atmosphere at a concentration of about five parts per million, lower leak rates can only be accurately measured by removing background gases from the test chamber.
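The accumulation measurement described above can be sketched as a concentration-rise calculation: a PPM increase in a chamber of known free volume corresponds to a volume of leaked tracer gas. This is a simplified illustration with assumed names and values, not a commercial instrument's correlation.

```python
# Sketch of an accumulation-style measurement: a rise in tracer gas
# concentration (ppm) inside a sealed chamber of known free air volume
# is converted to a volumetric leak rate.

def accumulation_leak_rate(ppm_rise: float,
                           free_volume_cc: float,
                           test_time_s: float) -> float:
    """Return leak rate in SCC/s from a measured concentration rise."""
    # Volume of tracer gas that leaked into the chamber's free space:
    leaked_cc = ppm_rise * 1e-6 * free_volume_cc
    return leaked_cc / test_time_s

# Example: a 2 ppm rise in a 500 cc free volume over a 30 s test
rate = accumulation_leak_rate(2.0, 500.0, 30.0)
print(f"{rate:.2e} SCC/s")  # → 3.33e-05 SCC/s
```

Note how the helium background of roughly five PPM in ambient air sets a floor on how small a `ppm_rise` can be resolved, which is why still lower leak rates require evacuating the chamber.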
To measure smaller leak rates, the part chamber is evacuated to low vacuum levels, and a helium mass spectrometer is used to measure any helium molecules the part leaks into the test chamber. The low leak rate testing range for this technology is roughly 1 × 10⁻⁸ CC/sec to 1 × 10⁻⁹ CC/sec.

Transitioning from bubble or dunk testing to measurable leak testing techniques provides the ability to test parts within design parameters and within user-defined limits. Applying the correct testing technique at one hundred times the required leak rate resolution also allows for repeatable and operator-independent test measurements.

Leak testing is made easier and more accurate thanks to the proprietary test algorithms utilized by Cincinnati Test Systems’ Sentinel™ I-24 precision leak test instrument. The Sentinel I-24 is the fastest, most repeatable, and most accurate leak test instrument on the market. Operators can program up to five discrete tool motions from the I-24 menu, and the time-saving Auto Setup feature makes startup a snap.

Contact Cincinnati Test Systems today to learn more about tracer gas leak testing with the Sentinel I-24. 

Request a Quote