Explaining PPM and %RH (Relative Humidity)
LogicOil offer a range of portable field lab models that do both particle counting and water measurement expressed as Parts Per Million (PPM). Because a PPM sensor must be calibrated on the specific oil or diesel fuel being tested, it will be inaccurate on any other oil or diesel blend. PPM sensors are also more expensive than %RH sensors.
There are two approaches to monitoring water content: PPM and RH (% Relative Humidity, or saturation). The drawback of PPM sensors is that they must be calibrated with the oil or diesel fuel to be tested. PPM sensors are also temperature compensated, as the amount of water a fuel or oil can hold changes dramatically with temperature.
Different fluids can hold different amounts of water. This is influenced by the manufacturer and the additive package (and, in the case of fuel, by biodiesel content), with saturation points spanning from as little as 200 PPM to as high as 1,000 PPM. PPM sensors need to be calibrated against the target oil or fuel across a temperature curve. This calibration process is where the cost is incurred; it also ties the calibration to that target oil or fuel.
An RH sensor actually utilises the same sensing element as a PPM sensor; however, it only reports the humidity level and does not need fluid-specific calibration, as a temperature-compensated algorithm is built in.
To put it another way, in a PPM sensor the relative humidity is correlated to the PPM value via a built-in look-up table, and these tables differ between fuels and oils. A fuel or oil's saturation point (100% humidity) is the point at which free water will begin to form. Users should take action if the RH is high, say above 60%, as free water may be starting to form.
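The relationship described above can be sketched in code. This is a minimal illustration only: the saturation table values below are hypothetical, and a real table would have to come from calibrating the sensor against the specific target oil or fuel across its temperature range.

```python
# Sketch: converting an RH reading to a PPM value via a fluid-specific
# saturation look-up table, as a PPM sensor does internally.
# The table values below are ILLUSTRATIVE ONLY, not calibration data.

from bisect import bisect_left

# (temperature in degC, saturation point in PPM) -- hypothetical values
SATURATION_TABLE = [(10, 150), (20, 220), (40, 450), (60, 820), (80, 1400)]

def saturation_ppm(temp_c: float) -> float:
    """Linearly interpolate the saturation point (100% RH) at temp_c."""
    temps = [t for t, _ in SATURATION_TABLE]
    if temp_c <= temps[0]:
        return float(SATURATION_TABLE[0][1])
    if temp_c >= temps[-1]:
        return float(SATURATION_TABLE[-1][1])
    i = bisect_left(temps, temp_c)
    (t0, s0), (t1, s1) = SATURATION_TABLE[i - 1], SATURATION_TABLE[i]
    return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)

def rh_to_ppm(rh_percent: float, temp_c: float) -> float:
    """Dissolved water in PPM as a fraction of saturation at this temperature."""
    return rh_percent / 100.0 * saturation_ppm(temp_c)
```

This also shows why temperature compensation matters: the same 50% RH reading maps to very different PPM values at 20 degC and at 60 degC, because the saturation point itself moves with temperature.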
It is possible for an operator to correlate PPM values against RH readings by measuring samples with a Karl Fischer titrator at different temperatures in order to create a look-up chart. This chart would only be accurate for the target oil or fuel.
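That correlation exercise can be sketched as follows. The paired readings below are hypothetical; in practice each row would be a sample measured at a controlled temperature, with the RH taken from the sensor and the PPM from Karl Fischer titration.

```python
# Sketch: deriving a fluid-specific saturation chart from paired
# RH-sensor and Karl Fischer readings at several temperatures.
# The readings below are HYPOTHETICAL, for illustration only.

# (temperature degC, RH sensor reading %, Karl Fischer result in PPM)
paired_readings = [
    (20, 45.0, 99.0),
    (40, 38.0, 171.0),
    (60, 30.0, 246.0),
]

# At each temperature the implied saturation point (100% RH) is
# KF_ppm / (RH / 100); collecting these gives the look-up chart.
lookup_chart = [(t, kf_ppm * 100.0 / rh) for t, rh, kf_ppm in paired_readings]

for temp_c, sat_ppm in lookup_chart:
    print(f"{temp_c:>3} degC -> saturation point ~ {sat_ppm:.0f} PPM")
```

With the chart built, any future RH reading at a known temperature can be converted to an approximate PPM figure for that one fluid, which is exactly why the result does not transfer to a different oil or diesel blend.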