Natural gas is a fossil energy source found beneath the earth's surface. Its main component is methane, but it can also contain non-hydrocarbon gases and natural gas liquids. Leaks are particularly dangerous because escaping gas can build up to an explosive concentration, and they are hazardous to both health and the environment. Leak detection is the process of determining whether a leak exists within a system. Many fluid leak detection mechanisms rely on observing volume changes and physical evidence of a leak, which may take hours, days, or even weeks or months to appear. This is a concern in gas plants, where a nearby leak can cause environmental pollution and pose health hazards to personnel in the vicinity. Late detection also delays mitigation and has resulted in economic losses.
This study applies a machine learning technique to develop an algorithm that can detect gas leaks in real time, where the only possible delay is the lag time between the inlet gauges at the upstream valve and the outlet gauge at the downstream valve. A dataset was collected for the study, and a pre-processing phase was performed, which included cleaning the data and fitting a linear regression model as well as other regressions (Random Forest), as sketched below.
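A pre-processing and regression step of this kind could be sketched in Python roughly as follows; the file name and the column names are hypothetical placeholders, not the study's actual dataset.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor

# Hypothetical gauge dataset; the file and column names are placeholders.
df = pd.read_csv("gauge_readings.csv")

# Basic cleaning: drop rows with missing gauge readings.
df = df.dropna(subset=["inlet_pressure", "outlet_pressure"])

X = df[["inlet_pressure"]]   # upstream gauge reading
y = df["outlet_pressure"]    # downstream gauge reading

# Baseline linear model and a Random Forest alternative, as in the study.
linear_model = LinearRegression().fit(X, y)
forest_model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
```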
The study also proposed a model and evaluated its performance. In this case study of the JK-52 gas processing plant, the difference in pressure gauge readings was calibrated against the volume of gas in the inlet section to quantify the leak volume. Because gaseous fluids give no physical indication of volume, a pressure-based method was used for detection: a drop in gauge pressure due to depressurisation indicates leakage in the absence of a recorded gas supply or collection.
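As an illustration of this calibration idea (not the study's exact procedure), the ideal-gas law at roughly constant temperature implies that the fraction of gas lost from a fixed inlet section equals the fractional drop in absolute pressure; the values below are made up.

```python
def leak_volume_estimate(p_before, p_after, section_volume):
    """Estimate the leaked gas volume (measured at the pre-leak pressure)
    from a fixed inlet section, assuming ideal-gas behaviour at constant
    temperature. Strictly, absolute (not gauge) pressures should be used."""
    # At constant volume and temperature, moles scale with pressure, so the
    # fraction of gas lost is (p_before - p_after) / p_before.
    fraction_lost = (p_before - p_after) / p_before
    return fraction_lost * section_volume

# Hypothetical example: a 50 -> 47 bar drop in a 120 m^3 inlet section.
print(leak_volume_estimate(50.0, 47.0, 120.0))  # ~7.2 m^3 at 50 bar
```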
To build the model, the dataset was split into training and testing samples. The programming was done in Python, using the Jupyter and PyCharm Integrated Development Environments (IDEs). The machine learning algorithm analyses the incoming streaming pressure-versus-time data from the gauges during the residual and ramp-up flow phases to set the acceptable pressure-difference cut-off. A small difference between gauge readings may be normal within an acceptable error margin; the variability of readings within this acceptable window defines the tolerance. The system is set up to sound an alarm when leakage is detected by the machine-aided process, based on the cut-off or tolerance. Even if no immediate event triggers the alarm, a leak can still be suspected and later confirmed through further analysis. Over time, the model becomes predictive, improving detection accuracy as it learns from experience.
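The sketch below is a minimal, self-contained illustration of that cut-off-based alarm logic; the synthetic data, the Random Forest model, and the 3-sigma tolerance are assumptions for demonstration, not the study's actual implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in for historical gauge data: in normal operation the
# outlet gauge reads slightly below the inlet gauge, plus sensor noise.
rng = np.random.default_rng(0)
inlet = rng.uniform(40.0, 60.0, size=(500, 1))
outlet = inlet[:, 0] - 0.5 + rng.normal(0.0, 0.1, size=500)

# Split into training and testing samples, as described above.
X_train, X_test, y_train, y_test = train_test_split(inlet, outlet, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)

# Tolerance: how much the residual normally varies on held-out data
# (an illustrative 3-sigma cut-off).
residuals = y_test - model.predict(X_test)
tolerance = 3.0 * residuals.std()

def check_reading(inlet_p, outlet_p):
    """Sound an alarm when the outlet reading falls below the model's
    expectation by more than the tolerance."""
    expected = model.predict([[inlet_p]])[0]
    if expected - outlet_p > tolerance:
        print(f"ALARM: possible leak (outlet {outlet_p}, expected {expected:.2f})")

check_reading(50.0, 47.0)  # a ~2.5-unit shortfall vs expectation -> alarm
```

In a deployment, the printed message would be replaced by whatever alerting mechanism the plant's control system uses.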
Author(s) Details
Godsday Idanegbe Usiabulu
World Bank, Africa Center of Excellence, Center for Oil Field Chemicals
Research, University of Port Harcourt, Choba, Rivers State, Nigeria.
Eddy Ifeanyi Okoh
FHN 26 Limited (First Hydrocarbon), Block W, Shell Estate, Edjeba, Warri, Delta State, Nigeria.
Lucia Ndidi Okoh
Environmental Management and Toxicology, Southern Delta University, Delta
State, Nigeria.
Please see the book here: https://doi.org/10.9734/bpi/ccert/v1/7080