Deploying data to monitor aging pipelines
August 06, 2020
Data management is at the core of effectively monitoring aging pipeline infrastructure for Energy & Resources projects
This article first appeared as “Is data the answer to pipeline maintenance?” in Stantec ERA, Issue 03.
Pipelines are predominantly how the Oil & Gas industry moves its product. In North America, millions of miles of pipeline infrastructure are used every day to safely and efficiently move energy and raw materials from production areas or ports of entry throughout the continent to consumers, airports, military bases, communities, and other industry partners.
While pipelines remain the safest way to move our energy products, there are still risks associated with the transport process—primarily oil and gas leaks. When a leak occurs, it can have severe negative environmental, social, and economic impacts for pipeline operators and their surrounding communities.
To avoid these impacts, it is essential that we safeguard pipelines from the risks associated with a leak. In order to maintain the highest standard of safety and security for pipeline infrastructure, we need to know that we are monitoring each line effectively. But at almost 3 million miles of gathering, transmission, and distribution pipelines in North America, how can we successfully manage them all?
By deploying a pipeline integrity data management solution.
Data is increasingly seen as an asset. It can promote better business decisions, optimize operations and maintenance, and reduce overall costs. We believe in managing our clients’ key pipeline database projects from the design and planning phase through operational implementation and monitoring. The goal of these databases? To improve all aspects of pipeline safety, from environmental impacts and leak prevention to resource prioritization and loss prevention.
What we’ve found is that strategic data management is key to effectively monitoring aging pipeline infrastructure. But, the successful implementation of a data management solution requires a deep understanding of the data and its functionality.
By combining expertise from a variety of data-driven fields, our team offers a complete solution, analyzing the data management structure objectively through multiple specialized lenses, including pipeline integrity, engineering, geomatics, and information technology services. This advances our goal for spatial data management: organizing spatial data from multiple sources and providing a full picture across functional work groups.
Our experience has demonstrated that aligning three key principles when managing pipeline integrity data significantly enhances both the quality of the information and how it is put to work. Those principles are explored in the sections that follow.
When it comes to designing and operating smart pipeline infrastructure, all effective decisions must be based on a foundation of accurate and reliable data. This foundation of data is established by deploying a wide range of tools that can acquire, transform, validate, store, analyze, distribute, and maintain data of all forms and sources.
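As a simplified illustration of that acquire, validate, and store flow, the sketch below (written in Python) reads a hypothetical CSV of inline-inspection records, rejects incomplete or implausible rows, and keeps the rest in a local database. The file layout, field names, and validation thresholds are placeholders for illustration, not any operator’s actual data model.

```python
# Minimal sketch of an acquire -> validate -> store step for inspection records.
# Field names (weld_id, wall_loss_pct, latitude, longitude) are hypothetical;
# a real integrity database would follow the operator's own data model.
import csv
import sqlite3

def load_inspection_records(csv_path: str, db_path: str) -> int:
    """Read inline-inspection rows, keep only records that pass basic checks,
    and store them in a local SQLite table. Returns the number stored."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS ili_features (
               weld_id TEXT, wall_loss_pct REAL, latitude REAL, longitude REAL)"""
    )
    stored = 0
    with open(csv_path, newline="") as fh:
        for row in csv.DictReader(fh):
            try:
                wall_loss = float(row["wall_loss_pct"])
                lat, lon = float(row["latitude"]), float(row["longitude"])
            except (KeyError, ValueError):
                continue  # reject incomplete or malformed records
            # Simple validation: plausible coordinates and wall-loss range.
            if not (0.0 <= wall_loss <= 100.0 and -90 <= lat <= 90 and -180 <= lon <= 180):
                continue
            conn.execute(
                "INSERT INTO ili_features VALUES (?, ?, ?, ?)",
                (row.get("weld_id", ""), wall_loss, lat, lon),
            )
            stored += 1
    conn.commit()
    conn.close()
    return stored
```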
We take our foundation of data and combine it with real-time inputs like operations, land management, and inspection data. This approach provides end users with the most up-to-date information, and this centralized source offers vital elements that help businesses reduce risks and costs.
Integrated data gives a wider audience a more detailed picture of the “what” and supports a better decision-making process. How? By promoting extensive collaboration between functional work groups and making it easier for them to access the same information. Ultimately, this leads to faster, more robust decisions.
Much of today’s data is based upon spatial information. Many applications tailor information to you based on your location and proximity to other elements. It’s all about the “where,” and that is fundamentally governed by geomatics.
The same spatial data used in smartphones and other devices is now helping the Oil & Gas industry. From tracking nearby construction projects to managing vegetation growth in pipeline rights-of-way, the where often provides the linkage between unique datasets that can exist in different operational work groups. This generates a bigger picture and compounds the knowledge base.
Implementing spatial data—or the where—also offers lifecycle efficiencies regarding pipeline-threat detection, response planning, active maintenance, and monitoring. Our data management solutions can provide our clients with the what and the where faster than other conventional methods.
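As a simplified illustration of that kind of location-based linkage, the sketch below flags third-party construction sites that fall within a set distance of pipeline centerline points. The datasets, coordinates, and 100-metre threshold are hypothetical, and a production system would rely on a proper GIS rather than a hand-rolled distance check.

```python
# Minimal sketch of a "where"-based linkage: flag third-party construction sites
# within a set distance of pipeline centerline points. The datasets and the
# 100 m threshold are illustrative, not an operator standard.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def sites_near_pipeline(centerline_pts, construction_sites, threshold_m=100.0):
    """Return construction sites within threshold_m of any centerline point."""
    nearby = []
    for site in construction_sites:
        if any(haversine_m(site["lat"], site["lon"], lat, lon) <= threshold_m
               for lat, lon in centerline_pts):
            nearby.append(site)
    return nearby

# Example: one permit sits roughly 60 m from the line, the other several kilometres away.
centerline = [(53.5461, -113.4938), (53.5470, -113.4950)]
permits = [
    {"permit_id": "C-101", "lat": 53.5466, "lon": -113.4940},
    {"permit_id": "C-102", "lat": 53.6000, "lon": -113.6000},
]
print(sites_near_pipeline(centerline, permits))  # -> the C-101 record only
```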
There are more than 3 million miles of pipeline infrastructure in North America, and it’s aging by the day. To stay ahead of the curve, operators must analyze large volumes of data to monitor pipelines, identify threats, and initiate corrective measures where needed. Powerful data-management solutions integrate these overlapping data elements and align them closely with essential pipeline integrity processes.
For example, consider a pipeline-integrity engineer who identifies an external corrosion feature on an active pipeline. By combining that finding with historical information, the engineer can pinpoint a breakdown in the pipeline coating. The geolocation data defines where the repair should proceed, but it can also be analyzed against crossing features and depth-of-cover information to provide further insight into site conditions when planning the excavation.
Why is this important? Because the intersection of data can determine if the repair is under a road or waterway, and how much excavation is required for a fix. It may also include right-of-way information to aid in notifications and crossing agreements. The deeper the data, the greater the analytical capabilities—saving significant time and money and proving pipeline integrity data management solutions are key to monitoring our aging infrastructure.
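To make that repair example concrete, the sketch below overlays a repair location, expressed as chainage along the line, with crossing and depth-of-cover records. The record layouts and values are invented for illustration; in practice this intersection would come straight from the integrity database.

```python
# Minimal sketch of overlaying a repair location with crossing and depth-of-cover
# records keyed by chainage (distance along the line, in metres). All values are
# illustrative; a real system would pull these from the integrity database.

def plan_excavation(repair_chainage_m, crossings, cover_records):
    """Summarize what a dig at this chainage will encounter."""
    plan = {"chainage_m": repair_chainage_m, "crossings": [], "depth_of_cover_m": None}
    for c in crossings:  # e.g. {"type": "road", "start_m": 1180, "end_m": 1210}
        if c["start_m"] <= repair_chainage_m <= c["end_m"]:
            plan["crossings"].append(c["type"])
    for rec in cover_records:  # e.g. {"start_m": 1000, "end_m": 1500, "cover_m": 1.2}
        if rec["start_m"] <= repair_chainage_m <= rec["end_m"]:
            plan["depth_of_cover_m"] = rec["cover_m"]
            break
    return plan

crossings = [{"type": "road", "start_m": 1180, "end_m": 1210},
             {"type": "waterway", "start_m": 2400, "end_m": 2460}]
cover = [{"start_m": 1000, "end_m": 1500, "cover_m": 1.2}]
print(plan_excavation(1195, crossings, cover))
# -> {'chainage_m': 1195, 'crossings': ['road'], 'depth_of_cover_m': 1.2}
```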
Data can change how Oil & Gas providers deliver their products. By instituting a pipeline data-management solution, operational teams have the integrated information required to actively detect and prevent threats, monitor operations, and conduct essential response planning.
By monitoring a pipeline’s health, an operator can take proactive steps to measure, evaluate, and manage risk. In general terms, risk sits at the confluence of likelihood and consequence, both of which are identified using large volumes of data.
The likelihood of failure is assessed by analyzing engineering and monitoring data alongside external threats such as terrain, crossings, and adjacent construction work. The results are then correlated with potential consequences, which draw on diverse information about population, terrain, environmental concerns, and historical geospatial data.
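As a highly simplified illustration of how likelihood and consequence might combine into a segment-level score, the sketch below weighs a few condition and exposure factors on a 0-to-1 scale. The factors and weightings are purely illustrative; every operator calibrates its own risk model against its own data.

```python
# Minimal sketch of a segment-level risk score as the product of likelihood and
# consequence factors. The factors, weightings, and 0-1 scales are illustrative,
# not a published risk model.

def risk_score(segment):
    """Likelihood x consequence, each expressed on a 0-1 scale."""
    likelihood = min(1.0, 0.5 * segment["corrosion_rate"]          # condition data
                          + 0.3 * segment["third_party_activity"]  # external threats
                          + 0.2 * segment["terrain_instability"])
    consequence = min(1.0, 0.6 * segment["population_density"]     # people nearby
                           + 0.4 * segment["environmental_sensitivity"])
    return likelihood * consequence

segment = {"corrosion_rate": 0.4, "third_party_activity": 0.7,
           "terrain_instability": 0.1, "population_density": 0.8,
           "environmental_sensitivity": 0.5}
print(round(risk_score(segment), 3))  # -> 0.292
```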
Pipeline operators are actively seeking new ways to minimize risk. Risk to operations, stakeholders, environment, reputation, and communities. Forward-looking analysis using diverse and aligned datasets is critical to developing safer—and more efficient—operations.
As our way of managing data continues to evolve, so does our ability to effectively monitor aging pipelines across North America.