Regulations for pipeline operators in the oil and gas pipeline industry are becoming increasingly rigorous, particularly in the areas of pipeline integrity and emergency response; as a result, the need for equally rigorous approaches to analyzing, understanding, managing, and reporting the effects of a pipeline release has increased. Trans-Northern Pipelines (TNPI) undertook a detailed fate and transport modeling project for its Ontario- and Quebec-based pipeline systems. These systems are relatively complex products-delivery systems that handle multiple fluids and, for some pipeline segments, allow flow in either direction. The goals of the project included understanding and improving risk management of the systems through consequence reduction, and identifying and quantifying potential impacts to surrounding areas and risk receptors within the vicinity of the pipeline systems. The receptors are the focal points that receive the negative impact if there is a leak (i.e., pin-hole, small, and large scenarios) or rupture of the pipeline; they can include health/safety, environmental, property damage, reputation and public disruption, and financial impacts. The outcomes of the project included a continuum of potential release volumes along the pipeline systems for each product and operational scenario, from which a spatial representation of the potential impact areas (due to overland flow) was derived, along with locations where potential release volumes (i.e., initial and drain-down volumes) could reach streams and be transported along them. Potential impact areas and representations of stream transport were used to identify possible risk receptors. A unique aspect of this modeling project was that it was undertaken using a four-dimensional (4D) model, allowing the results to be visualized within a GIS and as animations; both facilitated visualization of potential release impact over time (e.g., a 48-hour period). Utilizing the time domain provided unique insights that were used to augment TNPI's emergency response plan. Based on a review of the preliminary results, Trans-Northern undertook additional work to investigate what-if scenarios for valve placement. The purpose of these scenarios was to quantify the effect of additional valves on lowering the overall potential impact, providing a quantitative basis for valve placement.
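For illustration only, the following minimal sketch (in Python) estimates a drain-down volume at a single release point from a valve-bounded elevation profile. The line diameter, the profile data, and the simplifications (gravity drainage only; liquid trapped in sags, slack-line, and vacuum effects ignored) are assumptions for this example, not TNPI's actual 4D model.

import math

PIPE_DIAMETER_M = 0.273                        # assumed line size (NPS 10)
AREA_M2 = math.pi * (PIPE_DIAMETER_M / 2) ** 2 # internal cross-section, m^2

def drain_down_volume_m3(profile, release_km, upstream_valve_km, downstream_valve_km):
    """profile: sorted list of (chainage_km, elevation_m) points.
    First-order rule: any pipe between the isolating valves that lies
    above the release elevation is assumed to drain to the release point."""
    release_elev = _elevation_at(profile, release_km)
    volume = 0.0
    for (c0, z0), (c1, z1) in zip(profile, profile[1:]):
        # Only pipe between the two isolating valves can feed the release.
        if c1 <= upstream_valve_km or c0 >= downstream_valve_km:
            continue
        seg_len_m = (min(c1, downstream_valve_km) - max(c0, upstream_valve_km)) * 1000.0
        if (z0 + z1) / 2.0 > release_elev:
            volume += seg_len_m * AREA_M2
    return volume

def _elevation_at(profile, km):
    # Linear interpolation of elevation at a given chainage.
    for (c0, z0), (c1, z1) in zip(profile, profile[1:]):
        if c0 <= km <= c1:
            t = (km - c0) / (c1 - c0)
            return z0 + t * (z1 - z0)
    raise ValueError("chainage outside profile")

# Invented profile: release at a low point (km 4) with valves at km 0 and 6.
profile = [(0.0, 120.0), (2.0, 150.0), (4.0, 100.0), (6.0, 130.0)]
print(drain_down_volume_m3(profile, release_km=4.0,
                           upstream_valve_km=0.0, downstream_valve_km=6.0))

Rerunning such a calculation with candidate valves inserted into the profile is one way the what-if valve-placement comparison described above could be quantified.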
Class location analysis is a key component of an effective integrity management plan. Application of class location results varies based on the product transported, but regardless of product, the analysis carries audit, reporting, and annual update requirements. Process definition and reengineering as part of the Natural Gas Business Unit (BP Canada) pipeline integrity management system was a driving force behind implementation of an automated class location analysis. The results of the automation effort have been manifested as:
• Reduced data acquisition costs;
• Improved data workflow;
• Increased data reuse;
• Cost savings through personnel reduction; and
• Consistency in application of rules defined by the pipeline integrity management system.
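As a hedged illustration of what such an automation can look like, the Python sketch below applies the commonly cited sliding-window approach: dwellings within a lateral buffer of the centerline are counted over a moving window and the count is mapped to a class. The window length, buffer width, and thresholds follow 49 CFR 192.5-style values for illustration only; the BP Canada implementation and the applicable Canadian code (e.g., CSA Z662) may differ.

WINDOW_M = 1600.0   # sliding window, roughly one mile
BUFFER_M = 200.0    # lateral buffer, roughly 220 yd either side (applied upstream)

def class_for_count(n_dwellings):
    # Count-to-class thresholds following the 49 CFR 192.5 pattern (illustrative).
    if n_dwellings <= 10:
        return 1
    if n_dwellings <= 45:
        return 2
    return 3  # Class 4 requires additional criteria (multi-storey prevalence)

def class_location(dwelling_stations, line_length_m, step_m=100.0):
    """dwelling_stations: chainages (m) of dwellings already filtered to lie
    within BUFFER_M of the centerline. Returns (window_start_m, class) pairs."""
    stations = sorted(dwelling_stations)
    results, start = [], 0.0
    while start + WINDOW_M <= line_length_m:
        count = sum(1 for s in stations if start <= s < start + WINDOW_M)
        results.append((start, class_for_count(count)))
        start += step_m
    return results

# Example: a cluster of 14 dwellings near km 3 pushes nearby windows to Class 2.
dwellings = [2950 + 10 * i for i in range(14)]
print([w for w in class_location(dwellings, line_length_m=6000.0) if w[1] > 1])

Because the inputs (dwelling locations, centerline) and rules are explicit, a pass like this can be rerun for the required annual update and its results audited against the rule set.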
For the majority of pipeline operators struggling to establish the business case for data management, records management, or geographic information systems, a step beyond the traditional information technology approach of return on investment (ROI) is required. Traditional information technology value propositions are founded on information efficiencies that are, for the most part, extremely difficult to quantify, since the processes are either not presently performed or the effort associated with the existing process has not been measured. Without a baseline of the existing process, a comparative analysis using improved efficiencies cannot be quantified to substantiate a return on investment. Justifying a data management system by comparing its cost to the cost of the data it manages (e.g., ILI, excavation, CIS) is compelling, since the system typically costs only on the order of 2–10% of the data; but even this metric is too general an argument for most pipeline integrity managers to feel comfortable defending. This paper explores the process required to unearth the value of data management in support of pipeline integrity. Numerous examples and cases are discussed to back up the approach to establishing the value of data management for pipeline integrity.
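To make the 2–10% figure concrete, a back-of-envelope illustration follows; the spend figure is invented for the example, and only the ratio comes from the abstract.

annual_data_spend = 5_000_000  # hypothetical yearly ILI + excavation + CIS spend
low, high = 0.02 * annual_data_spend, 0.10 * annual_data_spend
print(f"Implied data management cost: ${low:,.0f} to ${high:,.0f} per year")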
BP's Natural Gas Liquids business unit (NGLBU) has conducted integrity investigation and mitigation activities on its pipelines, following this best practice for many years. More recently, NGLBU's data management initiatives have focused on establishing an enterprise Geographic Information System (GIS) coupled tightly with a derivative of the Pipeline Open Data Standard (PODS) data model. During the successful implementation of the GIS, an analysis identified gaps in existing data management processes for pipeline integrity information. Consequently, the business unit adopted Baseline Technology's Pipeline Information Control System (PICS) and its modules to support the pipeline integrity decision-making process on its 9000 km of pipeline. The PICS implementation leverages the existing GIS implementation while addressing a number of unresolved data management and integration issues, including:
• Integration of inline inspection with excavation results;
• Migration of above-ground surveys to a common repository;
• Integration of multiple inline inspections;
• Facilitation of corrosion growth modeling;
• A structured process for prioritization of remediation;
• A structured process for integration of inline inspections with risk parameters;
• Defined data collection, storage, and integration standards.
Data management solutions based solely on a GIS require pipeline surveys without explicit positional information to be converted into a common linear reference system (typically chainage or stationing) so that disparate data sets may be overlaid and compared. This conversion, or spatial normalization, is where much of the data management effort is spent, and it is prone to introducing error. Even when the errors introduced are small, the normalization is often performed in a way that is not auditable. If the underlying spatial errors are not reported, addressed, and understood, the value of the data integration and of any subsequent analysis of the combined data set is questionable.
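The sketch below illustrates one common form of this spatial normalization: piecewise-linear rescaling of an ILI tool's odometer distances onto pipeline stationing using matched tie points (e.g., girth welds or above-ground markers), with per-segment scale factors reported so the conversion is auditable rather than silent. The function names and the tie-point format are assumptions for illustration, not the PICS or PODS implementation.

import bisect

def normalize_odometer(odometer_m, tie_points):
    """tie_points: sorted list of (odometer_m, station_m) matches.
    Interpolates linearly between ties (extrapolating at the ends),
    absorbing cumulative odometer drift segment by segment."""
    odos = [o for o, _ in tie_points]
    i = bisect.bisect_right(odos, odometer_m) - 1
    i = max(0, min(i, len(tie_points) - 2))
    (o0, s0), (o1, s1) = tie_points[i], tie_points[i + 1]
    t = (odometer_m - o0) / (o1 - o0)
    return s0 + t * (s1 - s0)

def tie_residuals(tie_points):
    """Per-segment scale factors; values far from 1.0 flag suspect tie
    matches, giving the normalization an audit trail."""
    return [(s1 - s0) / (o1 - o0)
            for (o0, s0), (o1, s1) in zip(tie_points, tie_points[1:])]

# Invented tie points: tool odometer vs. centerline stationing.
ties = [(0.0, 1000.0), (5020.0, 6010.0), (10100.0, 11005.0)]
print(normalize_odometer(7500.0, ties))   # station of a feature between ties
print(tie_residuals(ties))                # stretch factor per segment

Reporting the per-segment residuals alongside the converted features is one simple way to keep the normalization auditable, addressing the error-reporting concern raised above.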