Ask An Analyst: Reaching Gold Standard with the Arolytics OGMP 2.0 Service
Arolytics recently launched a code-based service that delivers an efficient, auditable, and scalable solution for UNEP OGMP 2.0 reconciliation and Level 5 reporting.
Today, we sit down with Dan Steeves, a Senior Emissions Analyst at Arolytics, to learn more about the framework.
Dan Steeves brings over a decade of engineering experience in the natural gas distribution and upstream oil and gas sectors, including previous roles in production operations, facility engineering, and project management. Dan provides practical insight into the root causes of emissions and identifies abatement opportunities that drive operational efficiencies and deliver return on investment. He also leads Alt-FEMP (Alternative Fugitive Emissions Management Program) applications and OGMP 2.0 directives, and contributes insights to our software solutions.
For a company wanting to adopt your approach, what are the key prerequisites?
Data needs to be cleaned, standardized, and organized to support coherent analysis. A thorough inventory of the facilities included in measurement campaigns, along with their operational hours over the period being assessed, is also needed. The more granular the measurement and inventory data, the easier it is to compare datasets, find discrepancies, and identify areas that can be improved. Analysts who perform this work benefit greatly from firsthand experience with oil and gas operations, and should be well versed in the capabilities of the measurement technologies and their data so they can interpret it correctly and validate any assumptions used in processing.
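As an illustration only, the sketch below shows how raw survey exports might be standardized into a common schema before analysis. The column names and mapping are hypothetical assumptions for this example, not Arolytics' actual data model.

```python
import pandas as pd

# Hypothetical mapping from a raw survey export to a standard schema.
# These column names are illustrative, not Arolytics' actual fields.
RAW_TO_STANDARD = {
    "Site ID": "facility_id",
    "Emission Rate (kg/h)": "rate_kg_hr",
    "Survey Date": "measured_at",
    "Technology": "technology",
}

def standardize(raw: pd.DataFrame) -> pd.DataFrame:
    """Rename, type-convert, and drop rows missing fields needed for reconciliation."""
    df = raw.rename(columns=RAW_TO_STANDARD)
    df["measured_at"] = pd.to_datetime(df["measured_at"], errors="coerce")
    df["rate_kg_hr"] = pd.to_numeric(df["rate_kg_hr"], errors="coerce")
    df = df.dropna(subset=["facility_id", "rate_kg_hr", "measured_at"])
    return df[list(RAW_TO_STANDARD.values())]
```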
How do you address "undetected" or missing sources of emissions, and what patterns do you typically see in those gaps?
Different measurement technologies have their strengths at detecting certain types of emissions. Some site-level technologies tend to capture large, high-rate sources that may be missed, or cannot be quantified, by ground-based OGI or similar technologies; OGI, in turn, is typically more effective at identifying smaller leaks that fall below the detection limits of site-level technologies. Recognizing that any single technology captures only part of the emissions at a site, we treat data from different technologies as complementary, each with its own detection capabilities and effectiveness, which together form a more complete picture of the emissions distribution at a site. There are also non-routine, episodic emissions, such as blowdown events, that are often not captured during single point-in-time measurement campaigns. By reviewing operational data, such as SCADA records and operator logs, we can identify these emissions and incorporate them into a more complete representation of the total emissions inventory.
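A rough sketch of that complementary treatment is below, assuming a standardized measurement table and a hypothetical site-level detection limit. The threshold value, column names, and the blowdown-log structure are placeholders for illustration, not vendor or Arolytics specifications.

```python
import pandas as pd

# Assumed minimum detection limit for the site-level technology (placeholder value).
SITE_LEVEL_MDL_KG_HR = 1.0

def combine_sources(measurements: pd.DataFrame, blowdown_log: pd.DataFrame) -> pd.DataFrame:
    """Combine complementary technologies plus episodic events into one source table.

    Assumes `measurements` has columns facility_id, technology, rate_kg_hr and
    `blowdown_log` has columns facility_id, estimated_rate_kg_hr (both hypothetical).
    """
    ogi = measurements[measurements["technology"] == "OGI"]
    site = measurements[measurements["technology"] == "site_level"]

    # Site-level tech contributes the large sources it can quantify; OGI
    # contributes the smaller leaks below the site-level detection limit.
    large = site[site["rate_kg_hr"] >= SITE_LEVEL_MDL_KG_HR]
    small = ogi[ogi["rate_kg_hr"] < SITE_LEVEL_MDL_KG_HR]

    # Episodic events (e.g. blowdowns) come from operational records, not surveys.
    episodic = blowdown_log.rename(columns={"estimated_rate_kg_hr": "rate_kg_hr"})
    episodic = episodic.assign(technology="operational_log")

    return pd.concat(
        [large, small, episodic[["facility_id", "technology", "rate_kg_hr"]]],
        ignore_index=True,
    )
```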
Are there constraints on applying this framework (e.g. minimum data density, temporal resolution, metadata completeness) in different geographies or operational conditions?
Statistically, the framework works best when there is sufficient data density: a target minimum of around 20 measurements per technology, per region. Metadata completeness is also key; missing information about measurement methods, detection thresholds, locations, and times limits how the data can be used. Snapshot surveys from a single point in time are fine, but accurate operating-hour assumptions are needed to scale them to annualized averages. If emissions are highly intermittent, it is important to identify and treat those sources separately. Where geography or operational conditions affect detection performance, the framework can be adjusted accordingly.
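As a simple illustration under assumed column names, the sketch below shows the two checks described here: flagging technology/region groups that fall short of the 20-measurement target, and scaling a snapshot rate to an annualized total using operating hours.

```python
import pandas as pd

MIN_SAMPLES = 20  # target minimum measurements per technology, per region

def flag_sparse_groups(measurements: pd.DataFrame) -> pd.DataFrame:
    """Count measurements per (region, technology) and flag groups below the target.

    Assumes `measurements` has columns region and technology (hypothetical schema).
    """
    counts = (
        measurements.groupby(["region", "technology"]).size().rename("n").reset_index()
    )
    counts["sufficient"] = counts["n"] >= MIN_SAMPLES
    return counts

def annualize(snapshot_rate_kg_hr: float, operating_hours: float) -> float:
    """Scale a point-in-time emission rate to an annual total using assumed operating hours."""
    return snapshot_rate_kg_hr * operating_hours  # kg per year
```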
How is the auditability and traceability of each step preserved? How easy is it for an external reviewer to retrace decisions?
The framework has been codified so that every step - including assumptions, data transformations, and any input files - can be easily traced through the process. It also means that, with access to the data and scripts, the entire analysis can be rerun to recreate the outputs. Our goal is to make this process as efficient and repeatable as possible, year over year.
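A minimal sketch of the kind of run manifest that supports this sort of traceability is below. The file names, fields, and example assumption keys are illustrative, not the actual Arolytics pipeline.

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def file_hash(path: Path) -> str:
    """Hash an input or output file so a reviewer can confirm the exact version used."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def write_manifest(
    inputs: list[Path],
    assumptions: dict,
    outputs: list[Path],
    manifest_path: Path = Path("run_manifest.json"),
) -> None:
    """Record inputs, assumptions, and outputs for one analysis run."""
    manifest = {
        "run_at": datetime.now(timezone.utc).isoformat(),
        "inputs": {str(p): file_hash(p) for p in inputs},
        "assumptions": assumptions,  # e.g. {"operating_hours_source": "SCADA"} (hypothetical)
        "outputs": {str(p): file_hash(p) for p in outputs},
    }
    manifest_path.write_text(json.dumps(manifest, indent=2))
```

With a manifest like this alongside the scripts and data, an external reviewer can retrace which files and assumptions produced each output, and rerun the analysis to confirm it.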
What level of time and resources might a company expect to invest when undertaking an inventory and reconciliation exercise?
It depends on the scope, the volume of data, and the completeness of the information available, as the bulk of the time is spent building a thorough understanding of the assets and data sources, cleaning and organizing the data, and determining input assumptions. Building an inventory and an initial reconciliation can typically be completed within 6-10 weeks, and once this has been done, subsequent analyses for the same assets are much easier and faster to process. A more detailed look into the reconciliation, to pinpoint specific discrepancies and potential improvements to measurement strategies, would be the next phase in the process and may take a similar amount of time depending on what other data sources are available.
Interested in streamlining your OGMP 2.0 reporting and reconciliation efforts? Contact info@arolytics.com to learn how our code-based service can add significant value to your organization.