Advancing OGMP 2.0 Methane Reconciliation: Interview with Dr. Tarakki
In a recent interview, Dr. Nadia Tarakki, Postdoctoral Researcher at the St. Francis Xavier University FluxLab, discusses the details of the Arolytics-FluxLab methane reconciliation and inventory method for OGMP 2.0. Nadia was a lead researcher and key contributor to the now-commercial, code-based framework.
Q: What gap or problem in methane accounting or reconciliation were you trying to fill when you started this project?
A: When I started this work, the main gap was the absence of a transparent, statistically defensible way to integrate multi-technology methane measurements into a single, reconcilable inventory.
Most operators were running measurement campaigns—aerial, truck, OGI—but each method saw only part of the emissions reality. There was no standard way to merge those partial truths, quantify uncertainty, or compare site-level results with source-level inventories in line with OGMP 2.0.
Our goal was to fill that gap by developing an open, traceable, and empirically grounded framework that converts heterogeneous field data into a unified, auditable emissions estimate, one that oil and gas operators can actually use for reconciliation instead of relying on black-box vendor outputs.
Q: Your framework uses a four-stage pipeline. Why did you choose that design?
A: We organized the workflow into structured pipelines because each evidence stream (measured, modeled, or calculated) needs to be treated differently but transparently.
- Pipeline 1 standardizes and scales raw field data through source stratification, bias correction, and mirror-match bootstrapping (a simplified bootstrap is sketched after this answer).
- Pipeline 2 uses a hierarchical Bayesian model to integrate those measurements probabilistically.
- Pipeline 3 adds “invisible” sources that measurements can’t capture, such as flares or inaccessible vents.
- Pipeline 4 is where reconciliation and year-over-year diagnostics occur.
This modular architecture lets analysts document every judgment call, reproduce results, and progressively refine inventories without turning the process into a black box.
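To make the Pipeline 1 uncertainty step concrete, here is a minimal sketch of a plain nonparametric bootstrap on a single source stratum, with invented emission rates. The `bootstrap_mean` helper is hypothetical, and the framework's mirror-match variant is more sophisticated than this ordinary resampling; the sketch only shows how resampling a stratum's measurements yields an uncertainty band around its mean rate.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical measured emission rates (kg/h) for one source stratum,
# e.g. tank vents quantified during an aerial campaign.
rates = np.array([3.1, 0.8, 12.4, 5.6, 2.2, 7.9, 1.5, 4.3])

def bootstrap_mean(samples: np.ndarray, n_boot: int = 10_000) -> np.ndarray:
    """Nonparametric bootstrap of the stratum mean emission rate."""
    idx = rng.integers(0, len(samples), size=(n_boot, len(samples)))
    return samples[idx].mean(axis=1)

boot_means = bootstrap_mean(rates)
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"stratum mean {rates.mean():.2f} kg/h, 95% CI [{lo:.2f}, {hi:.2f}]")
```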
Q: Why use a hierarchical Bayesian model for integrating measurements and scaling to site level?
A: Because methane measurements are incomplete and noisy by nature.
A hierarchical Bayesian model (HBM) allows us to treat each technology’s output—truck, OGI, aerial—as a partial glimpse of the same latent truth. The model weighs them according to their detection effectiveness and uncertainty, and uses Hamiltonian Monte Carlo to infer the most plausible total site-level emissions.
Compared with simple averaging, the HBM formally accounts for uncertainty, prevents double counting, and statistically fills detection gaps. It also allows prior knowledge, such as historical inventories or engineering expectations, to guide but not dominate the result, making it both data-driven and transparent.
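As an illustration of this idea, here is a toy two-technology model written with PyMC; the interview does not name a specific library, so PyMC, the data values, and the priors are all assumptions made for the sketch. It omits the detection-effectiveness terms Nadia describes and simply lets two noisy technologies inform one latent site total, sampled with NUTS, PyMC's default Hamiltonian Monte Carlo variant.

```python
import numpy as np
import pymc as pm

# Hypothetical site-level estimates (kg/h) from two technologies.
truck_obs = np.array([41.0, 38.5, 44.2])
aerial_obs = np.array([55.0, 47.8, 61.3, 52.1])

with pm.Model():
    # Latent "true" site-level rate; the prior (e.g. last year's
    # inventory) guides but does not dominate the result.
    site_total = pm.Normal("site_total", mu=50.0, sigma=25.0)

    # Per-technology noise scales, learned from the data rather than fixed.
    sigma_truck = pm.HalfNormal("sigma_truck", sigma=10.0)
    sigma_aerial = pm.HalfNormal("sigma_aerial", sigma=10.0)

    # Each technology observes the same latent total with its own noise;
    # a fuller model would add per-technology detection-effectiveness terms.
    pm.Normal("truck", mu=site_total, sigma=sigma_truck, observed=truck_obs)
    pm.Normal("aerial", mu=site_total, sigma=sigma_aerial, observed=aerial_obs)

    idata = pm.sample(1000, tune=1000, chains=2, random_seed=7)

print(float(idata.posterior["site_total"].mean()))
```

Because both likelihoods share `site_total`, the posterior is pulled toward whichever technology the data show to be less noisy, which is exactly the weighting behaviour simple averaging cannot provide.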
Q: How does this framework align with OGMP 2.0 reporting requirements, and why is it especially relevant for operators?
A: It directly operationalizes OGMP 2.0 Level 5 guidance.
The framework produces site-level, measurement-informed inventories with quantified uncertainty, a core requirement for Level 5. It also supports iterative reconciliation: operators can compare their Level 4 (source-based) and Level 5 (site-based) values, identify under- or over-counted sources, and narrow the gap year over year.
For operators, the relevance is practical. The workflow is transparent, auditable, and compatible with OGMP 2.0 Level 5 and Veritas Pathways 2 and 3, meaning it can plug into existing reporting systems while giving clear diagnostic insight into where methane is actually coming from.
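For intuition, a minimal sketch of the Level 4 versus Level 5 comparison is shown below; the totals, bounds, and the interval-overlap check are invented for illustration and are not prescribed by OGMP 2.0.

```python
# Hypothetical annual totals (tonnes CH4) with 95% confidence bounds.
level4 = {"mean": 1180.0, "lo": 950.0, "hi": 1410.0}   # source-based inventory
level5 = {"mean": 1620.0, "lo": 1275.0, "hi": 1965.0}  # measurement-informed

gap_pct = 100.0 * (level5["mean"] - level4["mean"]) / level4["mean"]
overlap = level4["hi"] >= level5["lo"] and level5["hi"] >= level4["lo"]

print(f"Level 5 exceeds Level 4 by {gap_pct:.0f}%")
print("uncertainty bands overlap" if overlap else
      "bands are disjoint: look for under-counted sources")
```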
Q: Looking ahead 5–10 years, how do you see this methodology evolving with new measurement technologies?
A: I see it evolving in two directions:
- Automation and scale: As continuous monitoring and satellite data become common, the framework can ingest higher-frequency data streams while preserving its probabilistic structure.
- Adaptive priors and real-time reconciliation: We’ll move toward dynamic Bayesian updating, where new measurements automatically refine priors and uncertainty bands.
Over the next decade, I expect this to become a live, rolling inventory system rather than an annual exercise, providing operators and regulators near-real-time confidence in methane estimates.
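As a minimal sketch of what dynamic Bayesian updating could look like, the snippet below runs a conjugate normal update as measurement batches arrive; the numbers, the fixed observation variance, and the `update_normal` helper are all invented for illustration, and the framework's full hierarchical updating would be considerably richer.

```python
import numpy as np

def update_normal(prior_mu, prior_var, obs, obs_var):
    """Conjugate normal update after one batch of measurements."""
    post_var = 1.0 / (1.0 / prior_var + len(obs) / obs_var)
    post_mu = post_var * (prior_mu / prior_var + np.sum(obs) / obs_var)
    return post_mu, post_var

# Start from last year's reconciled estimate (kg/h) as the prior...
mu, var = 50.0, 20.0**2
# ...then fold in continuous-monitoring batches as they arrive.
for batch in [np.array([48.0, 52.5]), np.array([46.1, 49.8, 51.0])]:
    mu, var = update_normal(mu, var, batch, obs_var=8.0**2)
    print(f"updated estimate: {mu:.1f} kg/h (sd {np.sqrt(var):.1f})")
```

Each batch shrinks the uncertainty band, which is what would let an annual reconciliation exercise become a rolling one.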
Q: What were the hardest technical or practical challenges you encountered?
A: The toughest issues were data quality and alignment:
- Missing or inconsistent metadata: many datasets lacked clear source or facility IDs, which made proper source stratification difficult.
- Uncertain operating hours: it was often unclear how long emissions were active, making temporal scaling to the site or asset level challenging.
- Limited multi-technology overlap: OGI was mostly used for LDAR rather than quantification, while aerial datasets generally provided quantitative data but only for certain campaigns. This meant we had very few overlapping datasets across measurement technologies.
Addressing these required building strong analyst–operator feedback loops and establishing clearer data standards, because even the most advanced statistical model is only as good as the data that feeds it.
Q: How did you engage with Veritas and other existing frameworks while developing this approach?
A: I started by studying the framework developed by Professor Matt Johnson in detail and even met with him to discuss its design and limitations. I spent a significant amount of time understanding his approach, particularly the concepts of source stratification and the Bayesian modeling structure used to account for missed detections, which strongly influenced my own framework.
After that, I examined the OGMP 2.0 and Veritas protocols closely to understand their underlying logic and identify elements that could be integrated or improved upon. My goal was to build something that combined the strengths of these existing frameworks while enhancing transparency and reproducibility.
Throughout development, I also stayed in regular contact with the Veritas team, especially those working on their statistical components, to ensure that my methodology aligned with emerging best practices and maintained methodological integrity.
Q: What are the next steps for your research?
A: Next, I'm focusing on further validation and application of the framework in real-world measurement programs, working closely with operators and extending it to other sectors, such as landfills.
We also recently presented our results at the AGU Fall Meeting 2025 in New Orleans, where we discussed how hierarchical modeling can bridge regulatory methane inventories with field-based observations.
In the longer term, the goal is to evolve this framework into an open-source, community-validated platform for transparent and measurement-informed methane accounting.
Quick FAQs: OGMP 2.0 Through a Research Lens
- Why is uncertainty so important in OGMP 2.0? Because underestimating uncertainty creates false confidence and undermines credibility.
- Why use Bayesian models instead of simple averages? They properly account for detection bias, uncertainty, and incomplete observations.
- Are invisible emissions scientifically significant? Yes. Excluding them consistently biases inventories downward.
- Can this approach evolve with new technologies? Yes. The framework is designed to ingest higher-frequency and emerging datasets.
- Is reconciliation a one-time exercise? No. From a research standpoint, it is an iterative learning process.
From Nadia Tarakki’s perspective as a postdoctoral researcher, OGMP 2.0 Level 5 is not just a reporting framework: it is a scientific challenge. Meeting it requires moving beyond raw measurements toward transparent, statistically defensible reconciliation.
By grounding inventories in probabilistic reasoning and rigorous data treatment, operators can build methane inventories that stand up to scrutiny and support real-world action.
-> Are you an oil and gas company currently building OGMP 2.0 inventories and interested in practical and efficient reconciliation approaches? Email us at info@arolytics.com to learn more about our code-based reconciliation framework.