
Accelerating decision making in exploration workflows with digital

by Li Dan, Marie LeFranc, and Morten Kristensen

Continued hydrocarbon exploration is necessary to arrest global production decline and provide a long-term, predictable, and stable energy supply for the world’s growing population. As the energy transition gains momentum, however, hydrocarbon exploration is changing too. The past decade has seen a strategic shift toward infrastructure-led exploration (ILX), also known as near-field exploration. This shift is driven by operators needing to balance risk in their exploration portfolios while also reducing their production’s carbon intensity.


By focusing exploration on regions that are better understood geologically—and where potential discoveries can be tied into existing infrastructure without large and costly new developments—ILX projects are characterized by lower capex, shorter payback periods, and lower CO2 emissions. In a macro context of energy transition, decarbonization, and future energy mix uncertainty, this is all very compelling.

With most of the upcoming exploration activity occurring near existing infrastructure, and discoveries often being more marginal in nature, operators are now looking toward tech and solutions that reduce cycle time and improve development economics. Traditional exploration and appraisal workflows, which often take years from initial discovery to investment decision, are no longer adequate. Achieving a shorter cycle time means transforming how data are utilized, how decisions are made, and how long it takes to make those decisions. We don't have years; we have days, weeks, or maybe months.

The fast and the very fast

Turning exploration data into reservoir insights and key development metrics (and doing so in a fast-track manner to accelerate decisions) is essential in any near-field exploration project. That said, there are two characteristic time frames when it comes to serving critical decision processes:

  • A development decision time frame of (typically) weeks to months
  • A real-time decision time frame of hours to days

The development decision time frame is governed by the need to fast-track the project and shorten the time to first production, thereby improving overall economics. The real-time decision time frame is governed by actions that can be taken while the rig is still available—actions such as changing or adapting the measurement acquisition program, changing a completion design, or deciding to drill a sidetrack. By nature, ILX projects are developed with very few exploration and appraisal wells. This means that reservoir complexities must be resolved and understood in real time while the tool is still in the well. There may not be another data gathering opportunity if logging analysis indicates that a reservoir is more complex than anticipated. 

How digital workflows accelerate decision making

Common to both decision time frames is the need to connect measurements to reservoir insights or metrics that matter for development decisions (factors such as lateral and vertical connectivity, well productivity, and connected hydrocarbon volume), while simultaneously quantifying the uncertainty in these metrics. This requires interpretation and integration of data across all sources and disciplines, from surface seismic and local geological knowledge to borehole logs, fluid properties, and pressure transient tests. Such integration is done by building a digital representation of the reservoir, i.e., a 3D static and dynamic model. 
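
To make this concrete, the sketch below shows one simple way of quantifying uncertainty in a development metric such as connected hydrocarbon volume: a Monte Carlo volumetric estimate. The input distributions and parameter values are illustrative assumptions, not field data and not the integrated workflow described in this article.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo samples

# Illustrative input distributions (assumptions for demonstration, not field data)
grv = rng.lognormal(mean=np.log(50e6), sigma=0.3, size=n)   # gross rock volume, m3
ntg = rng.uniform(0.5, 0.8, size=n)                         # net-to-gross
phi = rng.normal(0.22, 0.03, size=n).clip(0.05, 0.35)       # porosity
sw  = rng.normal(0.35, 0.05, size=n).clip(0.10, 0.90)       # water saturation
bo  = rng.normal(1.25, 0.05, size=n)                        # formation volume factor, rm3/sm3

# Connected in-place oil volume at stock-tank conditions
stoiip = grv * ntg * phi * (1.0 - sw) / bo

# Industry exceedance convention: P90 (low case) is the 10th percentile of the distribution
p90, p50, p10 = np.percentile(stoiip, [10, 50, 90])
print(f"STOIIP P90/P50/P10: {p90:.2e} / {p50:.2e} / {p10:.2e} sm3")
```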

 


Reservoir modeling is not a new endeavor. It has traditionally been a specialized, time-consuming exercise disconnected from the teams actually acquiring the data. Advancements in digital tech, however, are rapidly changing the way we do modeling. Steps that were once manual are now automated, and novel, AI-backed modeling techniques are being deployed. Meanwhile, more organizations are leveraging cloud-based applications to connect data with interpretation and modeling in a real-time, multiuser environment.

Automated answers

One of the biggest leaps in workflow productivity comes from the automation of processing and interpretation tasks, particularly for petrophysical logs and borehole images. While more advanced interpretation is always possible, the industry currently benefits from automated methods that directly provide key answers required for 3D modeling, including lithology, porosity, permeability, and fluid saturation.
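
As a simplified illustration of this kind of automation (not the commercial algorithms themselves), the sketch below trains a machine-learning classifier to predict lithology from standard log curves. The data, curve ranges, and labeling rule are synthetic stand-ins; in practice, labels would come from interpreted offset wells or core.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for labeled log data: gamma ray, bulk density, neutron porosity
rng = np.random.default_rng(0)
n = 3000
gr   = rng.uniform(10, 150, n)      # gamma ray, gAPI
rhob = rng.uniform(2.0, 2.8, n)     # bulk density, g/cm3
nphi = rng.uniform(0.05, 0.40, n)   # neutron porosity, v/v

# Toy labeling rule purely for demonstration: shale vs sand vs carbonate
litho = np.where(gr > 90, "shale", np.where(rhob > 2.65, "carbonate", "sand"))

X = np.column_stack([gr, rhob, nphi])
X_train, X_test, y_train, y_test = train_test_split(X, litho, test_size=0.25, random_state=0)

# Train on labeled intervals, then predict lithology continuously along new wells
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(f"Hold-out lithology accuracy: {clf.score(X_test, y_test):.2f}")
```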

We also have automated methods for the interpretation of structural and sedimentary dips from borehole images, an otherwise very laborious component of a borehole geologist's role. The processed borehole images then feed into AI-based algorithms for autozonation and facies classification. Machine learning segments and classifies images according to sedimentary geometry, a first step in depositional facies identification.
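
The sketch below illustrates the autozonation idea in a minimal form: per-depth texture features extracted from a borehole image are clustered, and consecutive samples with the same label are merged into zones. The features, depths, and cluster count are illustrative assumptions, not the production algorithm.

```python
import numpy as np
from sklearn.cluster import KMeans

# Per-depth features from a borehole image (illustrative stand-in: e.g. mean
# conductivity, image contrast, and dominant bedding dip per half-foot window)
rng = np.random.default_rng(1)
depths = np.arange(2500.0, 2600.0, 0.5)
features = np.concatenate([
    rng.normal([0.2, 0.1, 5.0],  0.05, size=(80, 3)),   # laminated interval
    rng.normal([0.6, 0.4, 20.0], 0.05, size=(60, 3)),   # cross-bedded interval
    rng.normal([0.3, 0.2, 2.0],  0.05, size=(60, 3)),   # massive interval
])

# Unsupervised autozonation: cluster depth samples, then merge runs into zones
labels = KMeans(n_clusters=3, n_init=10, random_state=1).fit_predict(features)
zone_breaks = np.flatnonzero(np.diff(labels)) + 1
zone_tops = np.r_[depths[0], depths[zone_breaks]]
for top, lab in zip(zone_tops, labels[np.r_[0, zone_breaks]]):
    print(f"Zone top {top:.1f} ft  ->  image-facies class {lab}")
```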

Novel modeling techniques

In an exploration setting where hard data are limited to a few well locations, two things require careful attention when modeling:

  • Honoring what we know and the measurements that we have
  • Being quantitative about uncertainty

Reservoir modeling usually starts with a 3D structural model. Recent developments in implicit structural modeling have led to a new algorithm, known as Meshless Smart Structure, which automates much of the model building. Tailored to the near-wellbore environment, this algorithm handles inputs from seismic surfaces, well tops, fault surfaces and markers, as well as dips along and away from the borehole trajectory.
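
As a rough analogue of implicit structural modeling (not the Meshless Smart Structure algorithm itself), the sketch below interpolates a single horizon surface from a handful of well tops and seismic picks using a generic radial basis function. All coordinates and depths are made up for illustration.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Horizon control points: (x, y) locations with picked depths (well tops plus a
# few seismic picks). Values are illustrative assumptions only.
xy = np.array([[0, 0], [500, 100], [900, 800], [200, 700], [650, 450]], dtype=float)
z  = np.array([2510.0, 2525.0, 2560.0, 2540.0, 2532.0])  # horizon depth, m TVDSS

# Generic implicit interpolation of the horizon as a smooth surface z(x, y)
surface = RBFInterpolator(xy, z, kernel="thin_plate_spline")

# Evaluate on a regular grid to build the structural framework for this horizon
gx, gy = np.meshgrid(np.linspace(0, 1000, 51), np.linspace(0, 1000, 51))
grid_z = surface(np.column_stack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
print(f"Interpolated horizon depth range: {grid_z.min():.1f} to {grid_z.max():.1f} m")
```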

The next step of the modeling process creates different candidate realizations of the 3D depositional environment. Using a novel method based on 3D analogue search, the autozonation from log interpretation is queried against a large repository of 3D stratigraphic models to obtain a subset of 3D models that honor the well logs but have potentially varying geobody extensions away from the well. Together, these methods enable us to honor the wellbore data while creating realistic 3D depositional environments in a multiscenario, uncertainty-aware manner. Finally, to complete the static modeling, a hybrid method combines geostatistics and machine learning to populate petrophysical properties.
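
A minimal sketch of the analogue-search idea follows: each repository model is summarized by the zonation signature it would produce at the well location, and a nearest-neighbor query retrieves the candidates that best honor the observed zonation. The repository, signature, and neighbor count are illustrative assumptions, not the actual method or model library.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Hypothetical repository: each stratigraphic model summarized by the zone-thickness
# signature it would produce at the well (fractions of the gross interval)
rng = np.random.default_rng(7)
repository = rng.dirichlet(alpha=np.ones(6), size=5000)   # 5000 candidate 3D models

# Zone-thickness signature from the automated log interpretation at the actual well
well_signature = np.array([0.10, 0.25, 0.05, 0.30, 0.20, 0.10])

# Analogue search: retrieve the models that best honor the observed zonation; their
# differing geobody extents away from the well become scenario realizations
nn = NearestNeighbors(n_neighbors=20).fit(repository)
dist, idx = nn.kneighbors(well_signature.reshape(1, -1))
print("Candidate analogue model indices:", idx[0][:5], "...")
```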

AI-powered pressure transient measurements are key to minimizing uncertainty.

Among exploration well measurements, pressure transient tests provide the deepest investigation into the formation. As such, they are crucial for determining a discovery's size and production potential. Pressure transient measurements are used to calibrate the dynamic reservoir models and identify the most likely geological realizations, thus narrowing down uncertainty. History-matching a dynamic reservoir model is time consuming, but the process is being accelerated by introducing so-called surrogate models into the workflow. 

Surrogate models use machine learning to accelerate computationally expensive numerical simulations. Such acceleration makes it possible to deliver history-matched models, calibrated to the measurements, with quantified uncertainty. With this understanding, we gain a clearer picture of how individual well measurements contribute to reducing geological uncertainty, and which elements of the geological model remain uncertain even after all measurements are taken into account.
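
The sketch below illustrates the surrogate idea on a toy problem: a Gaussian-process surrogate is trained on a modest number of "simulator" runs and then used to find all parameter combinations consistent with a measured drawdown, yielding a quantified uncertainty range. The stand-in simulator, parameter ranges, and tolerance are illustrative assumptions, not a real dynamic model.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Stand-in "numerical simulator": drawdown as a simple function of permeability (mD)
# and skin. In a real workflow this would be a full 3D dynamic simulation run.
def simulate_drawdown(k, skin, q=300.0, mu=1.0, h=20.0):
    return 141.2 * q * mu / (k * h) * (4.0 + skin)   # illustrative only

# 1) Train the surrogate on a limited number of expensive simulator runs
rng = np.random.default_rng(3)
X_train = np.column_stack([rng.uniform(10, 500, 60), rng.uniform(-2, 8, 60)])  # (k, skin)
y_train = np.array([simulate_drawdown(k, s) for k, s in X_train])
surrogate = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X_train, y_train)

# 2) Use the cheap surrogate for history matching against the measured drawdown
observed, noise = 95.0, 5.0                          # psi, assumed measurement uncertainty
k_grid, s_grid = np.meshgrid(np.linspace(10, 500, 200), np.linspace(-2, 8, 100))
candidates = np.column_stack([k_grid.ravel(), s_grid.ravel()])
misfit = np.abs(surrogate.predict(candidates) - observed)
matched = candidates[misfit < 2 * noise]             # parameter sets consistent with the data
print(f"Permeability range honoring the test: {matched[:, 0].min():.0f} to {matched[:, 0].max():.0f} mD")
```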

Cloud-based platforms

As any reservoir engineer or geoscientist knows, data flow between siloed, domain-specific applications has been a major bottleneck in subsurface interpretation and modeling workflows. Regardless of the sophistication of new modeling techniques, workflow productivity cannot be improved without also addressing the way in which users interact with data, collaborate with each other, perform interpretation and modeling, and generate insights for decision making.

Cloud-native applications have changed the game. Today, innovators are building cloud platforms that connect users with data directly from the wellsite, embed all the innovations and automations discussed above, and enable decision making in the relevant time frame. In other words, they provide a one-stop shop connecting exploration well data to key, actionable reservoir insights.

Not everything is digital

Great strides are being made with digital tech to address the cycle time in ILX projects, but it all rests upon a foundation of quality measurements in the exploration wells. Borehole image data are needed for dip interpretation and facies delineation. High-definition spectroscopy and nuclear magnetic resonance measurements provide lithology and key petrophysical properties. And formation tester measurements provide reservoir pressure and fluid properties. 

The crucial production potential of a well is evaluated from either a well test or a wireline-conveyed formation test. With CO2 emissions reduction being a driver for many operators, the ability to test a well without flaring is gaining significant traction. Deep transient testing (DTT) on wireline bridges the gap between well testing and wireline formation testing. With flow rates above 100 bbl/d, DTT investigates deeper than traditional formation testing without flaring the produced fluids. It also maintains the lighter footprint and faster execution of a wireline operation, thus making it attractive in ILX projects.

With hydrocarbon exploration shifting to the near field and decarbonization goals gaining more prominence, the tech supporting the exploration workflow needs to change as well. What used to take years must now be done in weeks or months—and at less cost. There are no fundamental barriers to this change, and with innovations in both digital and hardware solutions, the industry is already well on its way.

Contributors

Li Dan

Reservoir Product Manager, Reservoir Performance

Li Dan’s role involves finding tech solutions to subsurface-related reservoir characterization challenges, leveraging deep domain knowledge, innovation, and digital workflows to deliver actionable reservoir insights for customers. She currently leads a team of senior staff to define portfolio strategy, create the future tech roadmap, align research and engineering efforts, and control commercial introductions of tech that addresses future industry needs sustainably.

Marie LeFranc

Digital Innovation Subject Matter Expert, Reservoir Performance

Based in Norway, Marie LeFranc has more than 15 years of experience at SLB. She has previously held positions in research, as a program manager and research scientist, and in petrotechnical consulting services across North America and Southeast Asia. She holds a PhD degree in geology and geostatistics from the French School of Mines of Paris.

Morten Kristensen

Digital Integration and Reservoir Engineering Advisor

Morten Kristensen has more than 15 years of experience at SLB and has previously held positions in research, digital tech development, and petrotechnical consulting across Europe, the Middle East, and North America. He holds MSc and PhD degrees in chemical engineering from the Technical University of Denmark.
