
Fueled by HPC, Energy Industry Drives to New Heights 
Sponsored Content by Dell EMC

In the data-driven energy industry, success increasingly hinges on the use of ever-more-powerful high performance computing systems.

From the discovery and mapping of new oil and gas fields to the design and placement of wind- and solar-energy farms, high performance computing continues to be an essential resource for the energy industry. Almost anywhere you look in the industry, you will find powerful parallel-processing systems churning through enormous amounts of data.

“Today, the energy sector continues to push the envelope for advanced simulation,” notes Randall Newton, a contributing editor for Digital Engineering, in an online article. “From oil and gas companies simulating subsea conditions in water depths of more than 2,000 meters — where most of the world’s remaining oil and gas resources lie — to modeling the energy needs of smart cities to bolster sustainability efforts, to predicting global wind patterns to inform wind farm designs, and more.”[1]

With those thoughts in mind, let’s look at a few examples of the ways organizations are using HPC to take the energy industry to new heights.

Finding new oil and gas deposits

For decades, HPC has been one of the keys to oil and gas exploration, a pursuit that revolves around huge datasets, complex algorithms and massive calculations. The seismic processing associated with this work has always been both data- and compute-intensive. Today, we have a term to describe this: high performance data analytics, or HPDA. This term, coined by the research firm IDC, refers to the convergence of HPC and big data analytics, with data-intensive analytics workloads deployed on HPC-style configurations.[2]

In the energy industry, the combination of HPDA and sophisticated algorithms is helping companies churn through massive amounts of seismic data to pinpoint elusive oil and gas reserves. How massive are those datasets? A large exploration project might produce multiple petabytes of raw data, and an energy company’s total data volumes might reach tens or even hundreds of petabytes.[3]

Clearly, the energy industry needs HPDA, as do many other data-saturated industries. That need is fueling growth in the HPDA market, which is expected to expand globally at a rate of more than 18 percent from 2016 to 2022, according to the firm Market Research Future.[4]

Taking simulation to a higher level

To better understand oil and gas reservoirs, energy companies use complex simulators to help determine the safest, most efficient, most productive and most environmentally friendly way to exploit resources. These simulators, which guide critical management decisions, require enormous amounts of parallel processing power — and the more the better, because larger systems deliver faster, higher-resolution simulations.

A case in point: In February 2017, ExxonMobil announced that, together with the National Center for Supercomputing Applications (NCSA), it had put 716,800 processors to work to achieve a major breakthrough in parallel simulation. This record run resulted in data output thousands of times faster than the typical oil and gas industry reservoir simulation, the company reported.

“This breakthrough has unlocked new potential for ExxonMobil’s geoscientists and engineers to make more informed and timely decisions on the development and management of oil and gas reservoirs,” an ExxonMobil executive noted in a news release. “As our industry looks for cost-effective and environmentally responsible ways to find and develop oil and gas fields, we rely on this type of technology to model the complex processes that govern the flow of oil, water and gas in various reservoirs.”[5]
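
To see why runs at this scale pay off, consider a textbook Amdahl’s-law estimate of parallel speedup. The short Python sketch below is purely illustrative: the parallel fraction is an assumption, and only the 716,800-processor count comes from the ExxonMobil run described above; none of the scaling numbers are measured results from any real reservoir simulator.

    # Textbook Amdahl's-law estimate of parallel speedup.
    # The parallel fraction below is a hypothetical assumption; the 716,800-core
    # count comes from the ExxonMobil/NCSA run cited above, but the speedups
    # printed here are illustrative, not measured results.

    def amdahl_speedup(parallel_fraction: float, n_cores: int) -> float:
        """Speedup = 1 / ((1 - p) + p / n) for parallelizable fraction p."""
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_cores)

    if __name__ == "__main__":
        p = 0.999  # assume 99.9 percent of the simulation parallelizes
        for cores in (1_000, 10_000, 100_000, 716_800):
            print(f"{cores:>8} cores -> ~{amdahl_speedup(p, cores):,.0f}x speedup")

Even with 99.9 percent of the work parallelized, the residual serial fraction caps the speedup near 1,000x, which is why simulator developers work so hard to minimize serial bottlenecks as core counts climb.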

Improving seismic processing and imaging models

CGG, a global geophysical services company, enhanced its HPC capabilities in order to improve the performance of its seismic processing and imaging models. This initiative responded to CGG’s need to process and analyze growing data volumes produced by advances in seismic data-acquisition techniques. In addition, CGG wanted to satisfy customer demands for greater accuracy and speed while controlling the costs of producing seismic analysis.

With high performance parallel processing powered by Intel® Xeon® processors, as well as Intel® Solid State Drives (SSDs) for enhanced throughput, CGG improved performance by approximately 400 percent, according to an Intel case study.[6] Today, CGG’s HPC cluster is helping the company’s clients identify potential oil and gas reservoirs, assess their economic viability, and design exploration and drilling operations that maximize output.

Leveraging analytics at the edge

In oil and gas fields, dynamic reservoir management and overall production operations benefit from real-time data analysis, modeling and scenario planning. This capability is one of the keys to the ongoing optimization of production operations.

Dell EMC and Landmark — a leading provider of data and analytics, science, software and services for the exploration and production industry — enable real-time analytics with the Landmark Field Appliance. Built on a Dell Edge Gateway, the appliance allows engineers to run analytics at the edge on data generated by thousands of sensors. Real-time analytics, run in the field without moving the data to a data center, enables instant visibility into field operations and a faster understanding of what is happening with complex systems. That can be a key to realizing measurable improvements and clear bottom-line results.
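
To make the edge-analytics pattern concrete, here is a minimal, hypothetical Python sketch: a rolling-statistics check that flags out-of-band sensor readings in the field and keeps the raw stream local. The sensor name, window size and 3-sigma threshold are assumptions for illustration only; this is not the Landmark Field Appliance software or its API.

    # Minimal sketch of edge-style analytics on streaming sensor data.
    # Generic illustration only; the sensor ID, window size and 3-sigma
    # threshold are hypothetical, not part of any vendor product.
    from collections import deque
    from statistics import mean, pstdev
    import random

    class EdgeAnalyzer:
        """Keeps a rolling window per sensor and flags out-of-band readings
        locally, so only alerts and summaries (not raw data) leave the field."""

        def __init__(self, window=60):
            self.history = {}  # sensor_id -> deque of recent readings
            self.window = window

        def ingest(self, sensor_id, value):
            buf = self.history.setdefault(sensor_id, deque(maxlen=self.window))
            buf.append(value)
            if len(buf) < 10:
                return None  # not enough history to judge yet
            mu, sigma = mean(buf), pstdev(buf)
            # Flag readings more than 3 standard deviations from the rolling mean.
            if sigma > 0 and abs(value - mu) > 3 * sigma:
                return {"sensor": sensor_id, "value": value, "mean": round(mu, 1)}
            return None

    if __name__ == "__main__":
        analyzer = EdgeAnalyzer()
        for i in range(200):
            reading = random.gauss(2500.0, 25.0)  # simulated wellhead pressure, psi
            if i == 150:
                reading = 3200.0                  # injected spike to trigger an alert
            alert = analyzer.ingest("wellhead_pressure_01", reading)
            if alert:
                print("ALERT:", alert)

The point of the pattern is that only the compact alert record needs to cross the network; the raw sensor stream stays at the edge.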

Designing better materials

In the fall of 2017, U.S. Secretary of Energy Rick Perry kicked off a new Department of Energy initiative that will help industry players leverage HPC systems to accelerate the development of new and improved materials for use in severe and complex environments. The HPC for Materials Program aims to enable a step change in the cost, development time and performance of materials using HPC, modeling, simulation and data analysis, according to the DOE.[7]

The initiative will initially focus on the development of materials that can withstand conditions such as extreme pressure, radiation and temperature, along with corrosion, chemical exposure, vibration, fatigue and stress. This initial phase of the program will also focus on developing improved lightweight material technologies. The program is currently seeking concept papers that address these challenges.[8]

Powering advances in wind-energy systems

In the renewable energy space, HPC is helping organizations improve the performance and efficiency of wind systems. That’s the case at Sandia National Laboratories, where the organization’s Advanced Simulation and Computing program arms the wind-energy industry with the processing and simulation power it needs to design better energy-generation systems.[9]

One example: research at the facility is driving advances in the design of more effective wind turbine blades by predicting their performance and behavior before they are built, by developing blade-noise prediction methods, and by improving the ability to predict power-generation impacts from wakes and the motion of air within an array of wind turbines.

Elsewhere in wind energy research, the National Renewable Energy Laboratory’s High-Performance Computing Center operates a petaflop-scale supercomputer that it says is the world’s largest HPC system dedicated to advancing renewable energy and energy efficiency technologies. This system gives researchers the tools they need to explore phenomena like wind turbine wakes and to simulate the performance of wind farms — research that is key to the development of more productive wind-energy systems.

Stay tuned for more at SC18

This list of examples could go on and on, because HPC is now everywhere in the energy industry. HPC is essentially the energy — the driving force — that fuels the energy industry itself, as the industry takes research, exploration and production to new heights.

If you have the good fortune to attend SC18 — the premier international conference for HPC, networking, storage and analysis — this fall, you’re sure to hear more stories of the power of HPC in the energy industry. That’s particularly appropriate given that the conference will take place in the heart of U.S. energy country: Dallas, Texas.


[1] Randall Newton, Digital Engineering/HPC Leading Edge, “Far from Simple Simulation: Making the most of energy resources with high performance computing,” June 2018.

[2] Intel, “Big Data Meets High Performance Computing,” 2014.

[3] BizTech magazine, “How High-Performance Computing Helps Companies Track Down Hidden Fields of Oil and Natural Gas,” accessed June 20, 2018.

[4] Market Research Future, “High Performance Data Analytics (HPDA) Market Research Report – Global Forecast to 2022,” June 2018.

[5] ExxonMobil news release, “ExxonMobil Sets Record in High Performance Oil and Gas Reservoir Computing,” February 16, 2017.

[6] Intel solution brief, “High Performance Computing Speeds Solution Discovery,” June 2016.

[7] U.S. Department of Energy, “Secretary of Energy Rick Perry Announces High Performance Computing for Materials Program to Help Industry Develop New, Improved Materials for Severe Environments,” September 19, 2017.

[8] U.S. Department of Energy, HPC for Materials Program, “First Round of HPC4Mtls Solicitation is Now Open,” accessed June 20, 2018.

[9] U.S. Department of Energy, “Wind Energy Facilities” overview.
