
High Performance Computing key enabler for accelerating development of high efficiency engines

Increasing complexity of vehicle design is driving the need for better simulation and more powerful computers. Wagner and Pannala.

The complexity of new and future vehicles—driven by the need for increasing fuel efficiency and decreasing emissions under ever-changing drive-cycle demands and environmental conditions—is adding unprecedented flexibility in design and driving the need for better simulation and more powerful computers, observed Dr. Robert M. Wagner, Director of the Fuels, Engines and Emissions Research Center, and Dr. Sreekanth Pannala, Senior Research Staff Member in the Computing and Computational Sciences Directorate at Oak Ridge National Laboratory, in a keynote talk at the recent Global Powertrain Conference.

Advances in high performance computing (HPC) resources are leading to a new frontier in engine and vehicle development, Wagner and Pannala suggested, including the ability to produce detailed simulations to generate benchmark data; engineering simulations to explore the design space (e.g., injector optimization at ORNL); and reduced models for design optimization and control strategies. In general, HPC can help solve problems which were once thought unsolvable, they noted.

3D DNS of auto-ignition with 30-species DME chemistry. (Bansal et al. 2011)

In a 2009 paper describing the use of S3D—a flow solver for direct numerical simulation (DNS) of turbulent combustion developed at the Combustion Research Facility (CRF) at Sandia National Laboratories—in terascale simulations of turbulent combustion, Chen et al. noted that:

Computational science is paramount to the understanding of underlying processes in internal combustion engines of the future that will utilize non-petroleum-based alternative fuels, including carbon-neutral biofuels, and burn in new combustion regimes that will attain high efficiency while minimizing emissions of particulates and nitrogen oxides.

Next-generation engines will likely operate at higher pressures, with greater amounts of dilution and utilize alternative fuels that exhibit a wide range of chemical and physical properties. Therefore, there is a significant role for high-fidelity simulations, direct numerical simulations (DNS), specifically designed to capture key turbulence-chemistry interactions in these relatively uncharted combustion regimes, and in particular, that can discriminate the effects of differences in fuel properties.

—Chen et al.

Growth of DNS capabilities. Sankaran et al.

DNS is a tool for fundamental studies of the micro-physics of turbulent reacting flows; it provides full access to time-resolved fields and physical insight into turbulence-chemistry interactions. It is also a tool for the development and validation of the reduced model descriptions that are then used in macro-scale simulations of engineering-level systems.

Combustion currently provides 85% of our nation’s energy needs and will continue to be a predominant source of energy as fuel sources evolve away from traditional fossil fuels. Low emission, low temperature engine concepts of the future operate in regimes where combustion is poorly understood. In an effort to reliably predict efficiency and pollutant emissions for new engines and fuels, computer simulations are used to study fundamental turbulence-chemistry interactions.

Direct Numerical Simulations (DNS) are first-principles, high-fidelity computational fluid dynamics simulations in which the reactive compressible Navier-Stokes equations are numerically solved on a computational mesh in which all of the spatial and temporal scales of the turbulence are resolved. In many practical turbulent combustion situations, turbulence strains the flame, causing molecular mixing of reactant streams. With increased mixing, chemical reactions are enhanced and overall efficiency increases up to a point, at which the loss of heat and radicals exceeds their rate of generation due to chemical reaction and the flame extinguishes, resulting in increased emissions. Heat release caused by the chemical reactions creates a complex feedback mechanism, affecting the intensity of the turbulence through density and viscosity changes across the flame.

—Bennett et al.
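To make "all of the spatial and temporal scales of the turbulence are resolved" concrete: in homogeneous turbulence, the ratio of the largest to the smallest (Kolmogorov) length scale grows as Re^(3/4), so a 3D grid resolving every scale needs on the order of Re^(9/4) points. A minimal back-of-the-envelope sketch in Python (the Reynolds numbers are illustrative, not taken from the S3D runs):

```python
# Back-of-the-envelope DNS cost: grid points ~ (L/eta)^3 ~ Re^(9/4) for the
# turbulence alone -- resolving flame structure and stiff chemistry adds
# further cost on top. Illustrative numbers only.

def dns_grid_points(re_turb: float) -> float:
    """Estimate the 3D grid-point count from a turbulent Reynolds number."""
    return re_turb ** (9.0 / 4.0)

for re in (1_000, 10_000, 100_000):
    print(f"Re = {re:>7,}: ~{dns_grid_points(re):.1e} grid points")
```

At Re = 10,000 the estimate is already about a billion grid points per snapshot, before chemistry adds its own cost, which is why DNS of combustion is a petascale problem.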

The recent launch of Oak Ridge National Laboratory’s (ORNL) 20-petaflop Titan supercomputer (earlier post) highlights the importance of HPC as a key enabler for accelerating the development of high-efficiency engines. One of the first six applications modified to run on the new Titan architecture is S3D.

S3D’s first Titan science problem is a 3-dimensional DNS of HCCI combustion in a high-pressure stratified turbulent ethanol/air mixture using detailed chemical kinetics (28 chemical species). Estimated Titan core-hours needed: 128M—an order of magnitude greater than the time required by three other initial problems to be run by other applications, and second only to the 150M core-hours required by a 100-year climate simulation including tropospheric chemistry with 101 constituents at high spatial resolution (1/8 degree).
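A quick conversion puts that allocation in perspective. The node and per-node core counts below are Titan's published CPU configuration; the assumption that a single job could occupy the whole machine at once is ours, for illustration:

```python
# Convert S3D's estimated 128M core-hour allocation into wall-clock time,
# assuming (our assumption) the job could occupy every Titan CPU core at once.
TITAN_NODES = 18_688            # Titan's published node count
CORES_PER_NODE = 16             # one 16-core AMD Opteron CPU per node
ALLOCATION_CORE_HOURS = 128e6   # S3D's estimated requirement

total_cores = TITAN_NODES * CORES_PER_NODE          # 299,008 CPU cores
wall_hours = ALLOCATION_CORE_HOURS / total_cores
print(f"{total_cores:,} cores -> {wall_hours:,.0f} hours "
      f"(~{wall_hours / 24:.0f} days) of full-machine time")
```

Even monopolizing all of Titan's CPU cores, the problem represents roughly 18 days of machine time.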

PreSICE. In 2011, the DOE convened a workshop including 60 US leaders in the engine combustion field from industry, academia, and national laboratories to identify research needs and impacts in Predictive Simulation for Internal Combustion Engines (PreSICE). The workshop focused on two critical areas of advanced simulation, as identified by the US automotive and engine industries:

Fuel injection involves a cascade of complex processes. PreSICE report.

  1. Fuel spray processes. Fuel sprays set the initial conditions for combustion in essentially all future transportation engines, the workshop noted. However, designers now primarily use empirical methods that limit the efficiency achievable. The workshop identified three primary spray topics as focus areas:

    1. the fuel delivery system, which includes fuel manifolds and internal injector flow;
    2. the multi-phase fuel–air mixing in the combustion chamber of the engine; and
    3. the heat transfer and fluid interactions with cylinder walls.
  2. Stochastic processes. Current understanding and modeling capabilities for stochastic processes in engines remain limited and prevent designers from achieving significantly higher fuel economy, the workshop found. To improve this situation, the workshop participants identified three focus areas for stochastic processes:

    1. improve fundamental understanding that will help to establish and characterize the physical causes of stochastic events;
    2. develop physics-based simulation models that are accurate and sensitive enough to capture performance-limiting variability; and
    3. quantify and manage uncertainty in model parameters and boundary conditions (a minimal sketch of this idea follows below).
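The third stochastic-process focus area lends itself to a simple illustration. Below is a minimal Monte Carlo sketch of propagating parameter uncertainty through a model; the scalar response function, the two uncertain inputs, and their distributions are all invented for illustration and are not from the PreSICE report:

```python
import random

# Hypothetical scalar response: indicated efficiency as a function of two
# uncertain inputs, an ignition-delay multiplier and a wall heat-transfer
# coefficient. The functional form and numbers are invented for illustration.
def indicated_efficiency(ign_delay_mult: float, htc: float) -> float:
    return 0.45 - 0.03 * (ign_delay_mult - 1.0) ** 2 - 0.0004 * (htc - 100.0)

random.seed(1)
samples = []
for _ in range(10_000):
    m = random.gauss(1.0, 0.10)    # assumed +/-10% ignition-delay uncertainty
    h = random.gauss(100.0, 15.0)  # assumed heat-transfer coefficient, W/m^2/K
    samples.append(indicated_efficiency(m, h))

mean = sum(samples) / len(samples)
std = (sum((s - mean) ** 2 for s in samples) / (len(samples) - 1)) ** 0.5
print(f"indicated efficiency: mean = {mean:.4f}, std = {std:.4f}")
```

Real engine UQ replaces the toy function with an expensive multi-physics simulation, which is precisely why sample-efficient methods and HPC resources matter here.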
Neroorkar et al. “Simulations and Analysis of Fuel Flow in an Injector Including Transient Needle Effects”, ILASS-Americas 24th Annual Conference on Liquid Atomization and Spray Systems, May 2012.

Improved models and understanding in these areas will allow designers to develop engines with reduced design margins that operate reliably in more efficient regimes, the workshop report stated. All of these areas require improved basic understanding, high-fidelity model development, and rigorous model validation. Such advances would greatly reduce the uncertainties in current models and improve the understanding of sprays and fuel–air mixture preparation that currently limits the investigation and development of advanced combustion technologies.

Because of their relatively low cost, high performance, and ability to utilize renewable fuels, internal combustion engines—including those in hybrid vehicles—will continue to be critical to our transportation infrastructure for decades. Achievable advances in engine technology can improve the fuel economy of automobiles by over 50% and trucks by over 30%.

Achieving these goals will require the transportation sector to compress its product development cycle for cleaner, more efficient engine technologies by 50% while simultaneously exploring innovative design space. Concurrently, fuels will also be evolving, adding another layer of complexity and further highlighting the need for efficient product development cycles. Current design processes, using “build and test” prototype engineering, will not suffice. Current market penetration of new engine technologies is simply too slow—it must be dramatically accelerated.

—PreSICE workshop report

Model hierarchy proposed by DOE and national laboratories spans fundamentals to full vehicle. Source: Wagner and Pannala, prepared in cooperation between ORNL, Sandia National Laboratories, and DOE.

Opportunities. Many concepts for higher efficiency engines historically have been held back by the available technology, Wagner noted in an earlier talk at the US Department of Energy’s (DOE) 2012 Directions in Engine-Efficiency and Emissions Research (DEER) conference. Examples of this include:

  • Direct Injection Spark Ignition. (e.g., Scussel, Simko, and Wade, “The Ford PROCO Engine Update”, SAE Technical Paper 780699, 1978).

  • Low Temperature Combustion. (e.g., Najt and Foster, “Compression-Ignited Homogeneous Charge Combustion”, SAE Technical Paper 830264, 1983; Akihama, Takatori, Inagaki, Sasaki, and Dean, “Mechanism of the Smokeless Rich Diesel Combustion by Reducing Temperature”, SAE Technical Paper 2001-01-0655, 2001.)

  • Dual-fuel Combustion. (e.g., Stanglmaier, Ryan, and Souder, “HCCI Operation of a Dual-Fuel Natural Gas Engine for Improved Fuel Efficiency and Ultra-Low NOx Emissions at Low to Moderate Engine Loads”, SAE Technical Paper 2001-01-1897, 2001; Singh, Kong, Reitz, Krishnan, Midkiff, “Modeling and Experiments of Dual-Fuel Engine Combustion and Emissions”, SAE Technical Paper 2004-01-0092, 2004.)

Advancing computational technology—from HPC supercomputers to on-board computers—is now opening unprecedented opportunities in combustion strategy and controls, Wagner and Pannala said. The next generation of supercomputer (OLCF-4), offering 400+ petaflops, should arrive in about four years, they said, with the first 1-exaflop supercomputer (OLCF-5) following sometime after 2020, introducing the era of exascale computing. (Earlier post.) (A petaflop is 1 quadrillion floating-point operations per second; an exaflop is 1 quintillion.)

We can expect similar growth in computing capability of onboard vehicle hardware, the pair noted.

Simulation efforts fall into three broad areas:

  • Predictive combustion: combustion optimization and methods development;
  • Full engine simulation: engine system optimization and model-based onboard controls; and
  • Full vehicle simulation: technology interactions, component optimization, and supervisory controls.

Each scale of simulation requires a different level of fidelity; an increase in complexity results in an increase in the simulation space and the accompanying computational requirements. There is thus a need, Wagner noted in his DEER talk, for faster simulation, faster optimization methods, and reduced models for on-board controls.
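One common route from expensive simulation to an on-board-sized reduced model is a surrogate fit: run the high-fidelity code at a handful of design points, then fit a cheap regression the engine controller can evaluate in microseconds. A minimal sketch; the one-dimensional NOx-vs-EGR response and its data are invented, and a simple polynomial stands in for whatever reduction method a real program would use:

```python
import numpy as np

# Pretend these eight points came from expensive high-fidelity engine
# simulations: NOx vs. EGR fraction at fixed load. All numbers invented.
rng = np.random.default_rng(0)
egr = np.linspace(0.0, 0.35, 8)                    # design points
nox = 12.0 * np.exp(-8.0 * egr) + rng.normal(0.0, 0.1, 8)

# Reduced model: a cubic polynomial, cheap enough for on-board evaluation.
reduced_model = np.poly1d(np.polyfit(egr, nox, deg=3))

for x in (0.05, 0.15, 0.25):
    print(f"EGR {x:.2f}: reduced model predicts NOx ~ {reduced_model(x):.2f} g/kWh")
```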

Advanced control systems could enable combustion systems to operate closer to the edge of stability. Wagner 2012.

There is a significant opportunity to deliver the required high-efficiency engines through prediction and control: the forced stabilization of inherently unstable systems. Stability has been, and continues to be, a roadblock to many advanced combustion implementations.

The current approach, Wagner noted, is to maintain distance from the edge of stability to avoid unintended excursions. However, he said, dynamic instabilities are short-term predictable and conducive to control.
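A cartoon of what "short-term predictable" buys a controller: if cycle-to-cycle combustion follows even a crude return map, the next cycle can be anticipated from the current one and a small corrective fueling adjustment applied before a weak cycle develops. Everything below (the map, the gain, and the thresholds) is invented for illustration and is far simpler than any production strategy:

```python
import random

random.seed(42)

def next_heat_release(q_prev: float, noise: float) -> float:
    # Invented logistic-style return map: near the stability edge, weak
    # cycles tend to follow strong ones and vice versa.
    return 3.6 * q_prev * (1.0 - q_prev) + noise

def misfire_fraction(cycles: int = 2000, control: bool = False) -> float:
    q, misfires = 0.6, 0
    for _ in range(cycles):
        predicted = next_heat_release(q, 0.0)     # one-cycle-ahead prediction
        correction = 0.0
        if control and predicted < 0.45:          # weak cycle anticipated:
            correction = 0.5 * (0.6 - predicted)  # nudge fueling upward
        q = next_heat_release(q, random.gauss(0.0, 0.02)) + correction
        if q < 0.35:                              # treat very weak cycles as misfires
            misfires += 1
    return misfires / cycles

print(f"misfire fraction, uncontrolled: {misfire_fraction(control=False):.3f}")
print(f"misfire fraction, controlled:   {misfire_fraction(control=True):.3f}")
```

In this toy system the anticipatory correction lifts the weakest cycles above the misfire threshold, which is the essence of stabilizing operation closer to the edge.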

New control opportunities will help optimize the operation of the next generation of high efficiency engines; however, new approaches to calibration of the controls will be necessary for optimization of the next generation of engines, as the control parameter space is already growing rapidly and is expected to continue to do so. Predictive simulation is essential for optimal design and controls, Wagner said.

DOE’s Leadership HPC effort is now being used to investigate instability mechanisms from an engine design perspective, and also to accelerate design optimization. Example ORNL projects in progress with industry include:

  • Large Infrastructure computing for Multi-cycle Instability and Transient Simulations (LIMITS). This project addresses the limits of the fuel economy benefit of dilute combustion, focusing on the stochastic and deterministic processes that drive cycle-to-cycle instabilities (a minimal stability-metric sketch follows this list).

  • Injector design optimization. This project seeks to improve understanding and design optimization of fuel injector hole patterns for improved engine efficiency and reduced emissions.
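Cycle-to-cycle instability of the kind LIMITS targets is commonly quantified by the coefficient of variation (COV) of indicated mean effective pressure (IMEP). The metric is standard practice; the synthetic cycle data below are invented for illustration:

```python
import random
import statistics

def cov_of_imep(imep_cycles: list[float]) -> float:
    """Coefficient of variation of IMEP, a standard cyclic-stability metric."""
    return statistics.stdev(imep_cycles) / statistics.mean(imep_cycles)

# Synthetic 300-cycle IMEP records (bar): a stable calibration versus a
# highly dilute one with occasional partial burns. Data invented.
random.seed(7)
stable = [random.gauss(8.0, 0.15) for _ in range(300)]
dilute = [random.gauss(7.5, 0.40) * (0.75 if random.random() < 0.05 else 1.0)
          for _ in range(300)]

print(f"COV of IMEP, stable calibration: {cov_of_imep(stable):.1%}")
print(f"COV of IMEP, dilute calibration: {cov_of_imep(dilute):.1%}")
# Rules of thumb often treat COV above roughly 3-5% as a drivability limit.
```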


Comments

mahonj

50% performance improvement is quite a step...
Even if they only get half that, it is still something to look forward to.
I suppose the Golf diesel is an example of what can be done with conventional ICEs.

I imagine you would still want some degree of hybridisation to deal with low speed travel, but if ICEs get really good, we might not need to go to BEVs, except for urban use.

kelly

Combustion, in a cylinder, has been intensely studied for a hundred years.

Simulations of untried, unseen battery reactions/byproducts/cycling years would likely provide more 'bang for the buck'.

HarveyD

Come on...high-performance computers have been around for 50+ years and were widely used to figure out how to go to the moon in the 1960s. They will have to find a better excuse for not using them 50+ years ago.

By the way...11 out of the 18 most reliable 2012 vehicles were built by Toyota. NONE from GM, Ford or Chrysler made the 2012 top list.

Is there a message there?

mahonj

Computing "Going to the moon" is really easy compared to fine scale combustion work.

They didn't really have high-power computers in 1969; they had IBM 360s and slide rules, and they did an awful lot on slide rules.
The first Cray-1 wasn't installed until 1976, and those were used for atom bomb simulations etc.

Really big, quite cheap computers have only become readily available recently, in line with Moore's law, which works very well once you have parallel algorithms and multiple cores (as supercomputers do).

sd

@HarveyD

Yes, high performance computers have been around for 50+ years, but the definition of what is called a high performance computer has changed dramatically and will continue to do so. My cell phone has more computing power than the Apollo computers. My job back in 1969 was to run the Lunar Landing simulator for the MIT Instrumentation Lab (renamed Draper Lab). The computers that went to the moon had 64K of memory and were the first practical computers to use integrated circuits instead of discrete transistors. I also worked my way through grad school at MIT and ended up with a doctorate in Mechanical Engineering. I remember taking a class in numerical methods where they taught the theory for finite elements and computational fluid dynamics, but at the time we did not have the computational power to do useful problems. About 20 years ago, I worked on solid modeling and had one of the first solid models of an engine, and we were able to automatically generate CNC code to make parts for a scale model of the engine. At that time, most engineers thought that solid modeling was not a practical solution for engineering design. Even 20 years ago, CFD was still something you needed supercomputers for. Now I can use my laptop.

As for your other off-topic message, you must be getting your information from some alternate universe.

HarveyD

Please Google search 1960-1980 computer history (and other similar sources) and you will find that many of the basics for our current higher-speed computers were developed during those 20 years. However, many improvements were made in the following 32 years and are still being made. The same applies to most batteries and ICEVs.

As for 2012 vehicles, it is a well-known published CR fact that none of the Big-3 units were among the 18 most reliable vehicle types. Toyota really was awarded 11 of the top 18 types. That can hardly be changed. The most trouble-free mid-size hybrid was the 2012 Camry Hybrid. That is also past history that cannot be changed.

Will 2013 be different? Will the Big-3 be awarded 1 or 2 places out of 18? Will South Korea and China also be awarded a few places? The 2013 Camry Hybrid may still be one of the most reliable mid-size hybrids on the world marketplace.

Gorr

All these two researchers did was enthusiastic PR to keep their jobs. Why not increase the performance of the gasoline itself instead of increasing the performance of the ICE? Do you seriously think you will magically get more mpg with the same gasoline that hasn't changed in 100 years? Also, if they ever find a way to increase gasoline power, it will probably be incompatible with the current ICE, and it will then take another engine to be compatible. So again, like battery chargers, numerous complicated infrastructures.

One thing that they discovered to be better was the hydrogen fuel cell, so I'm ordering this product right now and saving money on gasoline ICE studies. Put hydrogen fuel cell cars and trucks on the market now, nationalize any car companies that won't comply, and start building efficient hydrogen stations or put the hydrogen maker directly into the car. Ask these two researchers and they will say, like the other paid researchers, that they are close to finding a miracle with the gasoline ICE. Their job is more to do PR for the current ICE than to really find a miracle increase in mpg. If these researchers were working in private labs and were paid only when they found a solution to sell, they would be poor, and they would change their domain of research toward hydrogen instead, which is more promising.

Herm

Scientists don't do PR work or lie about anything...

sd

@HarveyD

If your reference "CR" refers to Consumer Reports, then it might as well be in an alternate universe. They have no understanding of statistics. Also, I have seen them give radically different reports for cars that are made on the same line by the same workers in random order. Reading Consumer Reports is almost as amusing as reading the comments on this board. They are about the last source that I would use for making a rational purchasing decision. True idiots.

You could look at J.D. Power, but their reports rely on initial quality and not long-term reliability.

There is no doubt that many of the basics of computing have been known since around WW II, but the cost and speed of computing have now made practical what was only a theoretical dream. We now have machines with teraflop capability. However, computer science has also made gains in computational theory.

HarveyD

What is the value of various customer satisfaction surveys (often falsely referred to as studies)?

Is a J.D. Power paid survey (often biased towards whoever paid the bill) of some 12,000 car owners better or more accurate than CR's survey of 110,000+ car owners? That remains to be proven.

J.D. Power has always been biased towards the Big-3. That is normal when you have been working for the Big-3 for so many years.

Can't disagree with you that the performance of computers has gone up and their cost has gone down, and both are still improving at a fast pace. Too bad that battery development is much slower. One can only hope that it will soon change.
