Analysis of the Low Pressure Circuit of a Gasoline Direct Injection System

MTZ worldwide ◽  
2016 ◽  
Vol 77 (2) ◽  
pp. 56-61
Author(s):  
Michael Spitznagel ◽  
Uwe Iben ◽  
Ronny Leonhardt ◽  
Michael Bargende

2010 ◽  
Author(s):  
Stephan Schmidt ◽  
Martin Joyce ◽  
Jonathan Wall ◽  
Alexander Trattner ◽  
Roland Kirchberger ◽  
...  

2000 ◽  
Author(s):  
Shinji Ueda ◽  
Yukio Mori ◽  
Eiji Iwanari ◽  
Yoshitomo Oguma ◽  
Yousuke Minoura

2019 ◽  
Author(s):  
Soichi Saitoh ◽  
Hitoshi Shibata ◽  
Masahiro Ookuma ◽  
Masahiro Shigenaga

Author(s):  
Shima Nazari ◽  
Anna Stefanopoulou ◽  
Jason Martz

Turbocharging and downsizing (TRBDS) a gasoline direct injection (GDI) engine can reduce fuel consumption, but at the cost of increased drivability challenges compared to larger-displacement engines. This trade-off between efficiency and drivability is influenced by the throttle-wastegate control strategy, and it becomes more severe with the introduction of Low-Pressure Exhaust Gas Recirculation (LP-EGR). This paper investigates and quantifies these trade-offs by designing two prototypical throttle-wastegate strategies that bound the achievable engine performance with respect to efficiency and torque response, and implementing them in a one-dimensional (1D) engine simulation. Specifically, a closed-wastegate (WGC) strategy for the fastest achievable response and a throttle-wastegate strategy that minimizes engine back-pressure (MBWG) for the best fuel efficiency are evaluated and compared based on their closed-loop responses. The simulation of an aggressive tip-in (the driver’s request for a torque increase) shows that the wastegate strategy can negotiate a 0.8% efficiency gain at the expense of a 160 ms slower torque response, both with and without LP-EGR. The LP-EGR strategy, however, offers a substantial 5% efficiency improvement accompanied by an undesirable 1 second increase in torque response time, clarifying the opportunities and challenges associated with LP-EGR.
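The efficiency/response trade-off described in this abstract can be illustrated with a toy first-order model. The sketch below is not the paper's 1D engine simulation; the torque_rise_time helper, the time constants, and the strategy labels are assumptions chosen only to show how a 90% torque-rise metric could be compared across two bounding strategies.

```python
import numpy as np

def torque_rise_time(tau, target_frac=0.9, dt=0.001, t_max=5.0):
    # Time for a normalized first-order torque step response to reach
    # target_frac of the requested torque (a stand-in for a tip-in).
    t = np.arange(0.0, t_max, dt)
    torque = 1.0 - np.exp(-t / tau)
    return t[np.argmax(torque >= target_frac)]

# Hypothetical boost-dynamics time constants, NOT values from the paper.
strategies = {
    "WGC  (closed wastegate, fastest response)": 0.30,
    "MBWG (minimum back-pressure, best efficiency)": 0.37,
}

for name, tau in strategies.items():
    print(f"{name}: t90 = {torque_rise_time(tau) * 1000:.0f} ms")
```

Under these assumed time constants, the slower-boosting but more efficient strategy shows a response penalty of the same order as the 160 ms figure quoted above.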


2004 ◽  
Vol 37 (22) ◽  
pp. 273-278 ◽  
Author(s):  
E. Alabastri ◽  
L. Magni ◽  
S. Ozioso ◽  
R. Scattolini ◽  
C. Siviero ◽  
...  

Author(s):  
Zhang Ming ◽  
Zhong Jun ◽  
Capelli Stefano ◽  
Lubrano Luigi

The development process of a downsized turbocharged gasoline direct-injection (GDI) engine/vehicle is presented in part, with a focus on particulate matter (PM)/particle number (PN) emission reduction. To achieve this goal, the injection system was upgraded to provide higher injection pressure. Two types of prototype injectors were designed and compared under critical test conditions, and a combined numerical and experimental analysis was carried out to select the better injector in terms of particle emissions. With the selected injector, the effect of injection parameter calibration (injection pressure, start of injection (SOI) timing, number of injection pulses, etc.) on PM/PN emissions was illustrated. The number of injection pulses, SOI timing, and injection pressure were found to play the leading roles in suppressing particle emissions. With a single-injection strategy, injection pressure and SOI timing were found to be the dominant factors for reducing particle emissions under warm-up and cold conditions, respectively; a suitable combination of injection timing and injection pressure can generally reduce PM emissions by up to 50% over a wide range of the engine map, while multiple injections can achieve a reduction of up to an order of magnitude. Several New European Driving Cycle (NEDC) emission tests were run on a demonstration vehicle to evaluate the effect of the injection system upgrade and the adjusted calibration. This work will provide a guide for the emission control of GDI engines/vehicles fulfilling future emission legislation.
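The calibration effort described above can be sketched as a small grid search over injection parameters. The snippet below is purely illustrative: the pm_index model, the parameter values, and the SOI convention are assumptions, not the paper's data or calibration results.

```python
from itertools import product

# Hypothetical calibration grid (values are illustrative, not from the paper).
injection_pressures = [200, 250, 350]   # bar
soi_timings = [300, 280, 260]           # deg CA before TDC (assumed convention)
pulse_counts = [1, 2, 3]

def pm_index(pressure, soi, pulses):
    # Toy particulate index: in this made-up model, higher pressure,
    # earlier SOI, and more injection pulses all lower the index.
    return 100.0 / pressure + 0.01 * (300 - soi) + 5.0 / pulses

best = min(product(injection_pressures, soi_timings, pulse_counts),
           key=lambda cal: pm_index(*cal))
print("lowest-PM calibration (pressure [bar], SOI [deg bTDC], pulses):", best)
```

In a real calibration, each grid point would correspond to a measured or simulated PM/PN value rather than an analytic formula; the sketch only shows the structure of such a sweep.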

