Was it feasible to convert engine heat to extra power on WW2 era engines?


ironnet (Recruit), Mar 12, 2018
I apologize in advance if my question is absurd.

WW2 liquid-cooled engines already had cooling systems: radiators, piping, coolant, etc. Would it be possible to add something like a Stirling engine to the cooling system and use it to power a supercharger, for example? Would it add much more weight or complexity, or compromise cooling efficiency? They used turbochargers, after all, and turbochargers also increased complexity and required strategic materials. Exhaust gases and engine heat are the two main sources of energy inefficiency in piston engines, as far as I know. They harnessed one (to a degree) with turbochargers but did nothing for the other main source of energy loss that I am aware of.

I suspect there is a good reason for this, but I want to hear opinions of much more learned members of this forum.

I apologize for my bad English.
 
It's not an absurd question; the issue is that the temperature difference between engine coolant and ambient air is small, so the efficiency, and therefore the potential power, is quite low.
Using exhaust waste heat is more practical, and has been tested on small diesels, like those in over-the-road trucks.
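A back-of-envelope sketch of why the coolant route is unattractive: any heat engine on the coolant loop is bounded by the Carnot efficiency between coolant and ambient temperatures, and a real Stirling machine achieves only a fraction of that. The coolant temperature and the one-third-of-Carnot factor below are illustrative assumptions, not figures from the thread.

```python
# Carnot bound on a hypothetical Stirling bottoming cycle driven by
# engine coolant heat. Temperatures are assumptions for illustration.

def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Theoretical upper bound on heat-engine efficiency between two reservoirs."""
    return 1.0 - t_cold_k / t_hot_k

coolant = 120.0 + 273.15   # ~120 C pressurised coolant (assumed)
ambient = 15.0 + 273.15    # sea-level standard-day air

limit = carnot_efficiency(coolant, ambient)
print(f"Carnot limit: {limit:.1%}")                          # theory only
print(f"Realistic Stirling (~1/3 of Carnot): {limit / 3:.1%}")
```

Even the theoretical limit is modest, and a practical machine light enough for an aircraft would capture far less, before counting its own weight and drag.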
 
Thank you very much for your answer. Would it be feasible with a coolant which has higher boiling point than water?
 
The Meredith-effect radiators (on P-51s, Mosquitos and Spitfires, among others), exhaust thrust on most engines, and the turbo-compound are all examples of heat/power recovery.
 
Is it possible to add something like a Stirling engine to the cooling system, and power a supercharger with it, for example? Would it add much more weight or complexity or compromise cooling efficiency?
I would think the hp this would provide would be far too low to accomplish much for the extra weight and complication.

One weird related idea that may have worked was surface skin radiators, i.e. the skin of the plane was used as the radiator. This greatly reduced drag, but it didn't work all that well at low speeds and was extremely vulnerable to battle damage.
 
This was part of the Spitfire's original design, abandoned very early on. The Supermarine racers used it in the wings and floats.
 
Thank you for your answer. That's right, but AFAIR (and I am by no means an expert) there are two main sources of energy loss in internal combustion engines, i.e. exhaust loss and heat loss via the surface of the engine. Exhaust energy was somewhat harnessed via exhaust thrust or turbochargers, but apart from the Meredith-effect radiators, surface heat stayed unused. Liquid-cooled engines already had radiators, piping, etc., so the complexity was already there to some degree. I wonder if it's possible to create a coolant flow back and forth between the cool area of the radiators and the hot area of the engine surface, and transfer this energy via something like a compact water turbine to power a supercharger. I read that surface heat loss accounts for around 30 percent of the energy created in the engine, while exhaust loss and useful mechanical work each account for around 35 percent. If I understood that correctly, heat rejection had a similar amount of energy potential to the power on the shaft; if 10 percent of that amount could be converted, it should be enough to power a S/C. I guess around 150 hp could be sufficient.

Putting a water turbine inside the coolant flow likely slows down that flow, and probably necessitates a larger cooling surface area and/or more coolant, i.e. a weight increase, I guess.

I suspect that if it were worth the trouble, industry would already be using something similar on some ground-based engines, at least after the war. But who knows; aviation engines have different priorities, and cooler air at higher altitudes might make some difference in efficiency. But I am way in over my head there.
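The arithmetic in the post above can be sketched with its own figures (35% shaft work, 35% exhaust, 30% surface heat, 10% recovery); the 1,500 hp engine size is an assumption added here for scale, not from the thread.

```python
# Sketch of the poster's estimate, using the 35/35/30 energy split and
# a 10% recovery fraction from the post. The 1,500 hp shaft output is
# an assumed figure for illustration (roughly a late-war V-12).

shaft_hp = 1500.0       # assumed shaft output
shaft_frac = 0.35       # fraction of fuel energy reaching the shaft
coolant_frac = 0.30     # fraction rejected via coolant / engine surface
recovery_frac = 0.10    # fraction of that heat converted back to work

fuel_power_hp = shaft_hp / shaft_frac            # total fuel energy rate
coolant_heat_hp = fuel_power_hp * coolant_frac   # heat dumped to coolant
recovered_hp = coolant_heat_hp * recovery_frac

print(f"Heat to coolant: {coolant_heat_hp:.0f} hp-equivalent")
print(f"Recovered at 10%: {recovered_hp:.0f} hp")
```

Under those assumptions the recovered power comes out near 130 hp, in the same ballpark as the ~150 hp the poster guessed; the hard part, as the replies note, is converting even 10% of low-grade coolant heat with hardware light enough to fly.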
 
The big problem is harnessing/recovering the energy with a small/light device/duct.
Stationary power plants (or truck engines) can devote large amounts of volume and or weight to increasing efficiency or recovering otherwise lost energy.

For aircraft, every pound of energy-recovery apparatus means either less performance or one pound less cargo (weapons load for the military) or fuel.
That assumes no increase in drag.
Getting some of the Meredith systems to actually work took a bit of doing; getting them to work at a variety of speeds and altitudes (air densities) took a lot more doing.
 
We had the GE turbosuperchargers on the B-17 increasing manifold pressure for takeoff and altitude flight.
 
About half of the total energy in a typical WW2 supercharged engine was dumped through the exhausts, though granted a good deal of that energy was transformed into exhaust thrust. Having a turbine (fed by exhaust gases) directly coupled to the crankshaft was one solution for using the 'waste' energy.
'Balance sheet' of the said engine:

[Attachment: engine energy balance sheet (ballanceSheet.jpg)]
 
Also, I believe that about half of the total energy being converted is the best-case scenario; at full power it is even worse.
 
For a good spark-ignition engine, the sfc is about 0.4 lb/hp-hr; gasoline has a lower heating value of about 19,100 BTU/lb, so only about 1/3 of the fuel's energy is converted to useful work, and that's at best cruise.
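That one-third figure follows directly from the numbers given, using the standard conversion of 1 hp-hr = 2,544.43 BTU:

```python
# Checking the efficiency claim: sfc of 0.4 lb/hp-hr and a gasoline
# lower heating value of 19,100 BTU/lb (figures from the post above).

SFC = 0.4            # lb of fuel burned per hp-hr of output
LHV = 19_100.0       # BTU per lb, gasoline lower heating value
HP_HR_BTU = 2544.43  # BTU in one horsepower-hour (standard conversion)

fuel_energy = SFC * LHV               # BTU of fuel burned per hp-hr out
efficiency = HP_HR_BTU / fuel_energy  # brake thermal efficiency
print(f"Brake thermal efficiency: {efficiency:.1%}")
```

The result is just over 33%, i.e. about one-third of the fuel's energy reaching the shaft, with the remainder split between the exhaust and cooling losses discussed above.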
 
You will never get 100% efficiency out of any machine. Modern Formula 1 racing cars are interesting, as they have energy- and heat-recovery systems: they store energy in a lightweight battery and dump it back in via a high-speed electric motor. Imagine a fighter that could convert its excess dive speed into energy to climb again!
 