Which country designed the best aircraft engines for WWII?
I burned through many pistons on two-strokes. I know it is a different technology, but once the process starts it takes seconds or minutes, not hours.

That depends entirely on the severity of the detonation; many engines run their entire lives with minor detonation. It is not a binary process, and it varies from minor to catastrophic in severity.

Please consult the attached graph showing several different measured knock intensities ("Knocking combustion in spark-ignition engines", 17 July 2016, State Key Laboratory of Automotive Safety and Energy).

 
Oh, and to reiterate the main point:
Bigger pistons are more likely to develop hot spots. The piston crown is by necessity kept as low-mass as possible to reduce reciprocating stresses, which leaves a thin section across the piston that heats up easily; the metal can only absorb so much heat, so it has to be sunk away somewhere, and these thin areas become the nidus for detonation. That is why manufacturers of pistons for boosted engines generally try to avoid valve cutouts or anything else that leaves an even thinner cross-section where a hot spot can develop... OK, enough typing on my iPhone.
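To put rough numbers on that crown-thickness argument, here is a minimal back-of-envelope sketch (my illustration, not from this thread; the heat flux, conductivity, and dimensions are all assumed). It models the crown as a flat web that absorbs combustion heat and conducts it sideways to the cooled ring belt; the center-to-edge temperature rise scales with 1/thickness, which is the hot-spot mechanism described above:

```python
# 1-D fin-style estimate of how hot a thin piston crown runs.
# All numbers are assumptions for illustration, not engine data.

Q_FLUX = 5.0e5    # W/m^2, assumed combustion-side heat flux under boost
K_ALLOY = 150.0   # W/(m*K), rough conductivity of a piston aluminum alloy
L_PATH = 0.030    # m, assumed conduction path from crown center to ring belt

def center_overheat(thickness_m: float) -> float:
    """Crown-center temperature rise over the ring belt:
    dT = q * L^2 / (2 * k * t). Ignores oil cooling under the crown,
    so absolute values run high; the point is the 1/t scaling."""
    return Q_FLUX * L_PATH**2 / (2.0 * K_ALLOY * thickness_m)

for t_mm in (12.0, 8.0, 5.0):
    dt = center_overheat(t_mm / 1000.0)
    print(f"t = {t_mm:4.1f} mm -> center ~{dt:4.0f} K hotter than the ring belt")
```

In this toy model, halving the crown thickness doubles the overheat, which is why thin sections around valve cutouts make such good detonation nuclei.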
 

Direct injection does NOT increase homogenization by itself whatsoever; it actually makes it considerably worse, because the mixing time is drastically shorter than with port injection or carburation, and there is far less turbulence to mix the charge, since the mixture does not pass through the port-and-tumble mechanism. This is why modern direct-injection systems have to run at astonishingly high fuel pressure to work properly: port injectors can run happily at about 50 PSI, while a modern petrol DI system runs well over 2000 PSI. The principal advantage is the elimination of wall-wetting, which is a serious problem for legal emissions requirements; it also improves fuel economy appreciably, because you're actually burning more of your fuel. The fuel dripping off the port walls doesn't really combust fully with port injection or carbs.
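The mixing-time point is easy to quantify. Here is a hedged sketch (my arithmetic; the crank-angle windows are assumptions typical of textbook descriptions, not figures from this post) comparing how long each system gets to atomize and mix fuel at a given engine speed:

```python
# Rough comparison of fuel mixing windows: port vs direct injection.
# Crank-angle windows below are illustrative assumptions.

RPM = 6000.0
MS_PER_DEG = 60.0 / RPM / 360.0 * 1000.0  # milliseconds per crank degree

# Port injection can spray onto a closed valve during the previous cycle,
# so the usable mixing window approaches most of the 720-degree cycle.
PORT_WINDOW_DEG = 600.0
# Homogeneous-mode DI can only inject once the intake valve opens, leaving
# roughly the intake plus compression strokes, ~360 degrees at best.
DI_WINDOW_DEG = 360.0

print(f"{MS_PER_DEG:.4f} ms per crank degree at {RPM:.0f} rpm")
print(f"port injection mixing window:   ~{PORT_WINDOW_DEG * MS_PER_DEG:.1f} ms")
print(f"direct injection mixing window: ~{DI_WINDOW_DEG * MS_PER_DEG:.1f} ms")
```

With roughly half the time (or less) to mix, the DI injector has to atomize far more finely up front, which is what the 2000+ PSI rail pressure buys.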
 
I would note that some people claim that P&W gave up on big cylinders after experiencing burned pistons with the Hornet "B" (R-1860) in the very late 20s/early 30s (6.25in bore x 6.75in stroke). No such trouble was experienced with the smaller Hornet A (R-1690) at 6.125in bore and 6.375in stroke.
P&W went to 14 cylinders of smaller size instead, and stayed with smaller cylinders than rival Wright thereafter.

However, there is a huge difference in metallurgy between 1930 and 1942/43, and even in air-cooled engines a huge difference in cylinder cooling (fin area on air-cooled cylinders more than doubled in less than ten years and then increased even more radically during WW II), let alone liquid-cooled cylinders (which went from water, to Prestone, to a pressurised water/Prestone mix).

There are too many variables to really blame one item/feature but DB didn't do themselves any favors by running the compression ratios they did.
 
The DB605's bore is just 0.06in (1.5mm) bigger than a Griffon's, and is actually smaller than a Cyclone's bore; neither of those had any similar issues.

And they were using 100/130 or 100/150, not 87. They were also not using an automated mixture/RPM/throttle computer to match all these variables to one lever, which would have to be standardized somehow and account for manufacturing variations and individual engine quirks when running at the limit of the fuel's ability to suppress detonation.
There's a lot more than sheer absolute cylinder compression that goes into this, including valve overlap/timing, spark timing, and so on. The bottom line is that fuel with higher detonation suppression gives both the designer and the pilot a bigger margin for error...

Hot spots don't matter unless they create detonation, and this happens a lot more when the fuel is 87 octane instead of 130 PN or 150 PN... Ask yourself why the USAAF bomber pilots could always spot the Luftwaffe rising from the ground to intercept them: at full throttle the DB605 ran way overrich to suppress detonation, and left a soot trail behind it...
 
The first R-1820 Cyclones used 80 octane fuel, but the compression ratio was 5.0:1 and the amount of boost was minimal.

Same with the RR Buzzard: 152.5mm bore, but 5.5:1 compression and low boost to suit 77 octane fuel.

The Germans pushed a little too hard for the available fuel and cooling.
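One hedged way to see how much harder the Germans were pushing (my sketch, not from this thread; the manifold pressures and compression ratios below are approximate, and the polytropic exponent is an assumption) is to compare end-of-compression charge pressures, p ~ p_manifold x CR^n:

```python
# Approximate end-of-compression pressure: p_peak ~ p_manifold * CR**n.
# All inputs are rough/assumed values, just to illustrate the trend.

N_POLY = 1.3  # assumed polytropic exponent for a fuel-air charge

def peak_pressure_atm(manifold_atm: float, cr: float) -> float:
    """Charge pressure at the end of the compression stroke, in atm."""
    return manifold_atm * cr ** N_POLY

engines = [
    ("early R-1820, 80 octane", 1.00, 5.0),   # essentially unboosted
    ("RR Buzzard, 77 octane",   1.10, 5.5),   # mild boost
    ("DB 605A, B4 (87 octane)", 1.42, 7.4),   # ~1.42 ata, CR ~7.3-7.5
]
for name, map_atm, cr in engines:
    p = peak_pressure_atm(map_atm, cr)
    print(f"{name:26s}: ~{p:4.1f} atm before ignition")
```

More than double the early Cyclone's charge pressure on barely better fuel: that is the sense in which they pushed too hard for the fuel and cooling available.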
 

Hmmm, the Twin Cyclone was 6.9:1... and ran 21 PSI boost on 100 octane, a lot more than DB did for most of their engines. Again, please provide references for your postulations.
 
Not postulations, just notes on what existed in the late 20s or very early 30s.
Wright compression ratios and fuel from http://www.enginehistory.org/Piston/Wright/C-WSpecsAfter1930.pdf

Wright names are a bit confusing; the Twin Cyclone is the 14-cylinder R-2600, and it would be a rare one indeed that operated at 21 psi boost.

Most ran at 8lbs boost (16in above sea-level pressure?) max.

The Duplex or double-row Cyclone is the 18-cylinder R-3350, and while a number of them were rated at 21in of boost(?) or even a bit more, it doesn't seem to have been on 100 octane fuel.
One FAA data sheet shows 2800hp at 2900rpm for one version at 54in total (24in of boost?), but using 115/145 fuel. When using 100/130, the engine was limited to 2350hp at 2600rpm and 46.5in manifold pressure (16.5in boost?):
http://rgl.faa.gov/Regulatory_and_G...0b90ebf2f1be08525670b006c8762/$FILE/E-270.pdf

Using 21in of boost (10.5lbs), i.e. 51in total manifold pressure, with the older 100 octane fuel would be remarkable.
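The psi-versus-inches bookkeeping above trips people up, so here is a minimal conversion helper (my addition; it just uses the standard-atmosphere figures 14.7 psi = 29.92 inHg) that reproduces the numbers in this post:

```python
# British "boost" is psi above one atmosphere; US practice quotes absolute
# manifold pressure in inches of mercury (inHg). Standard sea level is
# 14.7 psi = 29.92 inHg, so 1 psi = 2.036 inHg.

SEA_LEVEL_INHG = 29.92
INHG_PER_PSI = 2.036

def boost_inhg(map_inhg: float) -> float:
    """Inches of boost above sea level for an absolute manifold pressure."""
    return map_inhg - SEA_LEVEL_INHG

def map_from_psi_boost(psi: float) -> float:
    """Absolute manifold pressure (inHg) for a given psi of boost."""
    return SEA_LEVEL_INHG + psi * INHG_PER_PSI

print(f"8 lb boost  -> {map_from_psi_boost(8):.1f} inHg total, "
      f"{map_from_psi_boost(8) - SEA_LEVEL_INHG:.1f} in above sea level")
print(f"54 inHg MAP -> {boost_inhg(54.0):.1f} in of boost")
print(f"46.5 inHg   -> {boost_inhg(46.5):.1f} in of boost")
print(f"21 in boost -> {21.0 / INHG_PER_PSI:.1f} psi, "
      f"{SEA_LEVEL_INHG + 21.0:.1f} inHg total")
```

This matches the 16in, 24in, and 16.5in figures quoted above to within rounding, and shows 21in of boost is about 10.3 psi.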

Wright more than doubled the fin area on the Cyclone 9 between about 1930 and 1938/39, and then added substantially to it in the late-war engines. If the cylinder walls are cooler, would that not aid the transfer of heat from the piston to the cylinder walls and help keep the piston head cooler?
 
So, to put this thread back on topic, I'm going to reiterate that the US made the best of the best engines during the war. The best two were the R-1830 and the R-2800, and what makes them the best is that they are still going; even many years after the war they were in working aircraft earning dollars.
You cannot say that for the recip engines from any other country that originated or were manufactured during or shortly after the war. Especially now, there is just a very small handful of non-Pratt & Whitney engines running in flying aircraft. Allisons and Merlins work, sure, but the TBO is horrible, and I don't know of many that were flying passengers in the late 60s and 70s. R-3350s can be factored in as well; even the R-4360 had a great service record from the 1950s into the early 70s, I think. So US aircraft recips from WW2 were in service for about 30 years afterwards. How many Napiers, Centauruses, Allisons, Merlins, Griffons, or other countries' engines lived that long? And I'm sure there are still R-1830s and R-2800s flying for hire someplace in Canada, Alaska, and other countries as we write this. So the best are still in use.
 
Uh, try looking at the economics of air transport.
I am not saying the P&W engines weren't great engines, but using the factoid that some are still in service while other engines are not doesn't really explain why.
It is bad logic.

They built more R-1830s than any other engine, which left a huge supply of used engines and spare parts. They built over 10,000 C-47s, thousands of which were sold as surplus after the war, creating a demand for R-1830s in the commercial market over and above the 600 or so commercial DC-3s built.

For many years the surplus C-47s were a major factor in the airline/air transport market.
There were also hundreds if not thousands of Curtiss C-46 aircraft sold as surplus to commercial operators.

There was NO commercial market for German, Japanese, or Italian engines after the war, and darn little way of satisfying one if there had been. With the factories bombed to rubble and the workforces scattered, the old Axis powers were in no shape to provide any but the most rudimentary support for existing engines, and certainly not in a position to build new/improved versions.
Russia very quickly became the new enemy, and the western world was not buying aircraft and engines from Russia, so that leaves Britain and France.
Pre-war engines were not going to sell (although both Britain and France tried briefly; they were desperate for foreign money), and only Britain had "modern" engines (engines benefiting from wartime experience), although France did try to market the Jumo 213.

Britain had four engines: the Merlin, the Sabre (a commercial non-starter), and the Bristol Hercules and Centaurus. However, they had few airframes to put them in, and few (if any) commercial operators wanted liquid-cooled engines if they had any choice.
Remember that the Lockheed Constellation was already in production (in small numbers) during the war, and the prototype DC-6 flew in February of 1946. The British were several years behind.

What you wind up with is a commercial market dominated by P&W and, to some extent, Wright; only they had enough of a market to develop new versions of their engines. Most commercial airliners built after the war got not just newly built engines but new versions; on the R-2800, for instance, P&W went to connecting rods longer than the wartime engines', with pistons modified to suit.
While P&W didn't build many R-1830s after WW II, there was certainly a large market for spare parts.
P&W built 3,078 R-2800s in 1946-49 and 2,423 from 1954-60 (not counting Korea), and while many of them were military engines (or military transport engines), one can see that the market for spare parts and overhaul services for these engines would dwarf any such demand/market for the British engines, which were made in comparative handfuls.
Many warbirds in the late 50s and 60s (and later?) were actually flying with surplus DC-6 airliner engines rather than the correct wartime engine types, because low-time airline engines were so cheap.

Use of engines now has much more to do with the availability of parts and servicing (overhauls) than with any design feature or quality of the original 1940s build. However, without good design features and high build quality, no engine of the 1940s would have survived into the 50s/60s to develop the parts supply and overhaul support needed to last longer.
 
The Bristol Hercules had a decent commercial life: new engines were still being built up to 1958 IIRC, and spare parts were being made in the 1970s for the RAF's Handley Page Hastings. Bristol Freighters using Hercules engines were still flying in the 1990s.

If you're going to use commercial life as an indicator of the best wartime engine, then the bestest engine ever in the whole wide world is the Shvetsov ASh-62; I believe there is a company in China that can still build you a brand-new engine from spare-parts stock.

Edit: I just found that the last commercial flight of a Hercules-engined aircraft was in 2004.
 
The Bristol radials were very fine engines. Their chances of true commercial success were thwarted by, perhaps, high cost (figures are hard to obtain) and by the fact that only a few non-British airframes used them.
The Merlin only achieved commercial use because the British, in the harsh economic times of the late 1940s, imposed a large tariff on imported American engines (not just for Britain but for the Commonwealth, including Canada) to help prevent further erosion of the balance of payments.

The surplus WW II transports soaked up the lower end of the transport market for a number of years (the 20-40-seat transports, and even the 5-9-seat market via surplus Beech 18s), so new designs tended toward 40 seats and above.

Even P&W made a commercial blunder in bringing out the R-2180 (half of an R-4360), as it was too small and there was no real market for it; the primary user was the Saab 90, of which only 18 were built.
 
There are too many variables to really blame one item/feature but DB didn't do themselves any favors by running the compression ratios they did.

I always had the impression that the cooling of the DB engines suffered from having dry cylinder liners. They did try wet liners in the DB609 and possibly other prototype engines, but this feature never reached production as far as I know. It may be that the usable boost was limited by cooling issues and that they made a virtue out of this situation by raising the compression ratio.
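For intuition on why a dry liner hurts, here is a toy series-resistance comparison (entirely my sketch; the thicknesses, conductivities, and contact conductance are assumed round numbers, not DB data). A dry liner's heat must cross a press-fit interface and the block wall before it reaches coolant, whereas a wet liner's outer surface touches the coolant directly:

```python
# Toy 1-D series thermal resistance: dry vs wet cylinder liner.
# Every number below is an assumption for illustration only.

Q_FLUX = 3.0e5                 # W/m^2, assumed bore heat flux at high power
K_STEEL, K_ALU = 40.0, 150.0   # W/(m*K), rough conductivities

# Per-area thermal resistances, m^2*K/W:
wet_liner = 0.004 / K_STEEL            # 4 mm liner, coolant on its far side
dry_liner = (0.002 / K_STEEL           # 2 mm liner
             + 1.0 / 10_000.0          # press-fit contact, ~10 kW/(m^2*K)
             + 0.006 / K_ALU)          # 6 mm block wall out to the coolant

for name, r in (("wet liner", wet_liner), ("dry liner", dry_liner)):
    print(f"{name}: bore wall runs ~{Q_FLUX * r:3.0f} K above the coolant")
```

Roughly double the temperature drop for the dry arrangement in this toy model, which is consistent with the idea that cooling, not just fuel, capped the usable boost.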
 
They only built 8,000 or so Rolls-Royce Griffons, but they survived in active military service until the 1990s.

FWIW, I think it is odd that Rolls-Royce tried to sell the Merlin as a commercial engine rather than the larger and less stressed Griffon. The big advantage of the Merlin was its name and reputation.
 
