Improve That Design: How Aircraft Could Have Been Made Better


Shortround is right when he says it is fantasy to propose 1944 engines in 1941.

Remember that the 1,525 hp Wright R-1820 built in the 1950s started life at 700 hp in 1931.
Even if you have a good basic engine it takes years to get maximum power and good reliability out of it.

The Pratt R-1830 likewise grew from 800 hp in 1932 to 1,350 hp in the mid-40s.

The Sakae was first RUN in 1939. It did not have the almost 10 years of development that the Pratts and Wrights had by 1941, yet it had a remarkably low SFC and reasonable reliability from day one.

Both American manufacturers were into production runs far greater than any Japanese engine long before 1941, and much development comes from the experience gained by running the engines in flying aircraft. Until any engine is in service in numbers (in the days before computers) you had no way of knowing whether that cracked cylinder head or other defect was a one-off, a result of the powerplant design, of pilot mismanagement, of pilot training, of bad manufacturing, or truly an engine design fault. Even when a problem became known it still took a lot of research to determine the cause - look at the Allison reliability in the P-38 as an example, all caused by the pilots obeying the USAAC flight manual instead of Lockheed's and Allison's operating instructions.

Even now, on computer-designed FADEC engines with the aircraft effectively flown by the FMC (both of which take most pilot problems off the list), the same applies. Just look at all the problems operators of the newer Rolls-Royce Trent engines have been having for the last five years - design problems on an engine that has been in operation for almost 30 years. Maybe they have kept the Trent going beyond its practical stretch limits?
 
To expand on what MiTasol said: P&WA and CW had a lot of those engines in commercial service pre-war. Commercial aircraft spend much more time flying than military ones. (The expected usage of the CH-53 by the USMC was 360 hours per year; this may be lower than the expected usage of military combat aircraft in the 1930s, but the Marines did not pull that number from their collective behinds: they got it from peacetime and Vietnam-era usage.) In contrast, commercial users would be looking at something like ten to fifteen times that. A commercial plane on the ground is losing money; a military plane on the ground isn't costing money.
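A quick back-of-the-envelope check of that gap; the 360 h/yr is the CH-53 figure above, and the 10-15x multiplier is the estimate from this post:

```python
# Utilization gap sketch: military planning figure vs. an assumed
# commercial multiplier (both numbers taken from the post above).
military_hours_per_year = 360          # USMC CH-53 planning figure
for multiplier in (10, 15):
    annual = military_hours_per_year * multiplier
    print(f"{multiplier}x -> {annual} flight-hours/year (~{annual / 365:.1f} h/day)")
# 10x -> 3600 flight-hours/year (~9.9 h/day)
# 15x -> 5400 flight-hours/year (~14.8 h/day)
```

Those 10-15 hour days line up with the jet-era utilization figures quoted later in the thread.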
 
It's not fantasy to propose Japan look at everyone else and find the same.
The Japanese were looking.

Ki-44 first flight Aug 1940
Ki-60 first flight March 1941
Ki-61 first flight Dec 1941

For the Navy, from wiki:
"Towards the end of 1940, the Imperial Japanese Navy asked Mitsubishi to start design on a 16-Shi carrier-based fighter, which would be the successor to the carrier-based Zero. At that time, however, there were no viable high-output, compact engines to use for a new fighter. In addition, Jiro Horikoshi's team was preoccupied with addressing early production issues with the A6M2b as well as starting development on the A6M3 and the 14-Shi interceptor (which would later become the Mitsubishi J2M Raiden, a land-based interceptor built to counter high-altitude bombers). As a result, work on the Zero successor was halted in January 1941."

J2M Raiden first flight March 1942.


Unfortunately the phrase "evolutionarily unlikely or technologically infeasible" opens up a bunch of possibilities that tend to fall apart upon closer examination.

Take the Bristol Hercules engine: problems with sleeve valves aside, it went from a 1300-1400hp engine in the late 30s (in development and testing) to a 2100hp engine at the end of the war or post-war. It was NOT done by pouring higher-PN fuel into the tanks and screwing the boost control adjusting screw down.
Everything that was done could be considered evolutionary and technically feasible (although the last is questionable) in 1940-41. However, along the way the Hercules went through 7 different cylinder head designs with ever-increasing finning for better cooling; early heads were cast and machined, later heads were forged and machined. Yes, forging was known, but how much forging capacity did they have in the early years? I believe (open to correction) that the last engines (200, 600 and 700 series) got alloy heads with a very high percentage of copper (if not mostly copper) for better heat transfer. While perhaps not technologically infeasible in 1930-41 (depending on the state of the metallurgy), it is a major problem in wartime to be using pounds of copper per engine for better cooling fins. The later engines also got a new crankcase and crankshaft with larger bearings.

The Wright R-1820, which has been mentioned already, is a classic case of "improvements": it could be considered a series of at least 5 engines that share a common bore and stroke (and sometimes mounting points and propshaft spline) as it went from 600 to 1525hp.

The arguments and improvements start to get circular, in that there was no reason to design and try to build the later "style" of engine early, when the fuel was only 80 octane and the engine was limited to 1900-1950rpm. As better fuel became available, the compression or boost could be raised, and as better ways of manufacturing cylinders with more fin area were developed (an often overlooked aspect of air-cooled engine design) the rpm could be increased (along with better bearings and vibration dampers) as the heat load increased.
The early-1930s R-1820E weighed 850lbs without reduction gear. The 1950s 1525hp R-1820 for use in helicopters went over 1400lbs without reduction gear. There is no sense building a 1200-1300lb 30-liter engine if the available fuel will only support 600-700hp.
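A minimal sketch of the relationship being described, using the standard four-stroke relation P = BMEP × V × n / 2; the two BMEP/rpm pairs are illustrative guesses for an 80-octane-era engine and a later one, not data for any specific type:

```python
HP_PER_WATT = 1 / 745.7

def brake_power_hp(bmep_bar: float, displacement_l: float, rpm: float) -> float:
    """Four-stroke brake power: P = BMEP * V * n / 2 (one firing per two revs)."""
    return (bmep_bar * 1e5) * (displacement_l / 1000) * (rpm / 60) / 2 * HP_PER_WATT

# Hypothetical 30-litre radial:
print(f"{brake_power_hp(10, 30, 2000):.0f} hp")  # ~670 hp: 80-octane fuel, ~2000 rpm
print(f"{brake_power_hp(14, 30, 2500):.0f} hp")  # ~1170 hp: better fuel AND better cooling
```

The two levers (BMEP and rpm) multiply, which is why fuel quality and fin area had to advance together.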
One book notes the changes between the 1100hp R-1820G100 engines and the 1200hp R-1820G200 engines as (but not limited to):
- a new crankcase (both are steel but not interchangeable),
- an increase from 2800 sq in of finning on the cylinder barrel and head to 3510 sq in (per cylinder),
- an exhaust valve changed in shape to give more volume for the sodium filling,
- more streamlined exhaust valve guides and a 20% increase in the area of the exhaust port,
- larger intake pipes from the supercharger diffuser to the inlet ports of the heads,
- a new oil scavenging system,
- a double vibration damper on the crankshaft instead of a single one,
- a supercharger diffuser with 12 blades instead of 9, although the impeller was the same size.

This was to go from 2350rpm and 1100hp max to 2500rpm and 1200hp max.
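Running those two ratings through the same four-stroke relation (a sketch; the 29.9 L displacement for the R-1820 is my figure) suggests BMEP barely moved and almost all of the gain was rpm, which fits the list above: the changes are about shedding heat at higher rpm, not about more boost:

```python
def bmep_bar(power_hp: float, displacement_l: float, rpm: float) -> float:
    """Invert P = BMEP * V * n / 2 to get BMEP in bar."""
    return 2 * (power_hp * 745.7) / ((displacement_l / 1000) * (rpm / 60)) / 1e5

print(f"G100: {bmep_bar(1100, 29.9, 2350):.1f} bar")  # ~14.0 bar
print(f"G200: {bmep_bar(1200, 29.9, 2500):.1f} bar")  # ~14.4 bar
```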

I have no idea what the Japanese did to go from about 70hp per cylinder on the Sakae 11 to 111hp per cylinder on the Homare but there was obviously a LOT of development.
 
...
I have no idea what the Japanese did to go from about 70hp per cylinder on the Sakae 11 to 111hp per cylinder on the Homare but there was obviously a LOT of development.

Partial answers:
- Steel crankcase + counterweights on the crankshaft = higher rpm (2900-3000 vs. 2700-2800).
- Increase of manifold pressure to +500 mm Hg via ADI, vs. +300 for the latest Sakae without ADI (due also to the better crankcase + crankshaft?).
- Size/material/type of valves?
- Better pistons, crankpins?
- Per TAIC, the Homare 20 series had an increased compression ratio - from 7:1 to 8:1 (my comment - if true, it was a nice way to shoot oneself in the foot; see the sketch below).
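On that compression-ratio point, the ideal Otto-cycle numbers show how little was gained for the detonation risk taken. A sketch; γ ≈ 1.3 for a rich aviation mixture is my assumption:

```python
def otto_efficiency(cr: float, gamma: float = 1.3) -> float:
    """Ideal air-standard Otto-cycle thermal efficiency."""
    return 1 - cr ** (1 - gamma)

e7, e8 = otto_efficiency(7), otto_efficiency(8)
print(f"7:1 -> {e7:.1%}, 8:1 -> {e8:.1%}, relative gain {e8 / e7 - 1:.1%}")
# 7:1 -> 44.2%, 8:1 -> 46.4%, relative gain 5.0%
# A ~5% ideal fuel-economy gain, paid for in detonation margin on
# late-war Japanese fuel, hence the foot-shooting remark.
```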

Unfortunately, we don't seem to have good data on the Army's Sakae (Ha-115) with ADI.
 
I see your point. Let's avoid fantasy. My thinking was to give the Japanese hp levels that others were getting from radials at the time.

Japanese radials were either equal to or slightly better than Soviet, German or British radials, and far better than Italian radials.

Can we get the Japanese some influence from the Fw 190's BMW 801 for instance?

Single-lever operation (Kommandogeraet), direct injection?
The Japanese improved the layout of their exhaust stacks to the BMW 801 level by some time in 1943-44.
 
You have two or three different problems making high powered aircraft engines for service use.

1. Just getting the engine to make high power for any period of time. This requires either high BMEP (usually high boost) or high rpm, or both.
2. Getting the engine to survive for even a few minutes; too high a BMEP (too much boost/compression) for a given type of fuel leads to detonation and wrecked engines pretty quickly.
3. Getting the engine to survive at the desired power level for a worthwhile amount of time in squadron service. The Russians would tolerate an engine life of 50 to 100 hours while the US and Britain would not. If you are going to send planes halfway round the world you want to send about 20-25% spare engines, not 2-3 engines per airplane (rough arithmetic in the sketch after this list).
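Rough spares arithmetic behind item 3; every number here is hypothetical, just to show the scale:

```python
# A 50-aircraft (single-engined) deployment flying 30 h/month per airframe.
aircraft, hours_per_month = 50, 30
fleet_hours = aircraft * hours_per_month            # 1500 engine-hours/month
for tbo in (50, 100, 300):
    print(f"TBO {tbo:>3} h: ~{fleet_hours / tbo:.0f} engine changes per month")
# TBO  50 h: ~30 changes/month; each airframe eats an engine every ~7 weeks,
# so a few months away from home means shipping 2-3 engines per airplane.
# TBO 300 h: ~5 changes/month; 20-25% spares covers the overhaul pipeline.
```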

When things get desperate you may accept lower service life, but there was a continual battle between more power and long engine life.
The Allison went through 4 different crankshafts. The first 3 look identical: plain alloy steel, followed by shot peening, followed by shot peening and nitriding, and finally a crankshaft with shot peening, nitriding and 27lbs worth of extra counterweights. Please note the last crankshaft could be put in an early engine, and there was no change in the size/dimensions of either the main bearings or the rod bearings, yet fatigue life went up by an order of magnitude (later engines may have gotten better bearings?).

The Japanese actually weren't doing too badly.
Problem is the BMW 801 was a 41.8 liter engine.
The Kasei engine was 42 liters and the Kinsei engine was only 32.3 liters.
The American R-2600 was 42.6 liters.
The Hercules was 38.7 liters (as were the Gnome-Rhone 14K, 14N and 14R and their cousins).

There was no magic technology that was going to overcome the displacement gap, especially considering the Japanese fuel situation.
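The displacement point in numbers: at equal specific output the smaller engine simply tops out lower. A sketch using the displacements above and an assumed ~40 hp/L, roughly late-war BMW 801 territory:

```python
hp_per_litre = 40  # assumed specific output, roughly late-war BMW 801 level
for name, litres in (("BMW 801", 41.8), ("Kasei", 42.0), ("Kinsei", 32.3)):
    print(f"{name}: ~{hp_per_litre * litres:.0f} hp")
# BMW 801: ~1672 hp, Kasei: ~1680 hp, Kinsei: ~1292 hp. The Kinsei gives
# away ~380 hp before fuel quality even enters the picture.
```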
 
Like it or not, fuel (octane) is the magic elixir that will unleash the most horsepower, all other things being equal. After you hit the horsepower target you start doing the engineering clean-up to fix the weak links in the design chain.
After that? You'd better have a Stanley Hooker on your staff who can find the hidden horsepower, and the engineering staff to start fixing the new weak links caused by the increase in horsepower.
 
Until the La-5FN arrived in 1943 and the Yak-3 arrived in 1944, everything the Russians had was outclassed from 1941, so they had no choice: it was either the engine or the pilot. At least we Brits had the Spitfire IX and Typhoon from 1942, and the Americans the P-38J/L, P-47D and P-51B/C/D from 1943/44.
 

In the 30's an efficient operator was probably getting about 6 hours' utilization per day, because very little night flying was done and heavy maintenance kept the aircraft (and engines) on the ground for a couple of weeks per year. For a twin like the DC-2 or Boeing 247 that means every aircraft flew over 4000 engine-hours per year.

Engine time may have been more like 4 to 5 hours per day, as engines had relatively low overhaul lives and overhauls took time. Still, that puts a civil engine at possibly operating somewhere around 1500 to 2000 hours per year, depending on how fast the overhaul was and how few spares the operator carried.
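Checking that arithmetic (assuming ~350 flying days a year once heavy maintenance is deducted, and a twin like the DC-2/247):

```python
aircraft_hours = 6 * 350             # ~2100 flight-hours/year per airframe
engine_hours = aircraft_hours * 2    # ~4200 engine-hours/year for a twin
per_engine = 4.5 * 350               # ~1575 h/year at 4-5 h/day per engine
print(aircraft_hours, engine_hours, per_engine)  # 2100 4200 1575.0
# Consistent with "over 4000 engine hours per year" per aircraft and
# "1500 to 2000 hours per year" for an individual engine.
```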

When I was last directly involved in jet maintenance in the '90s, the 767-300ERs in the hands of the most efficient long-haul operators were averaging over 14 hours' flying time a day, 365 days a year. One operator was pushing 16 hours per day on extra-long-haul ops. With long haul, when the aircraft is averaging 10-hour legs, the hours between overhaul are way beyond anything a piston engine could ever dream of and usually come down to the life limit of components.
 
2000 flight-hours/year is still several times the likely usage for most military aircraft, especially fighters. Patrol aircraft may be above that.
 
Give the Zero and the Ki-43 a sufficiently powerful engine to allow for the weight of hydraulic controls

Why?
Hydraulic controls are needed when you cannot get the aerodynamics right. The Sabre has boosted controls, and they killed a number of pilots, including one of Reno's premier racing-Merlin builders, when they locked up.
The MiG-15 only has a boosted aileron and flies quite nicely with the boost off until you reach the higher end of its speed range.
If you ever watched DC-9s and MD-80-series aircraft start and taxi, it was not unusual to see one elevator up and the other down. They used servo tabs instead of hydraulics.

Why add weight, complexity and more things to go wrong unless there is no other option?
 
I think the only WWII-era fighter with hydraulically boosted controls was the P-38. They weren't needed, as far as I know, on any of the single-engine piston fighters. A bit later, the B-47 used boosted controls but the B-52 (at least earlier models) and the 707 used servo tabs.
 
This is not quite right, at least in the US. Due to the distances in the US, overnight sleeper services started fairly early.

21 of these were built in 1933/34 with berths for 12 passengers.
The overnight sleeper market was the main idea behind the DC-3, originally called the DST (Douglas Sleeper Transport). They wanted a plane that would hold as many passengers in sleeper berths as the DC-2 would carry in seats, hence the fatter fuselage, which wound up seating 3 across normally and 4 across in crowded (for the 1930s) conditions.

Engine overhauls were rarely done in airline shops (although top-end overhauls/repairs might be); engines were swapped out when they needed to be overhauled. The planes had to be kept in the air to make money, and overhauling the engines took too long. Daily averages are hard to come by, but the DSTs were flying between Chicago and New York in June of 1936 and a 10-hour non-stop between Los Angeles and Chicago in July. Coast-to-coast (with stops) was started in Sept.
By 1937 DC-3s were flying in a number of interior layouts (14-seat club car or sky lounge, 21-seat standard, and up to 28 seats), and planes were being tasked with flying all the way from New York to San Francisco (with stop/s). The planes were refueled as SOP and not changed during the stop/s unless something needed seeing to.
By Nov of 1937 United Airlines had completed its 20,000th coast-to-coast trip (started before the DC-3), and other airlines on other routes certainly added substantially to that number.

Obviously US airlines placed a premium on reliability and durability.
 

Yes and no.

Yes, there were a number of overnight sleeper operations, and yes, the DC-3 started as the DST, but I do not think that these aircraft were that significant a percentage of total fleet hours. I could be wrong of course. I would also add that long before the sleeper services started, the majority of mail and some freight flew at night. Again, as a percentage of the whole, I believe that this was not that significant.

And yes, engines were swapped out rather than the aircraft sitting around idle while they were overhauled. That is why I said that aircraft were doing some 2000 hours per year and engines up to 1500. Depending on the engine, an operator would hold 30-100% spares for this reason (making my estimate of about 1500 per engine way too high). Many operators did their own engine and component overhauls because, depending on the manufacturer, sending engines out could result in them being held up in overhaul behind brand X's engines, so having your own shop guaranteed your engines got priority and were done the way you wanted.

Ten hours non-stop for a DC-3 sounds like they were pushing the limit somewhat, as my (very fallible) memory is that the DC-3 did not have that endurance. I will have to check.
EDIT - According to the USAAF C-47 flight manual, the C-47 had an endurance of 11.82 hours if you made no allowance for take-off and climb, so that does make a 10-hour flight possible, but they would definitely be running on the smell of an oily rag on landing and have nothing in reserve. I suspect the earlier engines were a little less fuel-efficient than the R-1830-92, which would have shortened the range a little as well. I cannot find my DC-2 flight manual.
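That endurance figure is just usable fuel over fuel flow. A sketch with back-derived numbers: ~804 US gal of standard C-47 tankage and ~68 gal/h total for both R-1830s at maximum-endurance power are my assumptions, chosen to reproduce the manual's figure:

```python
fuel_gal, flow_gph = 804, 68         # assumed tankage and total fuel flow
print(f"Still-air endurance: {fuel_gal / flow_gph:.2f} h")  # 11.82 h
# Take-off, climb and any reserve come straight out of that, which is why
# a 10-hour leg left nothing but the smell of an oily rag in the tanks.
```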
 

 

While the servo tab is a German invention (by Flettner) and was widely used in WW2 bombers and transports, its perfection as the geared spring tab came from NACA in the 1940s, which allowed the technology to be extended over a wider speed range than just bombers. The late-war Hellcat and Corsair received them, and it seems to have doubled high-speed roll rate, but at a noticeable expense in low-speed roll rate. The P-38 received hydraulically boosted ailerons and enjoyed a massive increase in roll rate without it affecting low-speed roll.

Traditional spring tabs tended to deflect too much at high speed, which would destroy the pilot's linear force feedback, and he could over-stress (destroy) the airframe at high speed. What I'm saying is that the geared spring tab was an American invention, apparently unknown to the Germans. The Arado 234 apparently had them, so the Germans may have figured them out, but they used hydraulics on the Do 335.

The problem with spring tabs of any kind is that they are finicky to design and can be prone to flutter. Once you are in the supersonic speed range they are completely inadequate and can even reverse. American Eagle 4184 (an ATR 72) crashed because icing on the NACA 5-digit airfoils led to the ailerons reversing, the aircraft inverting, and a total loss of the hull and all souls. Admittedly that was a case of the FAA trusting a badly done French (pre-EASA) certification re icing.

Nevertheless, geared spring tabs allowed the B-36 to be flown with no trouble without boosted controls. If you go to the B737 site there are some interesting photos of the elevator spring tabs, alongside which are anti-balance tabs to keep the force feedback linear.

Supersonic aircraft have fully hydraulic controls, not boosted ones, because they cannot tolerate any forces getting back to the pilot, which may cause PCO (pilot-coupled oscillations). Artificial feel is applied instead, driven from a pitot-static source.
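The underlying reason all of these devices exist is that hinge moment scales with dynamic pressure, i.e. with the square of airspeed. A sketch with illustrative numbers; the coefficient and surface dimensions are assumptions, not any particular aircraft:

```python
RHO = 1.225            # kg/m^3, sea-level air density
C_H = 0.01             # hinge-moment coefficient (assumed)
S_C, CHORD = 1.0, 0.3  # control-surface area (m^2) and mean chord (m), assumed

def hinge_moment_nm(v_ms: float) -> float:
    """H = C_h * q * S * c, with q = 0.5 * rho * V^2."""
    return C_H * (0.5 * RHO * v_ms ** 2) * S_C * CHORD

for kt in (150, 300, 450):
    print(f"{kt} kt: {hinge_moment_nm(kt * 0.5144):.0f} N*m")
# 150 kt: ~11, 300 kt: ~44, 450 kt: ~98 N*m. Tripling speed multiplies the
# pilot's load nine-fold; tabs cancel part of it aerodynamically, and
# hydraulic boost or full power controls carry the rest.
```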

Because of the spacious trailing edge of the P-51's laminar-profile wing, it used internal pressure balancing where, for instance, if an aileron was deflected up, the high-pressure air was channeled to a bellows to reduce aileron force.

Of course you're trying to stop the pilot over-stressing the airframe. If you have fly-by-wire, the practice is to simply reduce deflection as speed goes up.
 
