could the Allison engine have done what the Rolls Royce Merlin did?



Good question.
Specifically, did the intrinsic design of the V-1710 preclude development of its supercharger in the ways that the Merlin supercharger developed?
Multi-speed, multi-stage, water cooling, etc.
 
Conspiracy theories aside, there were some practical reasons that the War Department may not have pushed for two-stage superchargers.

One was that GE, maker of turbochargers, was also the design headquarters for mechanical superchargers for P & W and Wright (and anybody else) in the US during the 1920s, and most engines in the '20s didn't use superchargers, or at least not ones with very high boost. What would evolve into superchargers on radial engines were sometimes referred to as "mixing fans": they turned at low rpm (sometimes crankshaft speed?) and helped mix the fuel/air after the carb, helping ensure that the top cylinders got the same mixture as the bottom cylinders. Sometimes they used enough fan rpm to recover the pressure loss through the carb and plumbing; that is, they actually had 29.95 inches of pressure at the intake valve instead of 27 in or so.
With fuel of 70 octane or less, high boost is impossible without going into detonation, so a single-stage supercharger could provide all the altitude performance the engine could stand.
The War Department didn't become interested in high-altitude flight for some time, at least not to the extent of funding more than a few experiments. During the '20s they were still trying to figure out oxygen systems and rudimentary cockpit heating (or at least well-insulated flying suits).


Please remember that it wasn't just the absolute pressure in the manifold that led to detonation; it was also the temperature of the mixture. Higher-octane fuel withstands higher temperatures before self-igniting. So at high altitude with poor fuel it didn't matter if your manifold pressure was low: if the mixture temperature was high, you still got detonation.
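To put rough numbers on that, here is a minimal sketch in Python; the ambient conditions, manifold pressure and blower efficiency are all assumed, not figures for any particular engine. Holding the same manifold pressure at altitude means a bigger pressure ratio from thinner air, so the blower's heat rise, and with it the mixture temperature, keeps climbing even though the boost gauge reads the same.

```python
# Minimal sketch (assumed values): charge temperature at a fixed manifold
# pressure as ambient pressure falls with altitude.
GAMMA = 1.4  # ratio of specific heats for air

def charge_temp_c(ambient_p_inhg, ambient_t_k, manifold_p_inhg, adiabatic_eff=0.65):
    """Single-stage compressor outlet temperature, ideal-gas model."""
    pr = manifold_p_inhg / ambient_p_inhg
    ideal_rise = ambient_t_k * (pr ** ((GAMMA - 1) / GAMMA) - 1)
    return ambient_t_k + ideal_rise / adiabatic_eff - 273.15

# Roughly standard-atmosphere ambients (assumed), manifold held at 38 inHg.
for alt_ft, p_amb, t_amb in [(0, 29.92, 288.0), (15000, 16.9, 258.0), (25000, 11.1, 238.0)]:
    print(f"{alt_ft:>6} ft: charge about {charge_temp_c(p_amb, t_amb, 38.0):.0f} C")
```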

This is all background. Once you get to the '30s things do start to change, but now you are starting to run out of time. GE superchargers weren't really very good: since there was more than enough power to drive them when using a turbo, they never really looked at the efficiency of the compressor. And when engines were running on 70, 77 and 80 octane fuels they could not use much boost anyway, so the inefficiency and heat rise of the poor compressors were masked. It wasn't until 1937-38 that P & W and Wright finally figured out that the GE superchargers they were buying weren't that good. Now they had 2-4 years not only to design their own single-stage superchargers but to try to come up with a two-stage mechanical design, which P & W did.
The whole turbo idea was sold as allowing the engine to develop sea-level power at 15,000-25,000 ft (depending on the year) by having the turbo supply sea-level air pressure to the carb and keep the back pressure on the exhaust valves at near sea-level values (they never really managed this). This also meant the turbo engine didn't have to be built any stronger than the normal engine/s (same internal pressures) or need any additional cooling (except for the need to cool adequately at the higher altitudes).
The mechanical two-stage supercharger doesn't have those selling points. The power to drive the second stage comes from the engine itself, so you either get less power to the prop than the turbo-equipped engine, or you need an engine that is making more power in the cylinders and needs heavier construction and better cooling (air or liquid flow) around the cylinders.
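As a rough feel for that power budget (a sketch with assumed numbers, not data for any particular engine): the shaft power an auxiliary mechanical stage absorbs is roughly mass flow times specific heat times its actual temperature rise, and it comes straight off the crankshaft, whereas a turbo recovers the equivalent from exhaust energy.

```python
# Rough power-budget sketch; mass flow and temperature rise are assumed.
CP_AIR = 1005.0  # specific heat of air, J/(kg K)

def stage_shaft_power_kw(mass_flow_kg_s, delta_t_k, mech_eff=0.95):
    """Shaft power needed to drive one compressor stage."""
    return mass_flow_kg_s * CP_AIR * delta_t_k / mech_eff / 1000.0

m_dot = 1.3    # kg/s of charge air at high power (assumed)
dt_aux = 70.0  # K actual temperature rise across the auxiliary stage (assumed)

p_kw = stage_shaft_power_kw(m_dot, dt_aux)
print(f"auxiliary stage absorbs roughly {p_kw:.0f} kW ({p_kw * 1.341:.0f} hp)")
# That power is subtracted from what reaches the propeller, or the engine has
# to make correspondingly more power in the cylinders.
```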

Somebody once said that with two-stage superchargers the 2nd stage multiplies any mistakes in the first-stage design. An inefficient 1st stage with a large heat rise feeding a 2nd stage means some really high intake mixture temperatures (one reason just about all US turbo-equipped planes used intercoolers).
 

This is correct. The efficiency of the 1st stage is the most critical, and poor 1st-stage performance is multiplied by each subsequent stage. They do not add, they multiply; it's just that the efficiencies are often so low that, from inspecting the total pressure ratio of a 2-stage compressor, it's easy to think they've been added (e.g. the Merlin 61's total PR is about 6:1, but from the individual stages you might expect nearer 8 or 9 if multiplied - hence an easy mistake to make).
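To see the compounding in numbers, here is a minimal sketch; the pressure ratios and efficiencies are assumed, illustrative values rather than Merlin or Allison figures. The 2nd stage works on whatever absolute temperature the 1st stage hands it, so extra heat rise up front gets scaled up again downstream - one more reason the turbo installations carried intercoolers.

```python
# Minimal sketch: how a poor 1st stage compounds through the 2nd stage.
# Pressure ratios and efficiencies below are assumed, illustrative numbers.
GAMMA = 1.4  # ratio of specific heats for air

def stage_outlet_temp_k(t_in_k, pressure_ratio, adiabatic_eff):
    """Outlet temperature of one compressor stage (ideal-gas model)."""
    ideal_rise = t_in_k * (pressure_ratio ** ((GAMMA - 1) / GAMMA) - 1)
    return t_in_k + ideal_rise / adiabatic_eff

T0 = 288.0  # intake temperature, K (assumed)

# Decent 1st stage feeding the 2nd (overall PR = 3.0 * 2.0 = 6.0)
t_good = stage_outlet_temp_k(stage_outlet_temp_k(T0, 3.0, 0.75), 2.0, 0.70)
# Poor 1st stage, same pressure ratios, lower efficiency up front
t_poor = stage_outlet_temp_k(stage_outlet_temp_k(T0, 3.0, 0.55), 2.0, 0.70)

print(f"decent 1st stage: mixture at {t_good - 273.15:.0f} C before intercooling")
print(f"poor 1st stage:   mixture at {t_poor - 273.15:.0f} C before intercooling")
```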
 
Good question.
Specifically, did the intrinsic design of the V-1710 preclude development of its supercharger in the ways that the Merlin supercharger developed?
Multi-speed, multi-stage, water cooling, etc.
The basic V-1710 mechanical supercharger seems to have been constructed so that the crankcase made up part of the supercharger housing. As far as I can tell, a major redesign would have been needed to install a two-speed drive.
 

The Merlin's issues with plug fouling at cruise were in no way comparable to the severity of the problems the Allison had. In any event the Merlin's charge heating system actually turned out to be an unnecessary complication.
Alec Harvey-Bailey describes the course of events in his book "The Sons of Martha".
"BOAC wanted an aircraft of greater all-up weight, entailing more take-off power. On 100 octane fuel this presented a difficulty, whereas two-stage military engines had full depth intercooling, the civil versions had half the intercooler casing given over to afterheating, to raise the charge temperature at cruise and avoid plug leading. This had been a TCA requirement following their experience with single-stage engines on the Atlantic, which had shown the need for charge heating. With half depth intercooling the charge temperature would be higher at take off with some loss of power and at 20 lb boost would be close to detonation. Design came up with the mixing scheme, which allowed intercooling at take-off and charge heating at cruise by bleeding engine coolant into the intercooler system. The reduced charge temperature thus achieved at take-off, plus a shade more boost, allowed the power to be increased from 1720hp to 1765hp, enabling the desired increase in weight."
A spate of intercooler pump failures ensued. The pumps, as it turned out, had always leaked to some extent.
"Forced to reappraise the mixing scheme, which had contributed to my problem, it was shown that at cruise powers the charge temperature was sufficient to avoid plug leading without charge heating. By deleting the mixing scheme and using full intercooling for take-off and zero intercooling at cruise by putting a stop valve in the circuit, the requirements of the operation were met."
The Merlin 724 was built that way from the start.
 
I realize I harp on the Allison's poor intake design, but the truth is that Allison should have known better.
Prior to computers taking over the task, engineers in the USA and Canada used Crane Technical Paper No. 410, "Flow of Fluids Through Valves, Fittings and Pipe", to calculate pressure drops in piping systems. I have the May 1942 version of this paper (the first edition was 1935), which gives us an idea of the state of the art at that time.




It is easy to calculate pressure drops in a straight run of pipe; the tricky bit is determining the effect of splits in flow, changes in direction and discontinuities such as valves. Crane came up with the brilliant concept of equivalent length, where you convert the effect of a fitting into a length of straight pipe. Figure 17 is the nomograph used to determine the equivalent length of various fittings.
I don't have the dimensions for the Allison intake, but let's assume it's a 3" dia pipe with a length of 5 feet. Each short-radius elbow (and the Allison elbows are very short) is the equivalent of 8 feet of pipe. To get to the mini plenums at each group of 3 cylinders there are 3 very abrupt elbows. Actually, it is even worse than that: one of the big no-nos in piping design is back-to-back elbows. The delta P of back-to-back elbows is difficult to calculate, but it can be safely said that it is worse than the simple sum of the two. The two elbows that direct the air towards the rear of the engine could be modelled as a close return bend with an equivalent length of 20 feet, but the two elbows in Z formation would be worse. Throw in 2 splits in flow and you can see how much of the work done by the compressor is wasted.
In contrast, the Merlin (single stage) seems to have a bigger pipe to begin with, which obviously reduces losses, but also notice how smoothly the discharge of the compressor transitions to the inlet of the plenum: a graceful sweeping curve so gradual that its pressure drop will be basically equivalent to its length. Note that the discharge of the compressor is tangential to the rotor, preserving the direction of flow. From the plenum there are branches leading to each mini plenum at the cylinders. I would model these as an ordinary entrance loss, which for a 3" pipe is equivalent to about 5 feet of pipe.
Overall the Allison intake looks to have quadruple the pressure loss of the Merlin's.
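For anyone who wants to play with the arithmetic, here is a rough sketch of the equivalent-length method; the duct size, velocity, friction factor and equivalent lengths are assumptions taken from the discussion above, not measured Allison or Merlin geometry, so only the ratio between the two cases means anything.

```python
# Rough sketch of the Crane TP 410 equivalent-length method.
# Duct size, velocity, friction factor and equivalent lengths are assumed.
RHO = 1.2           # charge density, kg/m^3 (assumed)
VEL = 60.0          # mean duct velocity, m/s (assumed)
F_DARCY = 0.02      # Darcy friction factor (assumed)
DIA_M = 3 * 0.0254  # 3 in duct diameter in metres
FT_TO_M = 0.3048

def delta_p_pa(equiv_length_ft):
    """Darcy pressure drop for a run expressed as equivalent straight pipe."""
    length_m = equiv_length_ft * FT_TO_M
    return F_DARCY * (length_m / DIA_M) * RHO * VEL ** 2 / 2

# Allison-like path: 5 ft of pipe, three short-radius elbows (~8 ft each),
# plus ~20 ft for the back-to-back elbows treated as a close return bend.
allison = delta_p_pa(5 + 3 * 8 + 20)
# Merlin-like path: the sweeping duct itself (~5 ft) plus an ordinary
# entrance loss into the branches (~5 ft).
merlin = delta_p_pa(5 + 5)

print(f"Allison-like drop: {allison:.0f} Pa, Merlin-like drop: {merlin:.0f} Pa")
print(f"ratio: {allison / merlin:.1f}x")
```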
 


This source suggests that the failure to develop a better turbo-supercharger was based on the prioritization of tungsten for other purposes. This is in keeping with other things I have read about very high priority being given to the use of tungsten in machine tools in order to maintain high industrial production.

Allison V-1710 Engine
 
American tanks were rationed in their use of HVAP ammunition (tungsten cored) for the same reason.
 

Unfortunately, some of the claims require a truckload of salt so we can digest them. Like:
The few turbo-supercharged Allisons that were made, were allocated to P-38s, making the high-altitude performance of that plane its best feature.

There were not 'a few turbo-supercharged Allisons' made, but around 20,000 (more than 10,000 P-38s were produced). To put the number of 20,000 in perspective, it is about as many as all the Spitfires produced, or all the Fw 190s. Or almost 50% more than the whole P-40 production needed when it comes to engines. The trick might be that the turbo installation on the P-38 actually worked, while those on the Curtiss and Bell prototypes did not?

Then:
Donaldson R. Berlin, the P-40's designer, has said that P-40s experimentally equipped with turbo-superchargers outperformed Spitfires and Messerschmitts and that if it had been given the engine it was designed for, the P-40 would have been the greatest fighter of its era.

There is no picture, diagram or flight test of a P-40 experimentally equipped with a turbo-supercharger, 75-80 years after that supposed aircraft supposedly outperformed those two fighters (of which versions, we also don't know). Perhaps for the simplest reason: there were no such P-40s?

Further:

It wasn't until the XP-40Q was modified with a "bubble" canopy, cut-down rear fuselage, wing radiators, clipped wing tips, a four-blade propeller, water injection and weight reduced to 9,000 lb that the XP-40Q attained a maximum speed of 422 mph.

Nowhere in the article is it mentioned that the XP-40Q that did 422 mph was outfitted with a 2-stage supercharger. A bubble canopy and a better prop can't earn 30-40 mph. A 9,000 lb P-40 is not a sign of weight reduction, but of weight gain.

Nowhere in the article is it said that there was a 'failure to develop' better turbos due to the lack of tungsten.

American tanks were rationed in their use of HVAP ammunition (tungsten cored) for the same reason.

Do we have a good source for that?
 
Nowhere in the article is it said that there was a 'failure to develop' better turbos due to the lack of tungsten.

One might also note the tens of thousands (if not hundreds of thousands) of turbochargers used on bombers, like eight on every B-29. We are also confusing quality (better) with quantity: American turbos did get better as the war went on, able to run several thousand rpm faster than the early ones.
See: https://www.enginehistory.org/Turbochargers/GETurbochargerData.jpg

Do we have a good source for that?
It is pretty well known; most US tanks with 76mm guns had 2-5 rounds of HVAP as the official load, though ammunition supply sometimes did not allow even this. The Shermans with 75mm guns didn't get any HVAP: at the velocities the 75mm gun operated at, they thought the increase in penetration would be marginal and it still wouldn't get through the front of a Panther or Tiger.
 
Resp:
The USN used an inline aircraft engine in their PT Boats.
 
I don't have a problem with you.

They may be good engines, but in WW2 they were largely an also-ran, relegated to aircraft that would, for the most part, be considered second tier.
Resp:
The post WWII production F-82 used the Allison engine, as the Packard Merlin production license was about to expire. The F-82 had outstanding performance.
 
I was wrong about the ammunition. It seems that by mid-1943 there were ample supplies of tungsten.
According to "Study of Experience in Industrial Mobilization in World War II":
"From a critical position in 1941 tungsten moved to a position of adequate supply in 1944. But tungsten was under strict priority and allocation control from March 1941 to July 1943. The new supply so increased that after August 1943 most dealers could purchase without priorities."
I do know that HVAP was in short supply. Major Paul Bane: "However, at no time have we [been] able to secure more than five rounds per tank and in recent actions this has been reduced to a maximum of two rounds, and in many tanks all this type has been expended without being replaced," as quoted in "M4 Sherman at War".
 
Resp:
The post WWII production F-82 used the Allison engine, as the Packard Merlin production license was about to expire. The F-82 had outstanding performance.
The outstanding performance was by the Packard-engined version. The famous long-distance flight was Merlin-powered. The Allison-powered version was slower and had a lower rate of climb, when it was actually working.
The following quotes are from "Mustang Designer":
'As the Air Force Historical Office case history explained in great detail, "The major problem in the F-82 program was the failure of the government furnished engine."'
'33 hours of maintenance were required for each hour flown.'
'Persistent engine problems kept over 50 percent of these planes out of commission during most of 1949.'
'When working right, which was seldom...'
 
Hey guys,

On the subject of the use of tungsten in the WWII turbos: do we know if there was any tungsten actually used in the US turbos? I ask because, as far as I can see, there would be no need for it, and there were no applicable(?) alloys (as far as I know) in existence at the time.

The reason that I say this is that an alloy such as Stellite would be far easier to use in the high-temperature environment of the turbo. At the time (the late 1930s) it could be cast into complex shapes, hot drawn and forged, and electroplated in relatively thick layers. Stellite is a cobalt-based alloy - there is no tungsten used.

Otherwise, for tungsten, the only useful alloys (I think) available at the time would be the tool steels in the T series (T2 and T4 for example), but fabricating the shapes needed for the turbos would be very difficult, probably not really feasible at the time, and if you could make them they would be enormously expensive. As far as I can imagine, the only possible practical use in a turbo might be for bushings or bearings. But again, there are other materials that would be more usable (e.g. Stellite and M-series tool steels). Also, if tungsten-based tool steel was used, the amount of tungsten in the alloys would not have been enough to interfere with the production of HVAP. The amount of tungsten used in tungsten-carbide cutting tools, however, might have been enough to do so.

I believe that GE (and others?) concentrated on early titanium alloys; the temperature resistance (not as high as tungsten alloys or Stellite) was high enough, at least in theory. Titanium-bearing ore was difficult to process at the time, and manufacturing it was also difficult, plus extreme temperature cycles (such as in turbos) would have caused cracking in most of the alloys available at the time.

Having said all the above, I thought that the main problem was simply inexperience with the complex problem of very high temperatures combined with wear of the bushing/bearing surfaces at extremely high rpm.

I admit that I am not intimately familiar with the materials used for the various parts of the WWII GE turbos. Does anyone have info on what parts used tungsten alloys? I would appreciate the info.
 
All in all, it was more convenient to blame it on other people (like blaming the British for their fuel - their fuel???) or on the circumstances (a supposed lack of tungsten), instead of admitting that neither Curtiss nor Bell designed an honest turboed aircraft until too late (if ever), or that the intake manifold on the V-1710 was not well designed, or that there was a fault in training when the switch was made from P-38s with a low-capacity intercooler to P-38s with a decent intercooler.
 
I admit that I am not intimately familiar with the materials used for the various parts of the WWII GE turbos. Does anyone have info on what parts used tungsten alloys? I would appreciate the info.

The GE "B2" turbo gas turbine impeller blade composition was:

19.7% Nickel
12.7% Chromium
2.57% Tungsten
0.77% Silicon
0.72% Molybdenum
0.59% Manganese
0.5% Carbon

bal: Fe

Most of the pressings, etc., are Inconel, so basically nickel-chrome and a little iron (no tungsten).
 
Regarding post #104, the supercharger was not a part of the crankcase. There was a center power section that had the crankcase, the main bearings, and the pistons & rods. There was a nose section: either an F/G-type nose with an SAE 50 spline prop shaft or an E-type remote drive, as used by the P-39 and PT boats, etc. The third section was the supercharger section, commonly called the auxiliary section. It mated to the crankcase, to be sure, but the design of the supercharger could easily have been changed and still have mated to the power section. The carburetor mounts to the top of the impeller and the fuel nozzle ends in a small cone with a flat top. Fuel hits the round flat top and atomizes in a conic cloud that goes directly into the impeller center. The charge is compressed centrifugally and then enters the intake manifold tube, where it splits in the middle and then turns upward, where it splits again into four 3-cylinder manifolds.

The supercharger could have been changed at any time, and eventually was. They went from a 9.5" impeller to a 12.25" or a 12.18" impeller late in the series. To be specific, the first Allison (numerically by dash number) that had the larger impeller was the V-1710-97, which was the first of the G-series engines. The engine had a 9.60 : 1 gear ratio, the auxiliary section had a 7.485 : 1 gear ratio, and the propeller was geared down 2.36 : 1.

The 12.18" impellers were in the -125 and -127 engines.
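As a quick arithmetic aside, here is how gear ratios of that kind translate into shaft speeds and impeller tip speed. The 3000 rpm crank speed and the pairing of the 9.5 in impeller with the 9.60 : 1 drive are assumptions for illustration only; which impeller ran on which ratio varied by dash number.

```python
# Minimal sketch: gear ratios -> shaft speeds and impeller tip speed.
# Crank rpm and the impeller/ratio pairing are assumed for illustration.
import math

CRANK_RPM = 3000.0  # assumed take-off crank speed

def tip_speed_m_s(diameter_in, shaft_rpm):
    """Impeller tip speed from diameter (inches) and shaft rpm."""
    return math.pi * (diameter_in * 0.0254) * shaft_rpm / 60.0

blower_rpm = CRANK_RPM * 9.60  # 9.60:1 ratio quoted above, read as the blower drive
aux_rpm = CRANK_RPM * 7.485    # 7.485:1 auxiliary-section ratio quoted above
prop_rpm = CRANK_RPM / 2.36    # 2.36:1 propeller reduction quoted above

print(f"blower shaft: {blower_rpm:,.0f} rpm, aux shaft: {aux_rpm:,.0f} rpm, "
      f"prop: {prop_rpm:,.0f} rpm")
# Tip speed if the 9.5 in impeller ran on the 9.60:1 drive (assumed pairing):
print(f"9.5 in impeller tip speed: {tip_speed_m_s(9.5, blower_rpm):.0f} m/s")
```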
 
