Fuel consumption in factory tests of serial airplanes and engines


bf109xxl

I would appreciate any references to data on fuel consumption during factory tests of series-production aircraft and engines (in gallons or liters per plane/engine), primarily in the US, UK/Commonwealth, and Germany. For example, were there any fuel consumption limits? Were any measures taken to reduce fuel consumption? Is there any indirect information that might help estimate values?
I was just shocked by some of the numbers for Soviet factories before the war and would like to compare.
 
When I was stationed at Tinker AFB in the mid-70s, I met some people who had been there when fuel-flow accuracy was checked on the specially modified B-29s used by the 509th Bomb Group. Normal procedure was to remove the fuel flow meters, check them for accuracy, and recalibrate as required. But the USAAF insisted that those B-29s have their fuel-flow accuracy checked and calibrated with the meters still installed in the airplanes, which gave better accuracy for the installed system. The Midwest Air Depot workers did not know why at the time, but it was obvious that for some reason there was special concern that those particular airplanes not run out of fuel.

Of course, as it transpired, Bock's Car did run low on fuel after the Nagasaki bombing and had to make an emergency landing on Okinawa.
 
During most of the war every engine manufactured was run through an acceptance test, and the test results were recorded. I know that Allison included these test sheets in the "Data Pack" that was shipped with each engine, and they include the fuel flow and power information you are asking about. I am attaching one such report, for a V-3420-23(B-10), which happened to be the engine installed in the Fisher XP-75. I have other such reports for the V-1710, but this one is an original and is easier to read. Enjoy!
DDW
 

Attachments

  • V-3420-B10 A-59782, for XP-75.jpg
Thanks!
I guess I phrased it wrong. I am interested in the amount of fuel spent per engine or airplane test - more precisely, per entire unit test run in mass production.
 
You are asking a question that may be impossible to answer accurately.
T.O. 02-1-4A, "Handbook of Specific Installation and Test Instructions with Run-In Schedules for Aircraft Engines," lists every engine the AAF used and details the run-in schedule and the duration at each power level for all production engines. Typically, they were required to run about 2.5 hours before delivery, both new and overhauled. If any issues developed, the engine had to be repaired and then pass a satisfactory "penalty" run. The details of each engine's testing were recorded, and as shown earlier, the test sheet accompanied the engine into the field. Individual manufacturers' testing specifications could be a lot more specific and demanding.
In Packard specification A.C. 1076, for the V-1650-7 Merlin, the statement is made that "Any engine which requires more than 16 hours of running at 90 percent or more of normal rated manifold pressure ...prior to the final run, shall be rejected." Thus a rough estimate can be made of the total amount of fuel required for an "average" engine to be accepted by the AAC. When you consider the huge numbers of engines built, the amount of fuel used would have been staggering! For this reason, most of the test specifications call for 87- and 91-octane fuel for most of the run-in work; Grade 100/130 was reserved for high-power operation of supercharged engines.
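To put very rough numbers on that, here is a minimal back-of-envelope sketch in Python. The ~100 gal/hr average fuel flow (over the mixed power schedule of a V-1650-class engine) and the avgas density are my own assumptions for illustration, not figures from T.O. 02-1-4A or the Packard spec:

```python
# Back-of-envelope fuel burn for one production run-in.
# All figures are illustrative assumptions, not values from
# T.O. 02-1-4A or Packard spec A.C. 1076.

AVGAS_DENSITY_KG_PER_L = 0.72   # typical aviation gasoline density
GAL_TO_L = 3.785                # liters per US gallon

def run_in_fuel(hours: float, avg_flow_gal_hr: float):
    """Fuel for one test run at an assumed average fuel flow."""
    gallons = hours * avg_flow_gal_hr
    liters = gallons * GAL_TO_L
    tonnes = liters * AVGAS_DENSITY_KG_PER_L / 1000.0
    return gallons, liters, tonnes

# Normal 2.5 h run vs. the 16 h rejection threshold, at an assumed
# 100 gal/hr averaged over the power schedule (hypothetical figure).
for hours in (2.5, 16.0):
    g, l, t = run_in_fuel(hours, 100.0)
    print(f"{hours:4.1f} h -> {g:6.0f} gal = {l:7.0f} L = {t:4.2f} t")
```

On those assumptions a normal run burns on the order of 250 gal (roughly 0.7 t), and an engine pushed to the 16-hour limit burns several times that, so per-engine consumption measured in tons is entirely plausible.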
Good luck!
Dan
 
I would appreciate even rough values!
Thank you very much! Do I understand correctly that each engine ran from 2.5 to 16 hours when tested? That is, for rough estimates, can we treat the run-time distribution as highly asymmetric, with a peak around 3-4 hours and a rapid (e.g., exponential) decay toward larger values?
It turns out that during the war industry consumed enormous amounts of aviation gasoline, which should have been of the same order of magnitude as aviation fuel consumption at the front. Have you perhaps seen figures for total aviation gasoline consumption by the aviation industry?
In one Russian-language source I found a record that before the war about 9 tons (!!!) of aviation gasoline was spent testing a single series-production AM-35 engine! I was shocked by this figure, but now I see that in any case we are talking about tons of fuel per engine.
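For what it is worth, here is a small Monte Carlo sketch of the distribution described above: a fixed 2.5 h scheduled run plus an exponential tail for rework, capped at the 16 h rejection limit. The tail mean and the 100 gal/hr average fuel flow are guesses for illustration, not measured data:

```python
# Monte Carlo sketch of the asymmetric run-time distribution proposed
# above: 2.5 h minimum run plus an exponential tail for "penalty"
# running, capped at the 16 h rejection limit. Parameters are guesses.
import random

MIN_RUN_H = 2.5        # scheduled run-in per T.O. 02-1-4A (from the thread)
TAIL_MEAN_H = 1.0      # assumed mean extra running time (tunes the tail)
CAP_H = 16.0           # Packard rejection limit quoted above
FLOW_GAL_HR = 100.0    # assumed average fuel flow (hypothetical)

def run_time_h() -> float:
    """Minimum run plus exponential extra time, capped at rejection."""
    return min(MIN_RUN_H + random.expovariate(1.0 / TAIL_MEAN_H), CAP_H)

random.seed(1)
samples = [run_time_h() for _ in range(100_000)]
mean_h = sum(samples) / len(samples)
mean_gal = mean_h * FLOW_GAL_HR
mean_t = mean_gal * 3.785 * 0.72 / 1000.0   # gal -> L -> tonnes
print(f"mean run ~{mean_h:.1f} h, mean fuel ~{mean_gal:.0f} gal (~{mean_t:.2f} t)")
```

On those assumptions the mean comes out near 3.5 h and roughly 350 gal (about 0.95 t) per engine. The 9 t cited for the AM-35 corresponds to roughly 12,500 L, or about 3,300 US gal, which at these flow rates would imply either far more running time or much higher average power settings.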
 
I believe it was at a P&W plant that they used the engines under test to drive generators and produce electric power.

Also, note that at USAAF Air Depots engines that had been overhauled were also subjected to test runs.
 
A fuel transfer pump failed, and they could not transfer the fuel from a bomb-bay auxiliary tank into the primary fuel system. This, coupled with a run on their main target (Kokura), multiple runs on their secondary target (Nagasaki), and hauling around 640 gallons (3,840 lb) of fuel they could not use, is why they ran dangerously low on fuel. They were ordered to bomb visually, not by radar, and to return to Tinian with the bomb if they could not. The bombardier claimed a magical "break" in the clouds allowed him to bomb visually. They missed the aiming point by 1.5 miles, which happens to be the average error for radar bombing, yet they still claimed a visual drop. They probably could not have reached Okinawa carrying both the 9,000 lb bomb and the useless extra fuel. IMO, they bombed by radar to lighten the load and give themselves a chance of reaching Okinawa. War is just the scientific application of violence to achieve a political end, and the bomb was the ultimate expression of this.
 