Airline Crash Due to Crew Errors


syscom3

Pacific Historian
In another thread, I made the comment that UAVs would be less prone to crashes because they don't have the human problem of occasional pilot/crew inattention. The following are just a few examples of civil accidents, but no doubt the military has its share of similar aircraft losses due to "stupid and preventable" pilot errors.

The following two crashes were due solely to crew errors and were totally preventable. UAVs wouldn't have these issues because they are not human and cannot be distracted.

Deradler, please look at the Eastern Air Lines crash and what the crew was doing: a jet crashed because the crew was troubleshooting an indicator lamp instead of flying the jet like they should have.


United Airlines Flight 173, registration N8082U, was a Douglas DC-8-61 en route from Stapleton International Airport in Denver to Portland International Airport on December 28, 1978. When the landing gear was lowered, only two of the green landing gear indicator lights came on. The plane circled in the vicinity of Portland while the crew investigated the problem. After about one hour the plane ran out of fuel and crashed in a sparsely populated area near 158th and East Burnside Street, killing 10 and seriously injuring 24 of the 189 on board.


Eastern Air Lines Flight 401 was a Lockheed L-1011 jet that crashed into the Florida Everglades on the night of December 29, 1972, causing 101 fatalities (77 people initially survived the crash, but two of them died shortly afterward). The crash was a controlled flight into terrain resulting from the flight crew's failure to monitor the flight instruments during a malfunction of the landing gear position indicator system.
.... After descending 250 feet from the selected altitude of 2,000 feet, a C-chord sounded from the rear speaker. This altitude alert, designed to warn the pilots of an inadvertent deviation from the selected altitude, went unnoticed by the crew. Investigators believe this was due to the crew being distracted by the nose gear light, and because the flight engineer was not in his seat when it sounded and so would not have been able to hear it. Visually, since it was nighttime and the aircraft was flying over the darkened terrain of the Everglades, there were no ground lights or other visual indications that the TriStar was slowly descending into the swamp...
 
UAVs wouldn't have these issues because they are not human and cannot be distracted.

Hi Sys. Are you advocating that there be no pilot/s on board to monitor the flight? Passengers board, press the button and away we go? Or someone sits and just monitors the system, but is not a pilot? In a way don't we already have some of this technology? Wasn't the Trident the first to pioneer automated landings?

Interesting. I can see pros and cons in what you're saying.

Pro. The technology is in control but the idiot pilot/technician overrides it and crashes. In this recent case the pilot ignored something like 15 audible warnings...

Garuda Indonesia Flight 200 - Wikipedia, the free encyclopedia

Con. The technology fails, and there's no pilot to take control and save the day. (I have an image of Hal from 2001 deliberately sending the passengers to their doom).
 

I'm just pointing out that airplanes crash because of stupid things the pilots and/or flight crew do. And automating the cockpit can prevent those types of crashes.

UAVs cannot be distracted; therefore some of these pilot-induced crashes would never have occurred.

Note - I chose these two crashes to prove my point ..... the flight crew was too busy looking at a broken lamp and failed to fly the plane.
 
Sys - both incidents are basically ancient history with regard to crew cockpit awareness, and newer avionics and autopilots now prevent errors like that from happening. Basically the pilots were not flying the aircraft. In the first one, it is SOP to try to troubleshoot something like a landing gear problem, but running out of fuel is a different story - and a UAV operator could potentially do the same thing and lose the UAV as well.

Neither incident supports your argument, and UAVs have the same potential for failure as manned aircraft.

Here....

Canadian Army's Elbit UAV grounded by malfunctions - Unmanned Aerial Vehicles (UAV) Blog Archive

ComPilots Aviation News Portal - UAV crashes in northern Gaza Strip

Accidents involving UAVs

 
You have to pick out situations where a manned aircraft was saved by the crew but a UAV would have been lost.

A UAV is piloted remotely, and anything human, or designed by humans, has an inbuilt idiot factor - that's just how it is.
 
Syscom's point is well taken, and certainly there is an inherent reduction in human factors in operating UAVs. But the human factors issues are not gone, just changed - and changed into areas that do not have the historical precedent from which to assess best practices and implement mitigation strategies.

This is the one that always comes to mind first.
___________________________________________________________
AIRFORCE TIMES
Pilot error blamed in August Predator crash

By Bruce Rolfsen - Staff writer
Posted : Friday Jan 26, 2007 13:28:13 EST

The crash of a remote-controlled MQ-1 Predator on Aug. 3 at Creech Air Force Base, Nev., resulted from a civilian contract pilot pushing the wrong button, an Air Force accident investigation board concluded in a report issued Thursday. The aircraft was assigned to the Predator formal training unit, the 11th Reconnaissance Squadron at Creech.

As the aircraft flew near the base at an altitude of about 500 feet, the pilot pressed the button he thought would retract the airplane's landing gear. Instead, the button shut down the engine.

The pilot couldn't restart the motor. He tried to steer the powerless plane to a runway, but the propeller-driven plane crashed. The total cost of the damage was pegged at $1.4 million.
 

Nice one Matt.

But this reminds me of the procedures we used during my stint at HUGHES when we were doing highly critical satellite maneuvers. Something like this was usually averted by something as simple as a confirmation prompt, such as "Do you want to do that [shut off the engine]?", or a "command disallowed" response.

The aircraft software could also be written to disallow an engine shutoff when there are no detectable problems and the altitude is less than "X" feet.
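A minimal sketch of that kind of interlock, in Python. The command names, the 2,000 ft threshold, and the fault flags are all assumptions for illustration, not anything from the actual Predator ground-control software:

```python
# Illustrative command gate: deny or question an engine shutdown unless the
# aircraft is high enough or a detected fault justifies it. Every name and
# threshold here is hypothetical, not the real UAS software.

MIN_SHUTDOWN_ALT_FT = 2000  # assumed "X feet" floor for a voluntary shutdown

def gate_command(command: str, altitude_ft: float, faults: set) -> str:
    """Return 'allow', 'confirm', or 'deny' for an operator command."""
    if command != "engine_shutdown":
        return "allow"          # only the risky command is gated in this sketch
    if faults:
        return "confirm"        # real problem detected: ask "do you want to do that?"
    if altitude_ft < MIN_SHUTDOWN_ALT_FT:
        return "deny"           # healthy engine, low altitude: command disallowed
    return "confirm"            # healthy but high: still require confirmation

# The Creech scenario: about 500 ft, no detected faults.
print(gate_command("engine_shutdown", 500, set()))   # -> 'deny'
print(gate_command("gear_retract", 500, set()))      # -> 'allow'
```

In this sketch the operator's intent is never overridden silently; the risky command is simply refused or echoed back for confirmation, which is the "Do you want to do that?" behavior described above.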

In the end though, this was human error, not UAV error.
 
In the end though, this was human error, not UAV error.
Who's operating the UAV????

Read the posts I put up - and I guarantee you there were PLENTY of UAV crashes due to operator error - more than we'll know, because some of the programs were classified.

The aircraft software could also be written to disallow an engine shutoff when there are no detectable problems and the altitude is less than "X" feet.

So you encounter a control problem where you have to shut down the engine at less than "X" feet and you can't - so you let the aircraft fly away, possibly into civilian airspace or into a populated area where it could crash and kill someone? :rolleyes:

I don't think "software" is directly controlling operator functions like landing gear operations and engine shutdown unless there are pre-programmed parameters that prohibit such operations in certain flight modes (flap and landing gear operation speeds, etc.) It sounds like the aircraft was in a window to allow that to happen.
 
I'm just pointing out that airplanes crash because of stupid things the pilots and/or flight crew do.

:rolleyes:

And cars crash too because of human error. Should we stop driving cars???

Accidents happen, but this is not a good argument for getting rid of manned aircraft. I would fly on an aircraft with a Pilot over a UAV any day!

Case closed, until you can come up with a better argument.
 
Who's operating the UAV????

Read the posts I put up - and I guarantee you there were PLENTY of UAV crashes due to operator error - more than we'll know, because some of the programs were classified.

If it was a human that caused the crash, it's their fault, not the UAV itself. The more you automate, the lower the probability of human-induced problems.

So you encounter a control problem where you have to shut down the engine at less than "X" feet and you can't - so you let the aircraft fly away, possibly into civilian airspace or into a populated area where it could crash and kill someone? :rolleyes:

The on-board computer will detect problems and allow an engine shutdown. If there are no problems, then there is no reason to shut down the engine. Come on flyboy, I know you have seen decision trees and how software is written around them.


I don't think "software" is directly controlling operator functions like landing gear operations and engine shutdown unless there are pre-programmed parameters that prohibit such operations in certain flight modes (flap and landing gear operation speeds, etc.) It sounds like the aircraft was in a window to allow that to happen.

I would think so. But a properly written program would disallow certain commands if conditions A, B, C, etc. are not met. Hey, it could have been a simple training flight where the ground personnel were being trained for certain scenarios, and this happened.

My point is, this was a preventable accident where the pilot or controller was solely at fault. An automated cockpit would not have allowed this to happen.
 
If it was a human that caused the crash, it's their fault, not the UAV itself. The more you automate, the lower the probability of human-induced problems.
The same could hold true for a manned aircraft.

The on-board computer will detect problems and allow an engine shutdown. If there are no problems, then there is no reason to shut down the engine. Come on flyboy, I know you have seen decision trees and how software is written around them.
What type of on-board computer? How much memory? How much authority are you going to give the computer over the UAV operator????

Decision trees? You don't fly an aircraft with decision trees; if you did, you'd have the Amazon Rain Forest in your flight manual.

I would think so. But a properly written program would disallow certain commands if conditions A, B, C, etc. are not met. Hey, it could have been a simple training flight where the ground personnel were being trained for certain scenarios, and this happened.
Could have, would have, should have - again you're coming up with stuff that is very ambiguous at best.
My point is, this was a preventable accident where the pilot or controller was solely at fault. An automated cockpit would not have allowed this to happen.
And again, what type of automated cockpit? Look at the other UAV failures - some of them occurred in an "automatic" mode, as you state. You're trying to apply technology that isn't there and isn't practical.
 
If it was a human that caused the crash, it's their fault, not the UAV itself.

Okay... but current technology does not support completely autonomous operation of "UAVs". It is more properly termed UAS. And while I understand your point, it is meaningless in the context of your arguments. It is tantamount to saying that your examples of pilot error in manned aircraft "are not the fault of the aircraft", but of the pilot. Human factors risk mitigation strategies can consist of operational procedures and/or design constraints. This is no different for UAS', where a ground controller is considered part of the system architecture. So I'm still confused about what your point is. More automation = fewer human factors concerns? Ask Boeing and Airbus about their latest airplanes. Human factors has become a rather important discipline in their latest airplane programs, where the cockpits have the most automation in their company history.

My point is, this was a preventable accident where the pilot or controller was solely at fault. An automated cockpit would not have allowed this to happen.

Solely? Not from my perspective. Proper systems engineering would have assessed such single-point failures and mitigated their occurrence. It's always a tradeoff, sys. You can remove the human from the decision loop, but at the expense of increasing your hardware/software complexity. Development assurance (high-to-low-level requirements trace, structural coverage, configuration management, quality assurance, etc.) can reduce the likelihood of encountering errors once fielded, but with millions of software lines of code (SLOC) this is no guarantee that all errors will be caught. Now add the complexity of most UAS' being built with commercial off-the-shelf hardware/software whose development assurance is unknown and cannot be reverse engineered, and you have only exacerbated your engineering problem. Not impossible, but demonstration/validation is much more difficult, expensive and fraught with safety risk.
 
Again Sys I ask this question:

Cars crash due to human error. Should we stop driving cars, or start using UMV's (Unmanned Motor Vehicles)?

This question is not rhetorical. Answer it...:rolleyes:

My point is, this was a preventable accident where the pilot or controller was solely at fault. An automated cockpit would not have allowed this to happen.

So automated cockpits can not fail? What do you do when it fails?
 
Again Sys I ask this question:

Cars crash due to human error. Should we stop driving cars, or start using UMV's (Unmanned Motor Vehicles)?

There was a German company or government research institute (I would like to say BMW, but I am not sure) that demonstrated "autopiloted" cars in both highway and city driving, and it worked marvelously. The only problem was that it was too expensive for consumers. But it worked. I also know there are some semi-automatic engine/throttle/steering controls that work like ABS does right now .... they prevent the car from becoming uncontrollable or unsteerable.

So automated cockpits can not fail? What do you do when it fails?

The AC crashes. But the failure points for an automated cockpit are far fewer than in a manned cockpit, simply because the human element is taken out.

A simple equation for you:
w = number of crashes solely due to pilot error
x = number of crashes due to a combination of human error and mechanical failure
y = number of crashes due to mechanical failure only
z = number of crashes due to events beyond the control of the AC or pilot (like bird strikes)

N = w + x + y + z

Now if we eliminate variable w, we are already < N,

and if we also factor in the part of x where pilot error made a bad situation worse or was the primary cause of the crash, then "N" is even lower.
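Put into numbers, just to make the bookkeeping of that argument concrete (the figures below are invented for illustration, not real accident statistics):

```python
# Hypothetical crash counts, chosen only to illustrate the arithmetic above.
w = 40   # solely pilot error
x = 30   # combined human error and mechanical failure
y = 20   # mechanical failure only
z = 10   # events beyond the control of the AC or pilot (e.g. bird strikes)

N = w + x + y + z                   # baseline: 100 crashes
no_pilot_error = N - w              # eliminate w entirely           -> 60
also_half_of_x = N - w - (x // 2)   # assume half of x is also avoided -> 45

print(N, no_pilot_error, also_half_of_x)
```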

Could a UAV that's automated to a high degree, with properly tested software to accommodate failures, still crash? Yes, statistically it will still happen. But it is still obvious that the UAV would be inherently safer than a manned AC. And even if there is an external operator flying the UAV, there's nothing complicated about writing "do not do or allow" software to prevent accidents or to keep the AC from operating outside its limits.
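One way to read the "operating the AC out of its limits" part is a simple envelope limiter that clamps operator commands to hard limits before they reach the flight controls. A toy sketch, with made-up limits and parameter names:

```python
# Toy envelope limiter: clamp commanded bank angle and airspeed to assumed
# limits before passing them to the autopilot. All numbers are illustrative.

MAX_BANK_DEG = 45.0
MIN_SPEED_KT, MAX_SPEED_KT = 70.0, 180.0

def clamp(value, lo, hi):
    return max(lo, min(hi, value))

def limit_commands(bank_deg, speed_kt):
    """Return the commands actually forwarded to the flight controls."""
    return (clamp(bank_deg, -MAX_BANK_DEG, MAX_BANK_DEG),
            clamp(speed_kt, MIN_SPEED_KT, MAX_SPEED_KT))

# An operator asks for a 70-degree bank at 60 knots; both requests get clipped.
print(limit_commands(70.0, 60.0))   # -> (45.0, 70.0)
```

The clamp itself is trivial; the hard part, as the replies below argue, is deciding and validating the limits for every flight condition.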
 
But the failure points for an automated cockpit are far fewer than in a manned cockpit, simply because the human element is taken out.

Utter falsehood.

A simple equation for you:
w = number of crashes solely due to pilot error
x = number of crashes due to a combination of human error and mechanical failure
y = number of crashes due to mechanical failure only
z = number of crashes due to events beyond the control of the AC or pilot (like bird strikes)

N = w + x + y + z

Now if we eliminate variable w, we are already < N,

and if we also factor in the part of x where pilot error made a bad situation worse or was the primary cause of the crash, then "N" is even lower.

I'll agree that it's a simple equation alright. :lol:

But it is still obvious that the UAV would be inherently safer than a manned AC. And even if there is an external operator flying the UAV, there's nothing complicated about writing "do not do or allow" software to prevent accidents or to keep the AC from operating outside its limits.

Obvious? Not in a million years. And statements that the application of sound human factors is not "complicated" are boundlessly naive. Surely you can't expect such statements to go unchallenged, sys. Properly applied human factors analyses are immeasurably complex and in many cases very subjective. And identifying systems engineering solutions as a means of mitigating human factors risks only adds to the complexity of the hardware and software development.
 
Could a UAV that's automated to a high degree, with properly tested software to accommodate failures, still crash? Yes, statistically it will still happen. But it is still obvious that the UAV would be inherently safer than a manned AC
Obvious? How can you make that claim with no real data to back up your statement? Do you have UAV operational statistics?

It depends on the operational environment and hours in the air - stress "would be."

Again I've heard the same argument with regards to arming fighters with guns, and the need for an air-to-air fighter. It's been in discussion for over 40 years now!
And even if there is an external operator flying the UAV, there's nothing complicated about writing "do not do or allow" software to prevent accidents or to keep the AC from operating outside its limits.
And describe SPECIFICALLY those "do not do or allow" parameters.

Again Sys - you're grasping at straws and providing no specifics. It is clear UAVs are the wave of the future, but how much autonomy will be given to them is questionable. Would you allow your family to fly in a UAV right now?
 
So just a process check here. Can you restate the point of this thread again? Is it to discuss whether UAS' are inherently safer than manned aircraft?
 
