# Airline Crash Due to Crew Errors



## syscom3 (May 3, 2009)

In another thread, I commented that UAVs would be less prone to crashes because they don't suffer from the human issue of occasional pilot/crew inattention. The following are just a few examples of civil accidents, but no doubt the military has its share of similar aircraft losses due to "stupid and preventable" pilot errors.

The following two crashes were due solely to crew error and were totally preventable. UAVs wouldn't have these issues because they are not human and potentially distracted.

Deradler, please look at the Eastern Airlines crash and what the crew was doing. A jet crashed because the crew was troubleshooting an indicator lamp instead of flying the jet like they should have been.


United Airlines Flight 173, registration N8082U, was a Douglas DC-8-61 en route from Stapleton International Airport in Denver to Portland International Airport on December 28, 1978. When the landing gear was lowered, only two of the green landing gear indicator lights came on. *The plane circled in the vicinity of Portland while the crew investigated the problem. After about one hour the plane ran out of fuel and crashed* in a sparsely populated area near 158th and East Burnside Street, killing 10 and seriously injuring 24 of the 189 on board.


Eastern Air Lines Flight 401 was a Lockheed L-1011 jet that crashed into the Florida Everglades on the night of December 29, 1972, causing 101 fatalities (77 initial crash survivors, two died shortly afterward). The crash was a controlled flight into terrain as a result of the *flight crew's failure to monitor the flight instruments during a malfunction of the landing gear position indicator system.*
.... After descending 250 feet from the selected altitude of 2000 feet a C-chord sounded from the rear speaker. This altitude alert, designed to warn the pilots of an inadvertent deviation from the selected altitude, went unnoticed by the crew. *Investigators believe this was due to the crew being distracted by the nose gear light,* and because the flight engineer was not in his seat when it sounded and so would not have been able to hear it. Visually, since it was nighttime and the aircraft was flying over the darkened terrain of the Everglades, there were no ground lights or other visual indications that the TriStar was slowly descending into the swamp...


----------



## Graeme (May 3, 2009)

syscom3 said:


> UAV's wouldn't have these issues because they are not human and potentially distracted.



Hi Sys. Are you advocating that there be no pilot/s on board to monitor the flight? Passengers board, press the button and away we go? Or someone sits and just monitors the system, but is not a pilot? In a way don't we already have some of this technology? Wasn't the Trident the first to pioneer automated landings?

Interesting. I can see pros and cons in what you're saying. 

Pro. The technology is in control but the idiot pilot/technician overrides it and crashes. In this recent case the pilot ignored something like 15 audible warnings...

Garuda Indonesia Flight 200 - Wikipedia, the free encyclopedia

Con. The technology fails, and there's no pilot to take control and save the day. (I have an image of Hal from 2001 deliberately sending the passengers to their doom).


----------



## syscom3 (May 3, 2009)

Graeme said:


> Hi Sys. Are you advocating that there be no pilot/s on board to monitor the flight? Passengers board, press the button and away we go? Or someone sits and just monitors the system, but is not a pilot? In a way don't we already have some of this technology? Wasn't the Trident the first to pioneer automated landings?
> 
> Interesting. I can see pros and cons in what you're saying.
> 
> ...



I'm just pointing out that airplanes crash because of stupid things the pilots and/or flight crew do. And automating the cockpit can prevent those types of crashes. 

UAV's cannot be distracted, therefore some of these pilot-induced crashes would never have occurred.

Note - I chose these two crashes to prove my point ..... the flight crew was too busy looking at a broken lamp and failed to fly the plane.


----------



## FLYBOYJ (May 3, 2009)

Sys - both incidents are basically ancient history with regard to crew cockpit awareness; newer avionics and autopilots prevent errors like that from happening. Basically the pilots were not flying the aircraft - in the first one, it is SOP to troubleshoot a problem like a landing gear issue, but running out of fuel is a different story - potentially a UAV operator could do the same thing and lose the UAV as well.

Neither incident supports your argument, and UAVs have the same potential for failure as manned aircraft.

Here....

Canadian Army’s Elbit UAV grounded by malfunctions Unmanned Aerial Vehicles (UAV) Blog Archive

ComPilots Aviation News Portal - UAV crashes in northern Gaza Strip

Accidents involving UAVs



----------



## The Basket (May 3, 2009)

You have to pick out situations where a manned aircraft was saved by the crew but a UAV would have been lost.

A UAV is piloted remotely, and anything human, or designed by humans, has an in-built idiot factor which just is.


----------



## FLYBOYJ (May 3, 2009)

The Basket said:


> You have to pick out situations where a manned aircraft was saved by the crew but a UAV would have been lost.


If I had the time today I could dig up a few


The Basket said:


> A UAV is piloted remotely, and anything human, or designed by humans, has an in-built idiot factor which just is.


True


----------



## Matt308 (May 3, 2009)

Syscom's point is well taken and certainly there is an inherent reduction in human factors in operating UAVs. But the human factors issues are not gone, just changed. And changed into areas that do not have the historical precedent from which to assess best practices and implement mitigation strategies.

This is the one that always comes to mind first.
___________________________________________________________
AIRFORCE TIMES
Pilot error blamed in August Predator crash

By Bruce Rolfsen - Staff writer
Posted : Friday Jan 26, 2007 13:28:13 EST

The crash of a remote-controlled MQ-1 Predator on Aug. 3 at Creech Air Force Base, Nev., resulted from a civilian contract pilot pushing the wrong button, an Air Force accident investigation board concluded in a report issued Thursday. The aircraft was assigned to the Predator formal training unit, the 11th Reconnaissance Squadron at Creech.

As the aircraft flew near the base at an altitude of about 500 feet, the pilot pressed the button he thought would retract the airplane’s landing gear. Instead, the button shut down the engine.

The pilot couldn’t restart the motor. He tried to steer the powerless plane to a runway, but the propeller-driven plane crashed. The total cost of the damage was pegged at $1.4 million.


----------



## syscom3 (May 3, 2009)

Matt308 said:


> Syscom's point is well taken and certainly there is an inherent reduction in human factors in operating UAVs. But the human factors issues are not gone, just changed. And changed into areas that do not have the historical precedent from which to assess best practices and implement mitigation strategies.
> 
> This is the one that always comes to mind first.
> ___________________________________________________________
> ...



Nice one Matt.

But this reminds me of the procedures we used during my stint at HUGHES when we were doing highly critical satellite maneuvers. Something like this was usually averted by something as simple as a warning, such as "Do you want to do that [shut off the engine]?", or "command disallowed".

The aircraft software could also be written to disallow an engine shutoff when there are no detectable problems and the altitude is less than "X" feet.
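A minimal sketch of the two guards described here - a confirmation prompt on destructive commands plus an altitude/health interlock. Every name and threshold below is invented for illustration; this is not real UAV ground-station software:

```python
# Hypothetical two-layer guard on an engine-shutdown command.
# All names and thresholds are invented for illustration.

MIN_SHUTDOWN_ALTITUDE_FT = 2000  # below this, a healthy engine stays on

def request_engine_shutdown(altitude_ft, faults, confirm):
    """Return True if the shutdown command is accepted."""
    # Interlock: refuse shutdown at low altitude with no detected problems.
    if altitude_ft < MIN_SHUTDOWN_ALTITUDE_FT and not faults:
        return False  # "command disallowed"
    # Confirmation: make the operator acknowledge a destructive command.
    return confirm("Do you want to shut off the engine?")

# Low and healthy: rejected outright, no prompt needed.
assert request_engine_shutdown(500, [], confirm=lambda msg: True) is False
# At altitude: still requires an explicit "yes" from the operator.
assert request_engine_shutdown(5000, [], confirm=lambda msg: True) is True
```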

In the end though, this was human error, not UAV error.


----------



## Matt308 (May 3, 2009)

syscom3 said:


> In the end though, this was human error, not UAV error.



But wasn't that your point about manned aircraft? I'm confused... again.


----------



## Matt308 (May 3, 2009)

syscom3 said:


> UAV's wouldn't have these issues because they are not human and potentially distracted.


----------



## FLYBOYJ (May 3, 2009)

syscom3 said:


> In the end though, this was human error, not UAV error.


Who's operating the UAV????

Read the posts I put up - and I guarantee you there were PLENTY of UAV crashes due to operator error - more than we'll know because some of the programs were classified.



syscom3 said:


> The aircraft software could also be written to disallow an engine shutoff when there are no detectable problems and the altitude is less than "X" feet.



So you encounter a control problem where you have to shut down the engine below "x" feet and you can't - so you let the aircraft fly away, possibly into civilian airspace or into a populated area where it could crash and kill someone?

I don't think "software" is directly controlling operator functions like landing gear operation and engine shutdown unless there are pre-programmed parameters that prohibit such operations in certain flight modes (flap and landing gear operating speeds, etc.). It sounds like the aircraft was in a window that allowed that to happen.


----------



## DerAdlerIstGelandet (May 3, 2009)

syscom3 said:


> I'm just pointing out that airplanes crash because of stupid things the pilots and/or flight crew do.



 

*And cars crash too because of human error. Should we stop driving cars???*

Accidents happen, but this is not a good argument for getting rid of manned aircraft. I would fly on an aircraft with a Pilot over a UAV any day!

Case closed, until you can come up with a better argument.


----------



## syscom3 (May 3, 2009)

FLYBOYJ said:


> Who's operating the UAV????
> 
> Read the posts I put up - and a guarantee you there were PLENTY of UAV crashes due to operator error - more than we'll know because some of the programs were classified.



If it was a human that caused the crash, it's their fault, not the UAV's. The more you automate, the lower the probability of human-induced problems.



> So you encounter a control problem where you have to shut down the engine less than "x" feet and you can't - so you let the aircraft fly away, possibly into civilian airspace or into a populated area where it could crash and kill someone?



The on-board computer will detect problems and allow an engine shutdown. If there are no problems, then there is no reason to shut down the engine. Come on Flyboy, I know you have seen decision trees and how software is written around them.
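A decision tree of this kind is just nested conditions. A toy version, with every branch condition invented here for illustration (a real system would have far more states and sensor inputs):

```python
# Toy decision tree: does the on-board computer permit an engine shutdown?
# All branch conditions and thresholds are invented for illustration.

def shutdown_permitted(on_ground, engine_fault, altitude_ft):
    if on_ground:
        return True      # always allowed on the ground
    if engine_fault:
        return True      # a detected problem justifies shutdown
    if altitude_ft >= 2000:
        return True      # enough altitude to glide or attempt a restart
    return False         # healthy engine, low altitude: no reason to stop it

assert shutdown_permitted(False, False, 500) is False  # refuse the bad command
assert shutdown_permitted(False, True, 500) is True    # real fault: allow it
```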




> I don't think "software" is directly controlling operator functions like landing gear operations and engine shutdown unless there are pre-programmed parameters that prohibit such operations in certain flight modes (flap and landing gear operation speeds, etc.) It sounds like the aircraft was in a window to allow that to happen.



I would think so. But a properly written program would disallow certain commands if conditions A, B, C etc. are not met. Hey, it could have been a simple training flight where the ground personnel were being trained for certain scenarios, and this happened.

My point is, this was a preventable accident where the pilot or controller was solely at fault. An automated cockpit would not have allowed this to happen.


----------



## FLYBOYJ (May 3, 2009)

syscom3 said:


> If it was a human that caused the crash, it's their fault, not the UAV's. The more you automate, the lower the probability of human-induced problems.


The same could hold true for a manned aircraft.



syscom3 said:


> The on-board computer will detect problems and allow an engine shutdown. If there are no problems, then there is no reason to shut down the engine. Come on Flyboy, I know you have seen decision trees and how software is written around them.


What type of on board computer? How much memory? How much authority are you going to give the computer over the UAV operator????

Decision trees? You don't fly an aircraft with decision trees, if you did you'd have the Amazon Rain Forest in your flight manual.



syscom3 said:


> I would think so. But a properly written program would disallow certain commands if conditions A, B, C etc. are not met. Hey, it could have been a simple training flight where the ground personnel were being trained for certain scenarios, and this happened.


Could have, would have, should have - again you're coming up with stuff that is very ambiguous at best.


syscom3 said:


> My point is, this was a preventable accident where the pilot or controller was solely at fault. An automated cockpit would not have allowed this to happen.


And again, what type of automatic cockpit? Look at the other UAV failures - some of them occurred in an "automatic" mode as you describe. You're trying to apply technology that isn't there and isn't practical.


----------



## Matt308 (May 3, 2009)

syscom3 said:


> If it was a human that caused the crash, it's their fault, not the UAV's.



Okay... but current technology does not support completely autonomous operation of "UAVs". It is more properly termed UAS. And while I understand your point, it is meaningless in the context of your arguments. It is tantamount to saying that your examples of pilot error in manned aircraft "are not the fault of the aircraft," but of the pilot. Human factors risk mitigation strategies can consist of operational procedures and/or design constraints. This is no different for UAS', where a ground controller is considered part of the system architecture. So I'm still confused about what your point is. More automation = fewer human factors concerns? Ask Boeing and Airbus about their latest airplanes. Human factors has become a rather important discipline in their latest airplane programs, where the cockpits have the most automation in company history.



syscom3 said:


> My point is, this was a preventable accident where the pilot or controller was solely at fault. An automated cockpit would not have allowed this to happen.



Solely? Not from my perspective. Proper systems engineering would have assessed such single-point failures and mitigated their occurrence. It's always a tradeoff, sys. You can remove the human from the decision loop, but at the expense of increasing your hardware/software complexity. Development assurance (high-to-low level requirements trace, structural coverage, configuration management, quality assurance, etc.) can reduce the likelihood of encountering errors once fielded, but with millions of software lines of code (SLOC) there is no guarantee that all errors will be caught. Now add the complexity of most UAS' being built with commercial off-the-shelf hardware/software whose development assurance is unknown and cannot be reverse engineered, and you have only exacerbated your engineering problem. Not impossible, but demonstration/validation is much more difficult, expensive, and fraught with safety risk.


----------



## DerAdlerIstGelandet (May 3, 2009)

Again Sys I ask this question:

Cars crash due to human error. Should we stop driving cars, or start using UMV's (Unmanned Motor Vehicles)?

This question is not rhetorical. Answer it...



syscom3 said:


> My point is, this was a preventable accident where the pilot or controller was solely at fault. An automated cockpit would not have allowed this to happen.



So automated cockpits can not fail? What do you do when it fails?


----------



## syscom3 (May 3, 2009)

> Again Sys I ask this question:
> 
> Cars crash due to human error. Should we stop driving cars, or start using UMV's (Unmanned Motor Vehicles)?



There was a German company or government research institute (I want to say BMW, but I am not sure) that demonstrated "autopiloted" cars in both highway and city driving, and it worked marvelously. The only problem was that it was too expensive for consumers. But it worked. I also know there are some semi-automatic engine/throttle/steering controls that work like ABS does right now .... they prevent the car from becoming uncontrollable or unsteerable.



> So automated cockpits can not fail? What do you do when it fails?



The AC crashes. But the failure points for an automated cockpit are far fewer than for a manned cockpit, simply because the human element is taken out.

A simple equation for you:
w = number of crashes solely due to pilot error
x = number of crashes due to a combination of human error and mechanical failure
y = number of crashes due to mechanical failure only
z = number of crashes due to events beyond the control of the AC or pilot (like bird strikes)

N = w + x + y + z

Now if we eliminate variable w, we are already below N,

and if we factor in part of x, where pilot error made a bad situation worse or was the primary cause of the crash, then "N" is even lower.
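With made-up counts (purely illustrative, not real accident statistics), the bookkeeping works out like this:

```python
# Invented crash counts, for illustration only - not real statistics.
w = 40   # solely pilot error
x = 25   # combined human error and mechanical failure
y = 20   # mechanical failure only
z = 15   # events beyond anyone's control (bird strikes, etc.)

N = w + x + y + z            # 100 crashes in total
remaining = N - w - 0.5 * x  # drop w, and say half of x was human-driven
assert remaining == 47.5     # roughly half the crashes, on these numbers
```

Of course, as others in the thread argue, full automation introduces failure modes of its own, which this simple subtraction ignores.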

Could a UAV that's automated to a high degree, with properly tested software to accommodate failures, still crash? Yes, statistically it will still happen. But it is still obvious that the UAV would be inherently safer than a manned AC. And even if there is an external operator flying the UAV, there's nothing complicated about writing "do not do or allow" software to prevent accidents or operation of the AC outside its limits.


----------



## Matt308 (May 3, 2009)

syscom3 said:


> But the failure points for an automated cockpit are far fewer than for a manned cockpit, simply because the human element is taken out.



Utter falsehood.



syscom3 said:


> A simple equation for you:
> w = number of crashes solely due to pilot error
> x = number of crashes due to a combination of human error and mechanical failure
> y= number of crashes due to mechanical failure only.
> ...



I'll agree that it's a simple equation, alright.



syscom3 said:


> But it is still obvious that the UAV would be inherently safer than a manned AC. And even if there is an external operator flying the UAV, there's nothing complicated in writing "do not do or allow" software to be written to prevent accidents, or operating the AC out of its limits.



Obvious? Not in a million years. And statements that application of sound human factors is not "complicated" are boundlessly naive. Surely you can't expect such statements to go unchallenged, sys. Properly applied human factors analyses are immeasurably complex and in many cases very subjective. And identifying systems engineering solutions as a means of mitigating human factors risks only adds to the complexity of the hardware and software development.


----------



## FLYBOYJ (May 3, 2009)

syscom3 said:


> Could a UAV that's automated to a high degree, with properly tested software to accommodate failures, still crash? Yes, statistically it will still happen. *But it is still obvious that the UAV would be inherently safer than a manned AC*


Obvious? How can you make that claim with no real data to back up your statement? Do you have UAV operational statistics?

It depends on operational environment and hours in the air - stress "would be."

Again I've heard the same argument with regards to arming fighters with guns, and the need for an air-to-air fighter. It's been in discussion for over 40 years now!


syscom3 said:


> And even if there is an external operator flying the UAV, there's nothing complicated about writing "do not do or allow" software to prevent accidents or operation of the AC outside its limits.


And describe SPECIFICALLY those "do not do or allow" parameters.

Again Sys - you're grabbing at straws and providing no specifics. It is clear UAVs are the wave of the future but how much autonomy will be given to them is questionable. Would you allow your family to fly in an unmanned UAV right now?


----------



## Matt308 (May 3, 2009)

So just a process check here. Can you restate the point of this thread again? Is it to discuss whether UAS' are inherently more safe than manned aircraft?


----------



## FLYBOYJ (May 3, 2009)

Matt308 said:


> So just a process check here. Can you restate the point of this thread again? Is it to discuss whether UAS' are inherently more safe than manned aircraft?



Well if it goes back to the first post, the two examples Sys gave were VERY poor.


----------



## Matt308 (May 3, 2009)

FLYBOYJ said:


> Would you allow your family to fly in an unmanned UAV right now?



Even if his answer is yes, there is not a civil aviation authority on the planet that would allow him to. 

I'm wondering if the point gets lost in the argument. Everyone seems to be in violent agreement that UAS' are a reality today, in the near future, and long-term. I'm just struggling with the point of this thread.


----------



## mkloby (May 3, 2009)

I'd just like to add my 2 cents, as I fly an aircraft that is highly automated on a daily basis.

Automation is a good thing, and can significantly reduce pilot workload, but it has to be used correctly.

Syscom - it is no secret that human error plays a part in most mishaps. Human error plays a factor in approximately 80-85% of mishaps, if I recall correctly. There will still be human factors with a human operating a UAV, no doubt.

Also, you need to be VERY careful about designing aircraft that restrict the pilot's freedom of action. Some automation is a great thing, but some can be very bad. You need to understand that aircraft systems often do not work as the engineers imagine. Parts fail in high-performance aircraft all the time. It is not a perfect world. When aircraft systems are tied together to a large extent through computers, software, data buses, etc., small part failures can have large consequences.


----------



## syscom3 (May 3, 2009)

Matt308 said:


> So just a process check here. Can you restate the point of this thread again? Is it to discuss whether UAS' are inherently more safe than manned aircraft?



Yes.


----------



## syscom3 (May 3, 2009)

mkloby said:


> Syscom - it is no secret that human error plays a part in most mishaps. Human error plays a factor in approximately 80-85% of mishaps, if I recall correctly. There will still be human factors with a human operating a UAV, no doubt



And if the human element is reduced even further, then the number of human-induced accidents goes even lower. And with current software and processors, there's nothing stopping a UAV or remotely piloted vehicle from being even safer, by preventing stupid things from happening in the first place.

MK, think about the human limitations on processing information and correctly coming up with a solution under extreme stress, with multiple failures and only a few seconds to "fix" it - and if you're not successful, you crash and die.

UAV's and computer-controlled cockpits don't have those limitations, thus by default they are safer.



> Parts fail in high performance aircraft all the time. It is not a perfect world. When aircraft systems are tied together to large extent through computers, software, data buses, etc., small part failures can have large consequences.



Yes, I've never doubted that. I am also a strong believer in testing in the real world, so that when automation takes place, it's not an ad hoc prototypical solution.

Also, you have to look at the accident statistics as a whole. If a completely automated system has a one-in-a-million chance of complete failure, and we will lose an AC once every year .... but .... the automation also means we have prevented 3 or 4 crashes in the same time period, then things are safer.
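That tradeoff is just a net-crash comparison. With the illustrative rates from this post (invented numbers, not real data):

```python
# Purely illustrative numbers from the post, not real accident data.
crashes_caused_by_automation_per_year = 1   # rare total-failure losses
crashes_prevented_per_year = 3              # human-error crashes avoided
net = crashes_prevented_per_year - crashes_caused_by_automation_per_year
assert net > 0  # on these assumed numbers, automation comes out ahead
```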


----------



## syscom3 (May 3, 2009)

> Obvious? Not in a million years. And statements that application of sound human factors is not "complicated" is boundlessly naive. Surely can't expect such statements to go unchallenged, sys. Properly applied human factors analyses are immeasurably complex and in many cases very subjective. And identifying system engineering solutions as means of mitigating human factors risks only adds to the complexity of the hardware and software development.



Here's some simple code for you:

"Do not allow the landing gear to be retracted until altitude "X" is reached."

"Do not allow the aircraft to enter a bank exceeding X degrees at Y airspeed, so as to prevent stalling or exceeding allowable gee loads."

"Do not allow the AC to touch down for landing if airspeed exceeds Z MPH and the runway length is too short."

Are you telling me this is complex? 
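Read literally, the three rules are easy to encode - here is a minimal sketch with invented names and thresholds. The pushback in the thread is that picking the right limits for every aircraft, runway, and situation is where the complexity actually lives:

```python
# Naive, literal translation of the three quoted rules. The thresholds
# are placeholders invented for illustration; choosing correct values
# per aircraft and runway is the hard part this sketch ignores.

GEAR_RETRACT_MIN_ALT_FT = 100
MAX_BANK_DEG = 60
MAX_TOUCHDOWN_SPEED_MPH = 160
MIN_RUNWAY_FT = 5000

def gear_retract_allowed(altitude_ft):
    return altitude_ft >= GEAR_RETRACT_MIN_ALT_FT

def bank_allowed(bank_deg, airspeed_mph):
    # A real limiter would couple the bank limit to airspeed and load factor.
    return bank_deg <= MAX_BANK_DEG

def touchdown_allowed(airspeed_mph, runway_ft):
    # Forbid only the combination "too fast AND runway too short".
    return not (airspeed_mph > MAX_TOUCHDOWN_SPEED_MPH
                and runway_ft < MIN_RUNWAY_FT)

assert gear_retract_allowed(50) is False   # too low to retract
assert touchdown_allowed(200, 12000) is True  # fast, but runway is long
```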

BTW, the B-2 and F-16 are unstable aircraft to fly. Don't you suppose that the avionics are what's allowing the plane to fly to begin with?


----------



## syscom3 (May 3, 2009)

FLYBOYJ said:


> Well if it goes back to the first post, the two examples Sys gave were VERY poor.



Why is that? Tell me why they are poor. In fact, they're textbook examples of human failure allowing an AC to crash.

And sorry, saying that modern crew flight procedures and discipline won't let that happen again is incorrect. As long as people are people, sooner or later cockpit discipline will become lax and flight procedures will be "overlooked" from time to time.

Just like that Russian captain who let his kids fly his Airbus. Who would have thought ......


----------



## FLYBOYJ (May 3, 2009)

syscom3 said:


> Why is that? Tell me why they are poor. In fact, they're textbook examples of human failure allowing an AC to crash.


And a UAV operator can do the same thing while troubleshooting a problem. The L-1011 is textbook for training scenarios and would also apply to UAVs.


syscom3 said:


> And sorry, saying that modern crew flight procedures and discipline won't let that happen again is incorrect. As long as people are people, sooner or later cockpit discipline will become lax and flight procedures will be "overlooked" from time to time.


And the same thing applies to any machinery. Add a computer to run it and you have rigidity in the decision-making ability.


syscom3 said:


> Just like that Russian captain who let his kids fly his airbus. Who would have thought ......


Unrelated to this discussion....


----------



## FLYBOYJ (May 3, 2009)

syscom3 said:


> Here's some simple code for you:
> 
> "Do not allow the landing gear to be retracted until altitude "X" is reached."


And you cannot gain immediate positive climb, something desirable in jet aircraft


syscom3 said:


> "Do not allow the aircraft to enter a bank exceeding X degrees at Y airspeed, so as to prevent stalling or exceeding allowable gee loads."


And now you're limiting the maneuverability of the aircraft.


syscom3 said:


> "Do not allow AC to touchdown for landing if air speed exceeds Z MPH and runway length is too short".


And how does the UAV determine runway length? What if you're operating from a dirt strip? If you have ice on the air vehicle you have to land at a higher-than-normal airspeed.


syscom3 said:


> Are you telling me this is complex?


It's not, but then you limit the capability of the unit. What I just described has to be determined by a human.


syscom3 said:


> BTW, the B-2 and F-16 are unstable aircraft to fly. Don't you suppose that the avionics are what's allowing the plane to fly to begin with?


They are, but the systems that allow those aircraft to fly are synthesized to allow human decision making in most of the flight envelope.


----------



## syscom3 (May 3, 2009)

Here's another example where pilot error went to the extreme and cockpit automation would have prevented this from happening.

The B-52 crash at Fairchild Air Force Base occurred on June 24, 1994, killing the four crew members of a United States Air Force (USAF) B-52 Stratofortress named Czar 52 during an airshow practice flight. In the crash, Bud Holland, who was the command pilot of the aircraft based at Fairchild Air Force Base, flew the aircraft beyond its operational limits and lost control. As a result, the aircraft stalled, hit the ground, and was destroyed.

1994 Fairchild Air Force Base B-52 crash - Wikipedia, the free encyclopedia


Here was a complete failure of pilot discipline and flight deck discipline. And if the AC had had a computer flying the plane that prevented prohibited or unsafe maneuvers, then this would not have occurred. And if the computer is flying the plane, why should there be a pilot?


----------



## FLYBOYJ (May 3, 2009)

syscom3 said:


> Here's another example where pilot error went to the extreme and cockpit automation would have prevented this from happening.
> 
> B-52 crash at Fairchild Air Force Base occurred on June 24, 1994, killing the four crew members of a United States Air Force (USAF) B-52 Stratofortress named Czar 52[1] during an airshow practice flight. In the crash, Bud Holland, who was the command pilot of the aircraft based at Fairchild Air Force Base, flew the aircraft beyond its operational limits and lost control. As a result, the aircraft stalled, hit the ground, and was destroyed.
> 
> ...


----------



## Matt308 (May 3, 2009)

You stole my thunder Joe. I was gonna cite the A320 crash. To this day the flight control laws of the Airbus aircraft are subject to heated debate. Boeing has the exact opposite philosophy, "You wanna bend the airframe, that is the pilot's call... not the FCC". My quote... not Boeing's.


----------



## FLYBOYJ (May 3, 2009)

Matt308 said:


> You stole my thunder Joe. I was gonna cite the A320 crash. To this day the flight control laws of the Airbus aircraft are subject to heated debate. Boeing has the exact opposite philosophy, "You wanna bend the airframe, that is the pilot's call... not the FCC". My quote... not Boeing's.



Here's more on this - on that site there are a few guys slamming the submitter and he does seem to be "Boeing Biased."

AirDisaster.Com: Investigations: Air France 296


----------



## DerAdlerIstGelandet (May 3, 2009)

syscom3 said:


> There was a German company or government research institute (I want to say BMW, but I am not sure) that demonstrated "autopiloted" cars in both highway and city driving, and it worked marvelously. The only problem was that it was too expensive for consumers. But it worked. I also know there are some semi-automatic engine/throttle/steering controls that work like ABS does right now .... they prevent the car from becoming uncontrollable or unsteerable.



The fact that you actually believe that computers can do the job more safely than a human is very troublesome. You worry me...



syscom3 said:


> The AC crashes. But the failure points for an automated cockpit are far fewer than manned cockpits, simply because the human element is taken out.
> 
> A simple equation for you:
> w = number of crashes solely due to pilot error
> ...



No, you are looking at this in black and white. There is a lot more to the picture than that.

Aircraft already fly themselves for the most part. Why are pilots needed? For when things go wrong. A computer can never replace a human in that area. 

Do accidents happen because of human error? Of course, but that does not make airline travel any less safe.



Matt308 said:


> Utter falsehood.



He obviously has never worked with government computer systems...



syscom3 said:


> Heres some simple code for you:
> 
> "Do not allow the landing gear to be retracted until altitude "X" is reached."
> 
> ...




Sys, that stuff is already built into onboard computers. That, however, does not take away the need for a pilot.



syscom3 said:


> Here's another example where pilot error went to the extreme and cockpit automation would have prevented this from happening.
> 
> B-52 crash at Fairchild Air Force Base occurred on June 24, 1994, killing the four crew members of a United States Air Force (USAF) B-52 Stratofortress named Czar 52[1] during an airshow practice flight. In the crash, Bud Holland, who was the command pilot of the aircraft based at Fairchild Air Force Base, flew the aircraft beyond its operational limits and lost control. As a result, the aircraft stalled, hit the ground, and was destroyed.
> 
> ...



I can do the same thing. Here is a perfect example of human error.

Mother Drives Car Into Apartment Building - cbs2.com

Who cares? My point is that you are doing nothing but grabbing at air here, and millions of people still drive cars...


----------



## DerAdlerIstGelandet (May 3, 2009)

I think everyone agrees and is well aware that automation in aircraft is a good thing. It really does help the pilot by taking away some of the workload. In the end, though, fact is fact, and something sys fails to realize is that you cannot completely replace a human in the cockpit (at least not at this time).


----------



## Matt308 (May 3, 2009)

syscom3 said:


> Here's some simple code for you:
> 
> "Do not allow landing gear to be retracted until altitude "X" is reached".
> 
> ...



Yes I am. Integrated Flight Control Computers with airframe flight limitations are a very significant engineering problem. Now your last example couples not only flight control algorithms, but also introduces dynamic modifications based upon aeronautical information service (AIS) parameters. These AIS parameters are identified in ICAO Annex 11, but are in no way standardized for civil aerodromes (airports). You have yet again made an engineering oversimplification in the "flowchart" you are working on...


----------



## FLYBOYJ (May 3, 2009)

UAV reliability

great info!


----------



## Matt308 (May 3, 2009)

Phhhttt... more facts to get into the way of a good argument.


----------






## syscom3 (May 3, 2009)

Flyboy, explain this one .... Perris Valley Airport, 1984. A drunk pilot crashed into a DC3. I missed this crash by an hour, having just jumped out of the DC3. Pilot error all the way. An automated cockpit wouldn't have allowed this to happen. Drunk pilots are an issue automated cockpits don't have to deal with. All the laws on the books didn't stop this pilot from some serious lapses in judgement.



> Just like that Russian captain who let his kids fly his airbus. Who would have thought ......
> Unrelated to this discussion....



So I take it you don't have an answer to pilots committing lapses in judgment? What about cockpit discipline and procedures? I suppose the pilot knew all about those and violated them anyway. Automated cockpits don't have these issues. They're not influenced by human frailties.

As for the A320 crash .... excellent point. And it obviously shows that thorough testing must be done to prevent errors like this from happening. But then ...... if this type of accident only happens once per decade, and the flight computers have prevented several crashes before then, then again, things are safer.

Flyboy, as for your comment about decision charts .... I am not trying to insult your intelligence, but these charts are used extensively in flight manuals and as the basis for automated flight controls, and have been for decades. So I apologize if I misread your statement or you were not clear in what you're saying.

Deradler ..


> "Sys, that stuff is already built into onboard computers. That, however, does not take away the need for a pilot."



I am pointing out that some pilot errors are just so stupid that they are, or can be, preventable by the onboard avionics. There is nothing complicated about an autopilot refusing to retract the landing gear if the aircraft is still on the ground, refusing to shut down an engine on takeoff if there are no detectable problems, or preventing a pilot from putting the AC into a maneuver where it will fail catastrophically or go out of control.
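The last of those three items is essentially envelope protection, which fly-by-wire airliners already implement in normal law. As a rough, hypothetical sketch of the idea (invented limits and names, not any real flight-control law):

```python
# Hypothetical envelope-protection sketch (invented limits and names --
# not any real flight-control law). Fly-by-wire "normal law" systems
# clamp pilot demands so the commanded attitude stays inside limits.

BANK_LIMIT_DEG = 67.0         # illustrative bank limit
PITCH_UP_LIMIT_DEG = 30.0     # illustrative nose-up limit
PITCH_DOWN_LIMIT_DEG = -15.0  # illustrative nose-down limit

def clamp(value: float, lo: float, hi: float) -> float:
    return max(lo, min(hi, value))

def limit_command(bank_deg: float, pitch_deg: float):
    """Return the demanded attitude, clamped to the protected envelope."""
    return (clamp(bank_deg, -BANK_LIMIT_DEG, BANK_LIMIT_DEG),
            clamp(pitch_deg, PITCH_DOWN_LIMIT_DEG, PITCH_UP_LIMIT_DEG))

# A 90-degree bank demand is trimmed back to the illustrative limit.
assert limit_command(90.0, 0.0) == (67.0, 0.0)
```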


----------



## pbfoot (May 3, 2009)

And how many lives were saved by having a real person in the driver's seat? Take the Gimli Glider, for one. It was a rare day when I was working that we didn't have some sort of emergency.


----------



## FLYBOYJ (May 3, 2009)

syscom3 said:


> Flyboy, explain this one .... Perris Valley Airport, 1984. A drunk pilot crashed into a DC3. I missed this crash by an hour, having just jumped out of the DC3. Pilot error all the way. An automated cockpit wouldn't have allowed this to happen. Drunk pilots are an issue automated cockpits don't have to deal with. All the laws on the books didn't stop this pilot from some serious lapses in judgement.


An automated cockpit on a GA airplane??? Again, more fantasy. How will any "automated system" prevent this? It looks like a Cessna 150 or 172 - so tell me, explain to me how this would be set up, and at what cost?! Sorry pal, another bad example.

There are thousands of people killed every year by drunk drivers - again I see little technology around to prevent the common Joe from getting behind the wheel drunk in the same manner you described above.

How about bus and train engineers - are we going to automate those functions as well? I mean, there are human-error fatalities there too.



syscom3 said:


> So I take it you dont have an answer to pilots committing lapses in judgment? What about cockpit discipline and prcedure's? I suppose the pilot knew all about those and violated them anyways. Automated cockpits dont have these issues. They're not influenced by human frailties.


And automatic cockpits cannot totally replace pilots in many operations such as you described. They cannot make judgements in certain conditions that only a human can see or feel, nor can any type of artificial intelligence respond to passengers in the same manner - can an automated cockpit see immediately that a passenger is suffering from hypoxia? Oh yeah, the computer will take care of that, but so far you have failed to say what type of computer and how it will function. I'm sorry, but your examples are half fantasy and half wishful thinking. In many cases it's the human frailties that are the most important, and only another human at the controls can fully deal with them.


syscom3 said:


> As for the A320 crash .... excellent point. And it obviously shows that thourough testing must be done to prevent errors like this from happening. But then ...... if this type of accident only happens once per decade, and the flight computers have prevented several crashes before then, then again, things are safer.


To a point they are, but when you say safer, what operation are you talking about? Military? Airlines? GA?


syscom3 said:


> Flyboy, as for your comment about decision charts .... I am not trying to insult your intelligence, but these charts are used extensively in flight manuals and as the basis for automated flight controls, and have been for decades. So I apologize if I misread your statement or you were not clear in what youre saying.


Decision charts or decision trees? Show me a manual that has one in the manner you described. There are troubleshooting guides for the maintainers that have these, but give me a specific flight manual that you have seen in this format.

You keep saying that this stuff is out there; well, if it's that viable, everyone would have jumped on it five years ago. Again, I agree UAVs will be the wave of the future, but we are decades away from even beginning to think about having fully autonomous aircraft carry passengers.


----------



## FLYBOYJ (May 3, 2009)

syscom3 said:


> There is nothing complicated about an autopilot refusing to retract the landing gear if the aircraft is still on the ground, refusing to shut down an engine on takeoff if there are no detectable problems, or preventing a pilot from putting the AC into a maneuver where it will fail catastrophically or go out of control.


Actually, there are electrical and mechanical devices that do what you describe. An "autopilot" does one thing - fly the aircraft on a specific programmed course.


----------



## FLYBOYJ (May 3, 2009)

syscom3 said:


> As for the A320 crash .... excellent point. And it obviously shows that thorough testing must be done to prevent errors like this from happening. But then ...... *if this type of accident only happens once per decade*, and the flight computers have prevented several crashes before then, then again, things are safer.



Do They????

http://rgl.faa.gov/Regulatory_and_Guidance_Library/rgAD.nsf/0/88952b887f210c988625707e005944f2/$FILE/2005-19-10.pdf

Air Canada 190 - A319 Autopilot Failure?  Tech Ops Forum | Airliners.net

A320 flight crew experiences dual flight augmentation computer failure shortly after takeoff causing...


----------



## BombTaxi (May 3, 2009)

I have been following these UAV threads for a while, and would just like to ask a question, if I may  

sys, the basis of your argument is that humans are inherently prone to making errors of judgment that crash planes, correct? Yet all present-day UAVs are flown by human pilots on the ground, which means a UAV is no safer than a manned aircraft, because your primary source of error is still there to mess things up.

The alternative that you seem to have assumed is that UAVs can be flown entirely by computers. While an admirable aspiration, this simply isn't possible in the present timeframe or, IMHO, anytime soon. AIs are still fairly rudimentary, struggling to pass the Turing Test, never mind fly a fighter. A computer can only make decisions in black or white - 1 or 0. As I think any account of flying or aerial combat makes clear, there is a lot of grey involved in aerial warfare, which the present state of the computing art cannot deal with. This is why we still depend on squishy, error-prone humans to take these massively advanced weapon systems into combat...

So, do you envision UAVs as being 'flown' by remote human operators (in which case you render your own argument null and void), or as being flown by AIs we do not yet possess, nor seem to be capable of developing soon (in which case your concept is pure speculation at best)?


----------



## FLYBOYJ (May 3, 2009)

BombTaxi said:


> in which case your concept is pure speculation at best



I think that sums it up!


----------



## gumbyk (May 3, 2009)

Umm, how many members of the public would willingly get into an aircraft without a pilot sitting up the front who can do the flying? I would hazard a guess that the answer is not many (almost certainly not enough to make the proposition a commercial reality).

Sys, do you fly? (As a pilot, I mean, not a pax.) This thread would suggest that you don't, as I have never heard a pilot even bring up this topic. Maybe if you did, you would see the other points of view, and not resort to blindly sticking to the statistics.

Issues on landing such as standing water/ice on the runway, wind gusts, wind-shear, all mean that automation in every circumstance is pretty near impossible. 

And, yes, there are occasions when you have to bust a limitation to avoid bending the aircraft. It's a judgment call, and no computer out there can make a judgment on which rules can be 'bent' safely, and which options to take.

I sure as hell know I wouldn't get in an unpiloted spam can...


----------



## Matt308 (May 3, 2009)

gumbyk said:


> I sure as hell know I wouldn't get in an unpiloted spam can...



Me either... yet. But the time is coming.

This is what is so disconcerting about the discussion. Everyone agrees that UAS' are the wave of the future. Perhaps it is the timeline that Sys proposes that hangs us all up. It is NOT near future. Hell, industry can't even automate routine ATC voice clearances via data link in the next 5 years, let alone integrate UAS' into the NAS.

And what many people don't realize are the categorical differences in UAV capabilities. Remember there are those that are almost fully autonomous like the Globalhawk/Predator/EagleEye/Hummingbird class. And then there are those that are less functional like the ScanEagle or other tactical applications.


----------



## FLYBOYJ (May 3, 2009)

Matt308 said:


> And what many people don't realize are the categorical differences in UAV capabilities. Remember there are those that are *almost* fully autonomous like the Globalhawk/Predator/EagleEye/Hummingbird class. And then there are those that are less functional like the ScanEagle or other tactical applications.


And let's stress "almost."


----------



## Matt308 (May 3, 2009)

That is a lost adjective, isn't it?


_View: https://www.youtube.com/watch?v=byS2UcA1QKk_


----------



## syscom3 (May 3, 2009)

pbfoot said:


> And how many lives were saved by having a real person in the driver's seat? Take the Gimli Glider, for one. It was a rare day when I was working that we didn't have some sort of emergency.



Another good point. And its lessons were learned, so as to make sure the probability of it happening again was minimized.

Human error completely.


----------



## syscom3 (May 3, 2009)

> An automated cockpit on a GA airplane???Again more fantasy. How will any "automated system" prevent this? Looks like a Cessna 150 or 172 - so tell me, explain to me how this would be set up and the cost?!?!?! Sorry pal, another bad example.



A 172. An automated cockpit would have landed the airplane safely and then taxied it safely without running into people or other structures.



> There are thousands of people killed every year by drunk drivers - again I see little technology around to prevent the common Joe from getting behind the wheel drunk in the same manner you described above.



Ever hear of ignition locks?

> How about bus and train engineers - are we going to automate those functions as well? I mean, there are human-error fatalities there too.



> can an automated cockpit see immediately that a passenger is suffering from hypoxia?



Sounds like something a flight attendant can do. BTW, avionics are immune to altitude and are not subject to hypoxia. Are you willing to debate me on the issues of pilot incapacitation due to oxygen starvation vs automated cockpits?



> To a point they are, but when you safer, in what operation are you talking about? Military? Airlines? GA?



All of them.



> Decision charts or decision trees? Show me a manual that has one in the manner you described. There are trouble shooting guides for the maintainers that have these, but give me a specific flight manual that you have seen that has this format.



Again, I am not insulting you, nor do I intend to do so. But just WTF are you talking about? Again, I might be misreading you, so again I say, I am not insulting you .... but the first instance I know of for decision charts was the B-17 pilot's checklist. Don't you suppose decision charts and diagrams have been used since 1939?

And yes, part of my current job is to look at failure modes and see what the symptoms are and what our monitoring program will report.





> .....decades away from even beginning to think about having fully autonomous aircraft carry passengers.



Decades? The future is far sooner than that. But yes, I agree that when it comes to civil aviation, it will take some time. But what about military aviation, where the pilot risk is high?


----------



## FLYBOYJ (May 4, 2009)

syscom3 said:


> A 172. An automated cockpit would have landed the airplane safely and then taxied it safely without running into people or other structures.


Again, how will it work? What types of servos are connected to the flight controls? How do you integrate this system with the mixture controls? How will an automated cockpit know where and how far to taxi the aircraft? How about the weight of the system, as GA aircraft don't have much useful load? *Cost?* This rationale is like me saying "we could build a plane that could fold up into a suitcase, just like the Jetsons." Answer these questions and I'll start to take you seriously.



syscom3 said:


> -Sounds like something a flight attendant can do. BTW, avionics are immune to altitude and are not subject to hypoxia.


Why have a flight attendant, just put a robot in place! 


syscom3 said:


> Are you willing to debate me on the issues of pilot incapacitation due to oxygen starvation vs automated cockpits?


No, but I will point out the silliness of some of your claims.



syscom3 said:


> -
> Again, I am not insulting you, nor do I intend to do so. But just WTF are you talking about? Again, I might be misreading you, so again I say, I am not insulting you .... but the first instance I know of for *decision charts* was the B-17 pilot's checklist. Don't you suppose decision charts and diagrams have been used since 1939?


And again, show me one - I have a B-17 pilot's manual and I see no so-called *"decision chart."* There are performance charts, weight and balance charts, fuel consumption charts, etc., so tell me what YOU'RE talking about?!

I'll put this out to the other pilots - Has anyone ever heard of the charts in a -1 or POH called "decision charts?"


----------



## Matt308 (May 4, 2009)

sys is thinking like an engineer again (e.g., FMEA, fault trees, FHA, etc.).

And just to correct sys again: avionics ARE susceptible to altitude. That is why civil avionics must be tested and qualified to the environmental considerations contained in RTCA, Inc. DO-160 Section 4. Military avionics have similar standards. Working for Hughes, surely you would know that.


----------



## DerAdlerIstGelandet (May 4, 2009)

syscom3 said:


> Flyboy, explain this one .... Perris Valley Airport, 1984. A drunk pilot crashed into a DC3. I missed this crash by an hour, having just jumped out of the DC3. Pilot error all the way. An automated cockpit wouldn't have allowed this to happen. Drunk pilots are an issue automated cockpits don't have to deal with. All the laws on the books didn't stop this pilot from some serious lapses in judgement.



Explain this one to me, sys: a drunk driver crashes his Ford Explorer into a Chevy S10. All the laws on the books didn't stop this driver from some serious lapses of judgement.

Get my point? You are still grabbing at air...



syscom3 said:


> Deradler ..
> 
> I am pointing out that some pilot errors are just so stupid that they are, or can be, preventable by the onboard avionics. There is nothing complicated about an autopilot refusing to retract the landing gear if the aircraft is still on the ground, refusing to shut down an engine on takeoff if there are no detectable problems, or preventing a pilot from putting the AC into a maneuver where it will fail catastrophically or go out of control.



Sys, there are already systems in place that prevent a pilot from doing such things.

As for the landing gear on the ground: you do not even need an autopilot to keep you from doing that. That would be overkill! *Ever heard of a WOW switch?* Probably not, since you think that something needs to be put in place to keep you from retracting the wheels while on the ground.

Do a search for a WOW switch, and come back to me when you are ready. Okay?

But then again, since I have caught you in something you have no clue about, you will say that my post is rhetorical and not respond to it.

There are, again, already systems in place that keep pilots from doing these things. You are not describing anything new.

Again you are doing nothing but grabbing at air at something you do not seem to grasp.


----------



## DerAdlerIstGelandet (May 4, 2009)

pbfoot said:


> And how many lives were saved by having a real person in the drivers seat example the Gimli Glider for one , it was a rare day when I was working that we didn't have some sort of emergency



Ditto, I would not want to fly an aircraft that has no pilot.


----------



## evangilder (May 4, 2009)

This is getting ridiculous. Why the hell would anyone want a GA plane that flies itself??? I sure as hell wouldn't buy one, nor want to fly in one. A big part of GA is people who learn to fly and love to fly. I can't think of any GA or warbird pilot who thinks flying is "just okay". They love it, and would never give up piloting the airplane to a machine.

I am not a pilot, but have spent quite a bit of time in the cockpit of many different types the last few years. I fly with people I know and trust, and people with the judgment to tell me the pilot is good to go. Part of that is also the kinship and camaraderie that goes with it. 

Are there errors in judgment that cause crashes? Sure. Are there errors in judgment in cars that cause crashes? Yes. There are going to be people who make errors, and in some cases that can cause things to go wrong. But does that mean that we should replace all of that with automated controls? No.

Just because you can do something doesn't always mean you should. I honestly wouldn't want to fly in a pilotless craft.


----------



## DerAdlerIstGelandet (May 4, 2009)

evangilder said:


> This is getting ridiculous.



Agreed, I am going to go and eat now. I am getting hungry waiting for sys to grab at more air.


----------



## The Basket (May 4, 2009)

The Airbus crash is slightly misleading, as the aircraft was already at a high angle of attack...for a civvie jet...and the increase in power wasn't available because the engines were full of trees.

American Airlines Flight 96 in 1972 is probably the greatest feat of civilian flying and that is something a computer will never do...have the will to survive.


----------



## DerAdlerIstGelandet (May 4, 2009)

The Basket said:


> American Airlines Flight 96 in 1972 is probably the greatest feat of civilian flying and that is something a computer will never do...have the will to survive.



Agreed. I am sure that if two lists were made, one listing every crash that was attributed to human error and a second listing every crash that was avoided because the pilot did his damn job, the second list would be longer.

People make mistakes; no one is going to argue that. Let's just be realistic and not play around in a fantasy world.


----------



## FLYBOYJ (May 4, 2009)

As far as more "computers" in aircraft: in terms of a GA bird, look at the Cirrus SR-22; it seems like one crashes each month. There have also been some real foul-ups because of the ballistic chute. Things that should have made the plane safer are actually making it more dangerous.


----------



## evangilder (May 4, 2009)

I sometimes think the safer you make an airplane, the dumber the things that people do in them. Saying an airplane is uncrashable or completely safe though is dangerous thinking. I seem to recall a certain ship that was declared unsinkable...


----------



## gumbyk (May 5, 2009)

> American Airlines Flight 96 in 1972 is probably the greatest feat of civilian flying and that is something a computer will never do...have the will to survive.


Also, UAL Flight 232 at Sioux City is right up there - not a perfect outcome, but far better than most would think, given the circumstances.

From what I've heard, pilots in a simulator couldn't reproduce the outcome... There's something to be said for having your ass on the line.


I'm with Eric: more automation hasn't made cars any safer, so why would it make aircraft safer? It just seems to make people stop thinking when operating them.


----------



## Kiwikid (May 23, 2009)

Hmm, please don't anybody mention Turkish Airlines FLT 1951 at Schiphol.

The radio altimeter gave false height signals and the autothrottle commanded the throttles to close at an altitude of 400 ft. What the Dutch Safety Board did not disclose was that they had kept secret 14 similar instances of false radio altimeter signals on KLM flights causing premature throttle closures on approach. The Dutch do not want anyone questioning electromagnetic interference with flights.
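A single bad radio altimeter driving the autothrottle is exactly the failure a cross-channel comparison is supposed to catch. As a rough, hypothetical sketch of the idea (invented threshold and names, not the actual 737 logic):

```python
# Hypothetical dual-channel cross-check sketch (invented threshold and
# names -- not the actual 737 logic). With two radio altimeters, a big
# disagreement should inhibit the automatic throttle retard rather than
# let one faulty channel drive the autothrottle.

DISAGREE_LIMIT_FT = 50.0

def retard_permitted(ra1_ft: float, ra2_ft: float,
                     flare_alt_ft: float = 27.0) -> bool:
    """Permit retard only if both channels agree and both show the
    aircraft is actually low enough to be in the flare."""
    if abs(ra1_ft - ra2_ft) > DISAGREE_LIMIT_FT:
        return False  # channels disagree: inhibit and annunciate
    return max(ra1_ft, ra2_ft) <= flare_alt_ft

# One channel reading -8 ft at height disagrees with the good channel,
# so the retard is inhibited instead of closing the throttles.
assert retard_permitted(-8.0, 1950.0) is False
```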

I have a few hours on the Boeing 738. Autopilots and autothrottles are sure fun labour-saving devices in the cockpit, but give me a real pilot in the cockpit any day.

Cough cough... please don't mention Colgan Air FLT 3407, okay?


----------



## syscom3 (May 23, 2009)

gumbyk said:


> Also, UAL Flight 232 at Sioux City is right up there - not a perfect outcome, but far better than most would think, given the circumstances.
> 
> From what I've heard, pilots in a simulator couldn't reproduce the outcome... There's something to be said for having your ass on the line.
> 
> ...



Automation prevents stupid errors from being performed in the first place.


----------



## FLYBOYJ (May 23, 2009)

syscom3 said:


> Automation prevents stupid errors from being performed in the first place.



Great - can you explain what part of the normal flight profile you're talking about? The SR22 has all kinds of "automation" and yet people are slamming them into the ground at record rates - why is that?


----------



## syscom3 (May 23, 2009)

FLYBOYJ said:


> Great - can you explain what part of the normal flight profile you're talking about? The SR22 has all kinds of "automation" and yet people are slamming them into the ground at record rates - why is that?



If it's pilot-induced error, then it's preventable.


----------



## syscom3 (May 23, 2009)

Kiwikid said:


> Hmm, please don't anybody mention Turkish Airlines FLT 1951 at Schiphol.
> 
> The radio altimeter gave false height signals and the autothrottle commanded the throttles to close at an altitude of 400 ft. What the Dutch Safety Board did not disclose was that they had kept secret 14 similar instances of false radio altimeter signals on KLM flights causing premature throttle closures on approach. The Dutch do not want anyone questioning electromagnetic interference with flights.



Link, please, to the summary of the cause of the crash?

And if what you're saying is true, why is it only happening at this airport?


----------



## FLYBOYJ (May 23, 2009)

syscom3 said:


> If it's pilot-induced error, then it's preventable.


But this "automated" equipment has to be operated!!!! It boils down to learning how to operate it or using something simpler, and in GA aircraft it doesn't get any simpler or safer than a well-trained pilot.


----------



## Matt308 (May 23, 2009)

syscom3 said:


> If it's pilot-induced error, then it's preventable.



Why do you always assume that more automation (system engineering) is inherently safer than humans? And for what operational scenarios?

They are both inherently covered under the "system engineering" mantra. If system engineering (i.e., proper specification of high-to-low-level requirements, structural coverage, module testing, verification and validation testing, human factors, crew resource management principles, etc.) is so flawless, why don't we catch more accident/incident causal factors directly related to human-machine interface shortcomings?

[I know, big words again. I'm sorry.]


----------

