From: (Jean-Francois Bosc)
Date:         06 Jun 95 10:11:04 

In article <airliners.1995.695@ohare.Chicago.COM>, (FMCDave) writes:
> Mr. Bosc writes
> >Now, my opinion is that systems are safer than pilots.

> Wow, I guess I have to disagree with that.  While we can point to
> many accidents as being "pilot induced", I think we might have a
> hard time absolving the participation of the system in those failure
> responses.  I guess that I would also like to state that I have
> had the opportunity to work closely with some fine Airbus engineers
> in safety related industry forums (specifically RTCA SC-167 and
> Working Group 12 from EUROCAE) and feel that your opinion (that
> systems are safer than pilots) is not held by them.  While there
> is a basic difference in philosophy, I think that the Airbus
> engineers understand the integrity requirements and limitations of
> aircraft systems.

I didn't say that we are ready to eliminate pilots right now.
So OK, maybe a fully automatic aircraft is not safer than the
current pilot-system _cooperation_ (but systems have replaced
humans for many functions, right?)
Nonetheless, I'm certain that it will come. Thirty years from
now, ATC will have to be totally automatic, and that raises
far more difficulties than the automation of a single aircraft.
There will still be unpredicted cases, and they will certainly
cause losses. But at some point it will become safer to accept
those losses than pilot-induced losses.

I think the main point is that systems are vigilant 60 seconds
a minute and 24 hours a day. And software reliability is
increasing, and will keep increasing.

I understand that most people will feel uncomfortable hearing
that a computer holds their lives in its hands, but it's an
irrational feeling. I also understand that pilots feel
self-confident enough, and won't be pleased to see their
job evolve in that way. A pilot of that mindset should even
miss the "fly-by-guts" DC-6 :)

In another post somebody cited nuclear power stations as a
system that worried him. But all major nuclear failures were
human failures, and there would have been many more accidents
without automation.

One of the consequences of automation is that at some point
humans lose their sense of "what's going on", and therefore
become useless. Even if something goes wrong, they won't have
the ability to react correctly.