From: (James R Ebright)
Organization: The Ohio State University
Date:         15 Jun 95 14:25:25 
References:   1 2 3

In article <>,
Jean-Francois Bosc <> wrote:
>In another post somebody cited nuclear stations as a system
>that worried him. But all major nuclear failures were human
>failures, and there would have been many more accidents
>without automation.

Three Mile Island was a man-machine interface problem as much as
it was a man-decision problem.  Some have argued that the bad decisions by
the operating staff could have been avoided with better information
presented to the operators.  (Of course, the fact that all the senior
operational staff were at a major staff party at the time...leaving only
'low men on the totem pole' to run the reactors...might have aggravated
the problem... [to the best of my knowledge, this has never been
publicly reported])

>One of the consequences of automation is that at some point
>humans lose their knowledge of "what's going on", and therefore
>become useless. Even if something goes wrong, they won't have
>the ability to react correctly.

I think that is precisely the point in this thread...the removal of
feedback to the pilots by the A320-type systems CAUSES humans to lose
knowledge of "what's going on".  To then use that loss as a reason to
remove them from the control loop seems like a circular argument.

 A/~~\A   'moo2u from osu'   Jim Ebright   e-mail:
((0  0))_______  "'Eternal Vigilance Is The Price of Liberty' used to mean
  \  /    the  \  we watched the government - not the other way around."
  (--)\   OSU  | 			- Bill Stewart, AT&T