Autonomous Landing Guidance System (ALG)

From:         hrz090@aixrs1.hrz.uni-essen.de (Dr. Erdelen)
Date:         07 Dec 94 01:15:46 

In its issue of Nov 2, HPCwire, an e-zine type information service on
High Performance Computing affairs, describes a new Autonomous Landing
Guidance (ALG) system. According to the article,

"[...] ALG enables commercial and military aircraft pilots to land in
foggy conditions. ALG provides a clear real-time view of the runway and
ground, even in the worst visibility conditions, through the use of a virtual
reality heads-up display. Any airplane equipped with the ALG system could
land in low visibility conditions (CAT III) at unmodified runways around the
world. Only 41 U.S. airports are modified for CAT III landings."
[...]

The article goes on to describe in some detail the computer technology
(CNAPS: Coprocessing Nodes Architecture for Parallel Systems, by Adaptive
Solutions) used by the system to process "sensor information", but does
not mention what type of sensors are used.

Can anyone supply more details?

(The institutions involved in the project:
"Lear Astronics Corp. (Santa Monica and Ontario, Calif.) is serving as the
lead member of the ALG consortium, with responsibility for system integration
and coordination. Other members of the consortium include Northwest Airlines,
United Airlines, the U.S. Air Force and the Maryland Advanced Development
Laboratory.
Wright Laboratory is leading the government research and development team
consisting of Wright Laboratories, NASA Ames, Rome Air Development Center,
and associated government laboratories.")

[A trial subscription of HPCwire can be obtained by sending an e-mail, with
 no text in the body, to trial@hpcwire.ans.net. The article quoted above has
 the reference number 4824.]

Martin Erdelen