
Current and Future Developments in Cockpit Design

Are We Heading for the Pilotless Combat Aircraft?

Although the basics of flying and the equipment are more or less standard for every aircraft, it takes many years of training before a pilot can be declared proficient to fly a fighter of the latest generation. Experts agree that the cockpit design of the McDonnell Douglas F-18 is one of the best available today, but it also demands almost superhuman effort to master all the incorporated features. The multiplicity of electronic systems designed to aid the pilot might actually complicate his job. The cockpit features three Head-Down Displays (HDD), each of which is surrounded by 20 programmable push-buttons in lieu of the customary nav/attack controls of earlier aircraft. A single integrated electronic engine display replaces the customary dials. The HUD (Head-Up Display) is the primary instrument for both navigation and combat and shows in condensed form the information displayed on the HDDs. Small conventional instruments (attitude, airspeed, climb rate and altitude) are retained as back-ups, while communication and life support system controls, banks of circuit breakers, etc. further clutter up the cockpit.

The pilot has to memorize over 675 acronyms that are liable to appear on any of the three HDDs. There are 177 different symbols available for the HDDs, and each of these can appear in four different sizes. In addition, 73 threat indications, warnings, cautions and advisory messages may appear at any time. There are 59 indicator lamps, and 22 different HUD configurations can be selected; although they employ the same basic symbology, the symbols appear in different locations on the combiner. Some 40 display formats can be called up on the HDDs.

Below the HUD a panel controls the operation of two radios, the ILS, JTIDS and other data links, TACAN, ADF and the autopilot. Nine switches, most of them multi-function, are mounted on the throttle lever alone, and seven more are integrated in the control stick.

The reader may try to imagine a pilot flying his aircraft in an active combat situation. He may be engaged in a ground-attack mission, flying at very low level and high speed with hostile guns and missiles threatening him from all sides, or he may be involved in a free-for-all dogfight, plagued by an occasional gray-out due to high g-loads. Moreover, he has to be prepared to react to electronic threats - the disregard of which may mean his untimely end. Now, will he be able, under such conditions, to memorize the enormous amount of information needed to operate his cockpit systems? Granted, in combat the pilot does not require those options offered by the cockpit which pertain to navigation, take-off and landing, but enough remains, and experience shows that many aviators become so stressed that they end up making mistakes. Misinterpreting or disregarding the information displayed can culminate in a catastrophic situation or, at best, an aborted mission.

There is no doubt that the F-18 cockpit and its man-machine interface are of fantastic and very advanced design - in fact the best technology can provide today. But neither can it be denied that its efficient operation and the exploitation of all its features can only be handled by a few elite pilots in perfect mental and physical condition, selected after a long training period. In essence, this "superman" has to perform brilliantly as pilot, weapons operator, flight engineer, navigator, computer expert and wireless operator. Unfortunately, current cockpit designs would have one believe that electro-optic displays, microprocessors and certain operating aids reduce the pilot's workload. This is not so. What the combat pilot urgently requires is the automatic display of pertinent information in a simple and clear format, and only when it is actually needed. Moreover, certain actions that now require numerous hand movements - turning knobs, flipping switches and the like - should be combined into a single operation executed by voice command or eye control.

Throughout industry and research establishments, solutions are being sought and experimented with. In fact, it all started with the development of the HUD in the late '60s - an outgrowth of the time-honored stabilised gunsight - which reached its current form thanks to digital computing. During the past 20 years HUD technology has grown by leaps and bounds and has resulted in a tool without which no pilot could operate a modern combat aircraft efficiently. Europe has traditionally been a leader in the design and construction of HUDs, to the extent that GEC Avionics supplies HUDs for all US and European F-16s. Two different types are produced, for the F-16A/B and the F-16C/D, both featuring a pilot display unit with an electronic unit that generates the symbology projected on the combiner. The projected symbology includes airspeed and altitude/attitude data as well as air-to-air missile and gun delivery parameters. The HUD of the F-16C/D versions additionally provides a raster picture of the forward scene, derived from an onboard sensor, on which flight-pertinent information is superimposed. The scene covers an angle of 25° of forward view.

This system has become the technological base for the LANTIRN (Low Altitude Navigation and Targeting Infra-Red for Night) HUD, of which some 450 have been ordered by the USAF from GEC Avionics and Martin Marietta, the US second-source producer. This type of HUD serves equally well for day or night operation. Its unusual feature is the use of a holographic combiner which offers a field of view of 30° in azimuth and 17° in elevation. A HUD of similar performance has been designed by Thomson-CSF Aerospace for use in the Dassault-Breguet Rafale-B. Designated VER-3020, it is also a dual system using a holographic combiner offering very high light transmissibility without impairing the symbology projected onto it. This HUD has viewing angles of 30° in azimuth and 20° in elevation. The HUD produced for the Panavia Tornado by Smiths Industries, Teldix and OMI can still be considered an advanced type, though it will be due for replacement during the "combat value enhancement" of the aircraft in the '90s. Designed particularly for low-level flying under all weather conditions, it features a rich store of symbols and offers a 25° forward field of view.

In the United States, the systems generally used in today's frontline USAF and US Navy fighter aircraft were designed by the aircraft manufacturers, while special subsystems such as combiners and other optical parts were subcontracted. A typical HUD design is that of the McDonnell Douglas F-15, which is a MDD-Kaiser joint effort.

However, the HUD by itself does not decrease the pilot's workload in the cockpit: it merely enables him to keep his head up during critical situations instead of constantly moving his eyes from the instrument panel to the outside world and back again. Duplicating the readings of vital instruments on the combiner has not decreased the number of instruments as such. Furthermore, a forward viewing angle of 35° in azimuth, as offered by the best HUDs, is definitely too limited for air combat, where angles of 120° in azimuth and 50° in elevation - i.e. as far as the head can physically be moved - are essential. This requirement arose because the modern fighter is a multi-role aircraft which must be capable of equal performance in interception, air combat and ground attack. As the traditional tactical fighter became a more potent multi-role fighting machine, two - unfortunately incompatible - trends were noted: with the accumulation of new and old combat mission requirements, additional displays and controls appeared in the cockpit, but at the same time the physical size of the cockpit shrank.

At this point the design engineers came to the rescue by developing a HUD which is integrated into the pilot's helmet, allowing the pilot to receive the HUD information wherever he turns his head. This system is called the Helmet-Mounted Display (HMD) and seems to offer the best solution today to the cockpit problem in general. The technical basis for this method is the widely introduced helmet sight. The position of the helmet's axis is measured exactly by electronic or magnetic sensors, and a computer calculates the direction in which the pilot is looking in relation to the longitudinal and vertical axes of the aircraft. The result can be used to mark targets, guide weapons or navigate under difficult conditions. BAe, SFENA/Crouzet, Honeywell, MDD, Hughes and other manufacturers produce this equipment in quantity. The helmet sight is provided with a simple reticle, used to fix the target.

The HMD goes further and incorporates a combiner on which the complete HUD information is projected in miniature form, either from a helmet-integrated source or via fiber-optic cabling connecting the helmet to a projector placed behind the pilot's head. The combiner may either be the helmet visor itself or a special combiner for one eye only which is slid into position when needed. FLIR images can be projected simultaneously with the HUD symbology to give the pilot full night vision capability. Since the basic technology is readily available, HMDs could be produced in quantity for retrofitting to all aircraft equipped with a HUD driven by a central computer. A typical example is the Agile Eye HMD designed by MDD and Kaiser Electronics. Honeywell produces the IHADSS (Integrated Helmet and Display Sight System) for the AH-64 Apache attack helicopter. Hughes is experimenting with FLIR projection on HMDs, while in Europe Ferranti, Smiths, GEC Avionics, SFENA, Crouzet and others are actively engaged in this field.
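
To illustrate the geometry that underlies both the helmet sight and the HMD - not any particular manufacturer's implementation - the following minimal sketch (in present-day Python, with invented function names and the common body-axis convention assumed) turns the tracker's azimuth and elevation readings into a sight-line vector and rotates it into earth axes using the aircraft's own attitude, which is essentially the calculation needed to mark a target fixed through the reticle:

```python
import math

def los_vector(azimuth_deg, elevation_deg):
    """Helmet sight line as a unit vector in aircraft body axes
    (x forward, y right, z down), from tracker azimuth and elevation."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    return (math.cos(el) * math.cos(az),
            math.cos(el) * math.sin(az),
            -math.sin(el))

def body_to_earth(v, heading_deg, pitch_deg, roll_deg):
    """Rotate a body-axis vector into local north-east-down axes using the
    aircraft's attitude (standard yaw-pitch-roll Euler sequence)."""
    psi, theta, phi = (math.radians(a) for a in (heading_deg, pitch_deg, roll_deg))
    x, y, z = v
    y, z = y * math.cos(phi) - z * math.sin(phi), y * math.sin(phi) + z * math.cos(phi)            # roll
    x, z = x * math.cos(theta) + z * math.sin(theta), -x * math.sin(theta) + z * math.cos(theta)    # pitch
    x, y = x * math.cos(psi) - y * math.sin(psi), x * math.sin(psi) + y * math.cos(psi)             # yaw
    return (x, y, z)

# Pilot looking 30 degrees left and 10 degrees up while the aircraft flies heading 090:
print(body_to_earth(los_vector(-30.0, 10.0), 90.0, 0.0, 0.0))
```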

However, the limited amount of information which can be displayed on a HUD or HMD is no substitute for the far more detailed information available from the cockpit-mounted instruments. This led to the concept of the integrated cockpit, which marks an intermediate step towards the cockpit of the future. In addition to a standard HUD, the integrated cockpit usually consists of three multi-function displays which replace many of the traditional singly-mounted instruments, displays, subsystem indicators, dials and gauges. The displays resemble TV screens on which the desired information can be called up according to need; the screens are full-color cathode ray tubes which provide on request any information and data available to the aircraft's central computer.

That approach represented a major step forward because large information complexes could be integrated into single, comprehensive information packages. For example, the pilot no longer has to perform mental acrobatics to calculate how much time can be spent in combat while keeping enough fuel to return to base. Before the arrival of the integrated cockpit he had to look at the fuel gauges and the navigation instruments, calculate the distance to alternate bases, consider as a precaution the possible use of the fuel-consuming afterburner and include the wind parameters in his calculations. Now, by pressing the right buttons grouped around one of the monitors, he is presented with the proper information in a short and precise message. The same applies to the status of his engines, the threat situation, weapon availability or anything else concerning his aircraft and the mission. Still, even in this integrated cockpit the pilot must look away from the outside world, at least for a short while, to scan the cockpit panel and push the proper buttons, and other actions regarding the aircraft's mechanical status still have to be taken: switches flipped, buttons pushed and ready-lights observed.
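
As an illustration of the kind of arithmetic the integrated cockpit takes off the pilot's hands, the sketch below computes the time that can be spent in combat while still leaving enough fuel to cruise home against a headwind, with a reserve. The figures, units and function name are invented for the example and do not describe any actual fuel-management computer:

```python
def combat_time_available(fuel_kg, reserve_kg, combat_flow_kg_min,
                          cruise_flow_kg_min, distance_to_base_nm,
                          cruise_tas_kt, headwind_kt=0.0):
    """Minutes that can be spent in combat while keeping enough fuel
    to cruise back to base against the given headwind, plus a reserve."""
    groundspeed_kt = cruise_tas_kt - headwind_kt
    if groundspeed_kt <= 0:
        raise ValueError("cannot make headway against this wind")
    return_time_min = 60.0 * distance_to_base_nm / groundspeed_kt
    return_fuel_kg = return_time_min * cruise_flow_kg_min
    usable_kg = fuel_kg - reserve_kg - return_fuel_kg
    return max(0.0, usable_kg / combat_flow_kg_min)

# Example: 2400 kg on board, 400 kg reserve, 90 kg/min in afterburning combat,
# 25 kg/min in cruise, 180 nm to base at 480 kt true airspeed, 30 kt headwind.
print(round(combat_time_available(2400, 400, 90, 25, 180, 480, 30), 1), "minutes")
```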

The following example may illustrate this. When the aircraft goes into combat it has to be changed from its cruise and navigation condition to fighting readiness. This means that external fuel tanks have to be jettisoned, the EW system has to be turned on, the radar has to be switched from navigation to combat mode, the weapons and bomb fuzes have to be activated, the IR cooling system has to be turned on, and so forth. All this has to be performed by the pilot blindly reaching for the proper switches, because he has to keep his head up looking for missiles and guns being fired at him or hostile fighters maneuvering into attack position. It is quite possible that these actions have to be taken while maneuvering at high g-loads to evade an enemy threat, making every precise movement of the arms impossible or at least difficult. This well-known problem has been calling for a remedy for some time, and it seems that in the near future voice control may bring the solution. To make his aircraft combat-ready, the pilot would simply have to speak the word "Engagement" and the central computer would at once take over and perform all the above-mentioned actions. The computer can also confirm orally that all activities have been successfully implemented, or warn the pilot that this or that system has failed to function as ordered.
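
In software terms, such a facility amounts to little more than expanding one recognised word into an ordered checklist and reporting each result back to the pilot. The notional sketch below shows the idea; the subsystem names and the callback interface are invented for illustration only:

```python
# One spoken command expands into the combat-readiness actions listed above.
ENGAGEMENT_CHECKLIST = [
    "jettison external fuel tanks",
    "switch on EW system",
    "switch radar to combat mode",
    "activate weapons and bomb fuzes",
    "switch on IR cooling system",
]

def execute_command(word, perform, report):
    """Run every step tied to the recognised command word and report each outcome."""
    if word.lower() != "engagement":
        report(f"unknown command: {word}")
        return
    for step in ENGAGEMENT_CHECKLIST:
        ok = perform(step)                     # True once the subsystem confirms
        report(f"{step}: {'done' if ok else 'FAILED'}")

# Example run against a dummy aircraft interface:
execute_command("Engagement", perform=lambda step: True, report=print)
```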

Voice recognition is a sub-division of artificial intelligence and is still in the realm of experimentation. Its application to the cockpit has proved to be an elusive task. Virtually all computer designers, software firms and artificial intelligence experts are working on the problem, since a computer capable of conducting a dialogue with its user simplifies operation enormously. At the aerospace level all major manufacturers, such as BAe, Boeing, Dassault-Breguet, MBB, MDD, Northrop, Rockwell, Westland et al., are engaged in voice recognition research with the aim of making their planes easier to fly.

Voice recognition is already being used in offices and laboratories. Physicians dictate their diagnoses to computers, which transform the spoken word into written text with only a small failure rate, and it will not be long before the same can be done during normal office work. The digitally generated text, however, must be read and corrected. And here lies the real problem for the application of voice recognition to the cockpit: a single command which is misunderstood and acted upon by the central computer may lead to disaster, since in flight there is simply no allowance for false commands.

Unfortunately the electronic pattern of a word varies according to the person pronouncing it. To that must be added the fact that the same person may produce different patterns if the vocal cords are stressed by g-forces or if he has a common cold and a runny nose. For the first problem the solution is to let each pilot prepare his own personalized digital recording of the essential commands in his own voice pattern; the recording can then be loaded into the central computer prior to the mission. For the second problem, that of voice distortion, no remedy has yet been found. Software engineers believe that the problem could be solved by giving the computer the ability to speak: it could thus conduct a dialogue with the pilot, asking for confirmation before an order was executed. Allied Signal Aerospace's Bendix Division and Crouzet have joined forces to create such an interactive voice recognition system. Crouzet has already built prototypes which have been tested in an AMD-BA Mirage 2000 simulator and in flight in a Mirage III. A fully matured system is due to become standard equipment in the AMD-BA Rafale.
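
A toy sketch of the speaker-dependent scheme just described: each command word is enrolled as a reference pattern in the pilot's own voice, an utterance is accepted only if it lies close enough to one of those templates, and the computer asks for confirmation before acting. The two-dimensional "voice features", the distance threshold and the function names are stand-ins for illustration, not a real recognition algorithm:

```python
import math

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def recognise(utterance, templates, threshold=1.0):
    """Return the enrolled command whose template lies nearest the utterance,
    or None if nothing is close enough to be trusted."""
    word, best = None, threshold
    for command, template in templates.items():
        d = distance(utterance, template)
        if d < best:
            word, best = command, d
    return word

def act_on(utterance, templates, confirm, execute):
    command = recognise(utterance, templates)
    if command and confirm(f"Confirm {command}?"):
        execute(command)

# Example with made-up per-pilot templates and an always-yes confirmation:
pilot_templates = {"engagement": (0.9, 0.2), "jettison": (0.1, 0.8)}
act_on((0.85, 0.25), pilot_templates, confirm=lambda q: True, execute=print)
```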

Another option for communicating with the cockpit monitors and activating or deactivating systems is eye contact, i.e. looking at a switch and pressing a button on the control stick. This proposed method requires the pilot to wear a special helmet sight: when the sight reticle is pointed at a particular switch, the central computer is provided with its exact location, and pressing the "activate" button on the stick sends the corresponding message to the computer. As a safeguard, the computer could project a request for confirmation in analog form on the helmet's combiner. Another very attractive possibility is offered by the use of an HMD: very important commands could be projected as a menu on the combiner, and the movement of the eyes would be scanned electronically to determine which command is being looked at.
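
A notional sketch of the reticle-designated switching described above: when the pilot presses the "activate" button, the computer finds which panel switch lies closest to the helmet sight line and, if it falls within an angular tolerance, offers it for confirmation. The switch positions, tolerance and helper functions are illustrative assumptions only:

```python
import math

def angle_between(u, v):
    """Angle in degrees between two 3-D vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (nu * nv)))))

def designated_switch(sight_line, switches, eye_point=(0.0, 0.0, 0.0), tolerance_deg=3.0):
    """Return the switch whose direction from the eye point lies nearest the
    helmet sight line, provided it falls within the angular tolerance."""
    best_name, best_angle = None, tolerance_deg
    for name, position in switches.items():
        direction = tuple(p - e for p, e in zip(position, eye_point))
        a = angle_between(sight_line, direction)
        if a < best_angle:
            best_name, best_angle = name, a
    return best_name

# Example with two made-up switch positions (metres from the design eye point):
panel = {"EW master": (0.6, -0.2, 0.1), "fuel dump": (0.6, 0.25, 0.05)}
print(designated_switch((1.0, -0.33, 0.17), panel))
```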

McDonnell Douglas, in cooperation with the USAF Systems Command, is currently experimenting with a concept called Big Picture. The hardware consists in essence of a large high-resolution monitor in the cockpit, not unlike a standard 24-inch home TV screen. It shows in digitized, colored graphic format the complete environment around the aircraft plus all flight-pertinent parameters needed for its operation. The pilot would thus fly his fighter much as he would play an arcade game. One of the many reasons given by the supporters of this idea, which is also being pursued by Boeing and Bell for the LHX advanced combat helicopter, is that the laser or other directed-energy weapons expected at the latest by the first decade of the coming century would prohibit the pilot from looking at the combat scene with unprotected eyes.

The Big Picture concept is based on intensive studies which have resulted in the definition of three task areas which do not directly relate to the flying skills of the pilot: Secretarial, Perception and Weapon Management. The first involves communications, frequency changes, handling of air traffic control, etc. Under Perception the designers group the mental transformations and information integration currently required of the pilot as he scans his gauges, dials and other instruments; the data thus gathered are numerous and often duplicate the same information collected from different parts of the aircraft, and the job demands a concentration which may distract the pilot from other, possibly more pressing, tasks. Weapon Management involves combat maneuvering, armament delivery and electronic warfare.

The actual distribution of the three information complexes on the monitor is not known, but it can well be imagined that the pictorially displayed environment surrounding the aircraft takes up most of the space. The aircraft itself is shown in the center. The threats and targets seen by its sensors are shown as distinctly colored symbols. The locations of friendly forces and aircraft are provided to the system by JTIDS or other data-link sources, and the locations of confirmed or suspected hostile anti-aircraft batteries, together with their ranges of action, are marked. Electronic warfare measures are initiated automatically. The computer can even suggest paths to be flown and can indicate safe altitudes. The scenery itself is naturally not static and is adjusted by the central computer according to the course flown by the aircraft. Weapon delivery takes place with the help of an HMD, by looking at the targets in the real world, but it would be just as feasible to locate and fire the weapons by pointing at them with the helmet's reticle on the Big Picture monitor.

The other data required for the "secretarial" and "perception" tasks may be called up as "windows" on the monitor, and actions might be initiated by pointing at commands with a mouse-type device. Two of the major problems standing in the way of this concept are the size of the CRT and the production of reliable and maintainable software. A 24-inch CRT takes up too much space in depth - roughly the same volume as a TV set. The solution will come only with the development of truly flat screens, possibly of the colour LCD (Liquid Crystal Display) type. As regards the software, it is recognized that artificial intelligence must be employed in large measure, which is why most of the larger software producers and the aircraft manufacturers themselves are heavily engaged in this field. But the Big Picture concept has not remained a theoretical engineering study: an experimental model has been under test at McDonnell Douglas since late 1987 and has already been demonstrated to the USAF. One astonishing result was that the pilot needed only one second for lock-on and launch of a Sidewinder missile instead of the commonly required five to six seconds. It is obvious that a system like Big Picture cannot be retrofitted to existing aircraft; rather, it is seen as a standard fit for combat aircraft likely to become operational in the first decade of the next century.

Boeing too is deeply involved in research which might lead to a new cockpit configuration. Under a USAF contract covering the development of CAT (Cockpit Automation Technology), ways are being studied to relieve the pilot of all tasks which do not relate primarily to the mission and can be performed automatically. Boeing also opted for graphically elaborate, pictorial displays from which the pilot can make his target selection visually and even ask by voice command for a close-up. A further voice command would order the computer to select the weapon to be used and to define the launch coordinates needed to achieve the best hit probability. Naturally, this also calls on artificial intelligence. Lockheed is concentrating on artificial intelligence in an attempt to reduce the workload of the pilot in the YF-22A ATF prototype. Jointly with Boeing and General Dynamics, Lockheed is responsible for the ATF prototype's cockpit controls and displays and is working on a program called Electronic Co-Pilot (ECOP). One project for the application of ECOP technology is to group the CRTs in such a way as to provide the illusion of a panoramic view. Northrop, for its part, as designer and producer of the other ATF prototype, the YF-23A, has constructed a reconfigurable cockpit simulator in which various concepts can be tested for feasibility.

Many experts show little confidence that the concepts for advanced cockpits are fully compatible with the sensory and perceptual capabilities of the pilot. The machines can simply do everything better: they can see farther under all weather conditions, by night or day, and they can sense threats and react to them infinitely faster than Man. With perfected and reliable artificial intelligence, the machine alone can also fight much better, because it does not have to heed the built-in performance limits of manned aircraft, which have been set by the designers to protect the very fragile human occupant. The machine would not need to carry life support systems such as oxygen or an ejection seat, and could operate at g-loads which would crush a pilot. The weight saved could be translated into longer range or a higher weapon load. Above all, a robot never tires, does not know fear or hate and is expendable - attributes which a human pilot does not possess.

In conclusion, it is not all that fanciful to imagine that the large majority of tactical combat aircraft of the next century will be flown without either pilot or cockpit. Even so, there will always be certain missions for which a human in the cockpit is essential. The new cockpit technology now in the development phase promises to ease the pilot's workload and permit optimal operation of the aircraft under all conditions.

PHOTO: McDonnell Douglas's Big Picture concept, under test in simulators since 1987, can suggest flight paths and altitudes.

PHOTO: In an effort to ease the pilot's tasks, BAe is experimenting with a simplified HUD symbology which shows speed, target range and missile firing envelope.

PHOTO: McDonnell Douglas is also working on a programme called "Pilot's Associate" aimed at supporting the single fighter pilot with artificial intelligence.

PHOTO: Ferranti is engaged in integrated cockpit design. The lower display is used for navigation. A similar configuration might be used in the EFA.

PHOTO: The Saab JAS39 Gripen features an integrated cockpit consisting of three CRT displays and a diffraction-type HUD developed and produced by Hughes Aircraft.

PHOTO: Thomson-CSF has developed an integrated helmet sight for use in fighter aircraft. When not needed, the combiner unit can be flipped up over the helmet.

PHOTO: Honeywell produces this integrated helmet and display system for the AH-64 Apache helicopter. It is a day and night vision HMD with targeting options.

PHOTO: GEC Avionics and Martin Marietta produce the holographic LANTIRN night navigation and attack system for the USAF.

PHOTO: This view of a Goodyear Aerospace F-15A cockpit simulator gives an idea of the complexity and the magnitude of a fighter pilot's workload.

PHOTO: Ferranti provides moving map displays for integrated cockpits. Shown here is a digital tactical display indicating the aircraft's position and landmarks.
COPYRIGHT 1989 Armada International

Article Details
Author: Stefan Geisenheyner
Publication: Armada International
Date: June 1, 1989