Possible Applications of Neurocomputing in Defense
From Artificial Intelligence to Acquired Wisdom
Despite their phenomenal number-crunching properties, even supercomputers capable of solving complex mathematical equations in seconds cannot stand comparison with the universal computing capacity of the human brain when it comes to performing numerous fundamental functions simultaneously. Conventional computers, for example, cannot rapidly recognize patterns, work from unprepared sensor input, or anticipate the consequences of situations, even when they make full use of the "knowledge" stored in their databases. This situation, however, might change in the near future with the advent of neural networks (also called neurocomputers or neural systems). Although nobody has yet been able to explain conclusively how the brain functions, a number of scientists and engineers are currently looking at ways of designing electronic structures that mimic naturally intelligent networks such as the human brain. The successful development and application of neural networks to military and civil uses are bound to change the fabric of society on every conceivable level just as profoundly as did nuclear and genetic technologies. By extrapolating present experience with neural networks, scientists confidently predict that by the end of the first decade of the coming century, trained - as opposed to programmed - robots will be able to perform multiple tasks, and that 30 years later androids, i.e. humanoid structures, will assist man in labor-intensive tasks. They even anticipate that by the end of the next century neural networks may well be endowed with creative intelligence. A different school of thought, on the other hand, downplays those predictions by claiming that the rather disappointing performance of artificial intelligence (see Armada International April/May 1989), initially much touted, gives every reason to believe that neural networks are heading towards a similar fate, because unexpected and as yet unpredictable obstacles are bound to block progress at crucial points.
By their very nature neural networks neatly sidestep the learning/programming problems of artificial intelligence because they cannot be programmed: a neural network must be trained, and it can also learn by example (i.e. it can create its own database). Such capabilities have been well known for decades, so it remains somewhat of a mystery why scientists have invested so much effort in the artificial intelligence approach and considered the neural network concept an also-ran. According to Dr. Castelaz of Hughes - one of the leading US authorities on neural networks - there were two main reasons: the lack of the necessary technology in the 1950s and 1960s for implementing large neural networks, and the lack of the digital computer simulations needed to overcome the problems of analog implementation, since digital systems themselves were in their infancy. Besides, more had to be learned about how the human brain operates before work could continue.
Initial attempts to develop neural networks were based on research in the late 1940s, when scientists in the USA were able to devise plausible models of the function and structure of the human nerve cell. Engineers endeavoured to give these models mechanical, chemical and/or electrical form, and by the 1960s considerable progress had been made.
Electro-chemically operated neural systems proved able to recognize graphic patterns. In 1968 a machine called Perceptron even showed a learning capability. But then analog systems began to be replaced by vastly superior digital processors. This created a change in attitude towards computing, resulting eventually in the oft-cited computer revolution. Since artificial intelligence depends on digital processing while neural networks work best in the analog environment, pursuit of the latter avenue of research appeared somewhat antiquated. Conventional artificial intelligence had thus won the day, dominating research in the United States and elsewhere. In Japan, the Fifth Computer Generation project was launched, while the EEC-initiated Alvey and ESPRIT programs got slowly underway in Europe. Work on neural networks was kept alive only as an esoteric laboratory curiosity at a number of universities.
Participation of Industry
In the early 1980s, when the limitations of digital-based artificial intelligence became apparent, neural network research began to gather momentum again. Today, numerous governmental agencies, universities and industries engaged in computer and software R&D are once more investigating the possible applications of neural systems. By the middle of the past decade efforts were at last supported by adequate funding from defense and science budgets, and industry began to finance extensive in-house projects. It appears that Japan was the first to recognize the enormous potential of neural networks, and some European sources claim that this country is already working on a second-generation neurocomputer.
Two major research projects on neural networks are currently supported in Europe by the EEC to the tune of 5 million ECUs (European Currency Units) each. Both are facets of the ESPRIT program. The first project, named Annie, is an exploratory project to investigate potential neural network applications.
Those involved are a number of universities and companies such as British Aerospace, Siemens, CETIM (France) and Alpha (Greece). The second European project was christened Pygmalion and centers its efforts on software tools and applications. The project leader is Thomson-CSF, partnered by SEL (Germany), Philips (Netherlands), CSELT (Italy) and universities in Great Britain, France, Greece, Portugal and Spain. This program is aimed at creating a network description language and a software environment leading to a European standard.
European efforts, unfortunately, appear minuscule by comparison with American activities in neural network research, which are primarily fed by military funds derived from SDI and similar advanced programs and projects. These are sponsored by DARPA (Defense Advanced Research Projects Agency) with substantial funding (unofficially estimated at anything between $500 million and $2 billion over a period of 5 years).
That an advanced stage has already been reached is evidenced by the fact that TRW is offering neural network products on the open market. These systems are apparently intended for experimental use by universities and industry, with the aim of broadening the nation's experience with neural networks. General Dynamics also conducts intensive neural network research, presumably with the aim of perfecting missile guidance systems. Long-term projects for aerospace applications are underway at Hughes, and even Du Pont runs a project aimed at tailoring neural networks to the needs of the chemical industry. IBM and DEC, traditional software and hardware producers, support neural network programs for future military and commercial purposes. NASA is of course deeply involved too, since neural networks lend themselves as coprocessors to many space flight applications. In the space agency's NNETS (Neural Network Environment Transputer System), 40 transputers have been linked to attain an extremely high operating speed.
Neural Systems Technology
Neural systems can be defined as non-programmed, adaptive information-processing networks which develop their own algorithms in direct response to their environment. The two main features of neural networks are hidden in this definition. The first is that neural networks cannot be programmed, which implies that they have to be trained. The second is that a trained neural network can adapt itself to external influences, i.e. it can learn by experience. Neurocomputing is thus a radical departure from hitherto employed methods, and neural networks have the potential to become an essential part of tomorrow's self-programming computer. The information-processing architecture of a neural network is based on a massively parallel structure containing numerous simple processing elements - called neurons - interconnected to obtain collective computational capabilities. These interconnections can be fashioned in several ways called network paradigms. About 20 different types of paradigm (a term which essentially means "model of any form") are currently used, but intensive research is being conducted worldwide to develop new and more efficient types.
These silicon neurons are patterned in structure and behaviour after the human brain cell (at least as biologists believe it is constructed). Each neuron can have any number of inputs but has only one output; this, however, branches out to connect as an input to numerous other neurons, i.e. processors. In a typical network, only a few processors are connected directly with the outside environment from which they receive their inputs. Each neuron has essentially the same function and structure. During processing it sums up, or figuratively speaking "weighs", the inputs received, and if the sum exceeds a given level it "fires" a signal to the connected processors. The learning process of the multiple neurons is based on the "weights" of the inputs they handle. The firing threshold in each element is progressively modified by both the learning process and the data entering from outside.
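The weigh-and-fire behaviour described above can be sketched in a few lines of Python. The function name, weights and threshold below are illustrative assumptions for the sketch, not parameters of any particular neural network product.

```python
def neuron_output(inputs, weights, threshold):
    """Return 1 if the weighted sum of the inputs exceeds the
    firing threshold, else 0 (the neuron stays silent)."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total > threshold else 0

# Three inputs, each with its own weight; the neuron fires because
# 1.0*0.6 + 0.0*0.3 + 1.0*0.5 = 1.1 exceeds the threshold of 1.0.
print(neuron_output([1.0, 0.0, 1.0], [0.6, 0.3, 0.5], threshold=1.0))  # prints 1
```

In a full network the output would in turn feed, as one input among many, into the connected neurons.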
This process is implemented by one of several currently available learning laws. Scientists have developed models of how the human being absorbs knowledge, i.e. learns. One of the simplest models holds that each time an input results in the firing of a neuron, the weight of this particular type of input is automatically increased. If a particular input does not significantly change the balance within the neuron, that input type's "weight" is decreased. This means that spurious or weak inputs are discarded, while only those that frequently add to the weighted sum are enhanced. Broadly speaking, information in a neural network is processed by a complex interaction of neuron activity and the continuous adjustment of weights that causes it. All information is processed in a perfectly parallel, well-nigh amorphous manner which does not require the decomposition of the input into single data items, as needed for a conventional computer.
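The simple learning model just outlined, strengthening the weights of inputs that contribute to a firing and weakening those that fail to tip the balance, can be sketched as follows; the step size and starting weights are invented for illustration.

```python
def update_weights(inputs, weights, threshold, step=0.05):
    """One pass of the simple learning law described in the text:
    active inputs that help the neuron fire are strengthened,
    active inputs that fail to tip the balance are weakened."""
    total = sum(x * w for x, w in zip(inputs, weights))
    fired = total > threshold
    new_weights = []
    for x, w in zip(inputs, weights):
        if x > 0:  # only inputs that were actually present are adjusted
            w = w + step if fired else max(w - step, 0.0)
        new_weights.append(w)
    return new_weights

# The active inputs (first and third) caused a firing (1.1 > 1.0),
# so their weights grow while the silent second input is untouched.
print(update_weights([1.0, 0.0, 1.0], [0.6, 0.3, 0.5], threshold=1.0))
```

Repeated over many presentations, this drives spurious inputs towards zero weight while frequently useful inputs are reinforced.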
Training the System
The self-organizing nature of neural networks is possibly their most interesting feature. As mentioned above, they are not programmed but have to be trained. The process is in fact simple: the operator feeds samples of typical inputs into the network, e.g. a distorted radar or sonar signal, and in parallel presents the system with samples of the expected output, i.e. clear and readable radar or sonar signals. After repeated runs, the neural network develops on its own the algorithms needed to transform the marginal input quality into a usable output. Depending on the complexity of the task, a neural network requires a few hundred to a thousand runs of this training cycle, which is naturally performed in an automatic repeat mode and can be completed in seconds. The network can of course be re-trained to handle other tasks. It can readily be seen that this self-programming feature radically alters the way conventional computers are traditionally operated.
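A toy version of this training cycle can be written with a perceptron-style error-correction rule standing in for whatever learning law a real system employs; the sample patterns, learning rate and run count below are assumptions for the sketch, not the parameters of any fielded system.

```python
def train(samples, weights, threshold, rate=0.1, runs=200):
    """Repeatedly present (input, expected_output) pairs and nudge the
    weights towards reproducing the expected outputs: the automatic
    repeat mode described in the text."""
    for _ in range(runs):
        for inputs, expected in samples:
            total = sum(x * w for x, w in zip(inputs, weights))
            output = 1 if total > threshold else 0
            error = expected - output  # -1, 0 or +1
            weights = [w + rate * error * x for w, x in zip(weights, inputs)]
    return weights

# Train a single neuron to reproduce a simple two-input pattern (logical AND).
samples = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
weights = train(samples, [0.0, 0.0], threshold=0.5)
for inputs, expected in samples:
    total = sum(x * w for x, w in zip(inputs, weights))
    assert (1 if total > 0.5 else 0) == expected
```

A real network would of course train many interconnected neurons on signal samples rather than one neuron on a truth table, but the cycle of presenting paired inputs and expected outputs is the same.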
A neural network is not a computer by itself, but rather a co-processor added to any suitable digital computer system. With proper interfacing it will be able to cooperate with any current operating environment such as UNIX, DOS and the like. This is the crucial point where neural networks have a distinct advantage over artificial intelligence. An expert who enters his knowledge into an artificial intelligence database first has to master its complicated software language and associated operating techniques. This is not the case with neural networks, provided they work with a standard computer, which can be a cheap commercial PC/AT (the pictures accompanying this article were generated on such a machine). It was found that the best results were achieved by familiarizing experts with the features and limitations of neural networks and by letting them enter their specific knowledge with the usual language and keyboard. Neural networks are in fact ideally suited to working with very large databases or data input flows because they very quickly discover repetitions in any data flow. If a comprehensive knowledge database already exists, neural networks can conceivably be commanded to extract all the pertinent data and so generate an artificial intelligence expert system without any special programming effort.
Another most exciting application is the ability of neural networks to generate highly complex algorithms which are simply too complicated to be created by human beings in an economically acceptable timeframe. This applies to the defense field, particularly to algorithms required for target acquisition and tracking, sonar evaluation and pattern recognition. The drawback is that the solution offered by the neurocomputer cannot be as exact as the human-written version because, after all, it represents merely an approximation, a near-optimum solution. Whether this approximation is technically acceptable depends on the accuracy or performance required by the user device.
It was discovered during research and application experiments that the neurons could be arranged in different architectures to yield either excellent self-teaching or highly developed optimization properties. For the latter type of architecture it is obvious that all the available parameters will have to be very specific, but it offers considerable advantages if the neural network architecture has been custom-designed. For example, an architecture can be created as part of a radar signal data processor which eliminates clutter by intelligent optimization instead of the customary threshold filtering. A typical application would be the search for features of interest in reconnaissance satellite data.
Merging Neural Nets and Artificial Intelligence
In spite of their formidable inherent properties, neural networks are not the whole answer. There are distinct limitations to what a neural network can do in knowledge processing. The basic problem with neural networks is that they do not output clear-cut information but a complex pattern of signals resulting from the output neurons firing the message. This is where conventional artificial intelligence and neural networks can merge their respective strengths. An artificial intelligence expert system is far too slow to be of any practical use if it has to work with a very large database. This can be compensated for by the addition of a co-processing neural network. It could be used to scan even the largest databases of any conceivable artificial intelligence expert system within microseconds, select via the optimization process the most likely solutions to the problem, and feed them back into the artificial intelligence system for further processing. This mimics almost to perfection the problem-solving process of the human brain.
Since the neural network concept also offers a number of interesting advantages at the hardware level over conventional computing, there is intensive research aimed at producing optimal chip designs. As seen earlier, the parallel architecture of neural networks disperses information among a large number of neurons. This renders the system highly resistant to damage, since the electrical or mechanical failure of some neurons does not impair the operation of the processor. It may slow it down somewhat, but only to an insignificant extent, as the operating speed is vastly higher than that of a conventional computer, where partial loss of memory or failure of the central processing unit leads to a collapse of the system. Another advantage of neural networks is that they lend themselves well to VLSI (Very Large Scale Integration) chip technology. While even slightly defective VLSIs built for conventional computers have to be discarded, it is relatively unimportant if some neurons in a VLSI chip are inoperative. This tolerance lowers the industrial rejection threshold, and consequently the price of the chip.
Most present operational neural systems employ conventional hardware, which is readily available at acceptable prices, but this type of hardware does not yet really permit the creation of true neural networks. Currently, familiar digital processors are conventionally programmed to simulate neural networks because this approach is far cheaper than designing and fabricating neural network-dedicated VLSI chips. But it is only a matter of time... The demand for neural network processors has opened up a completely new research and market situation for the semiconductor industry, which is rising to meet the challenge. It can be assumed that the traditional American semiconductor manufacturers and technical universities are working on the design of embedded neural networks. A typical example is AT&T, which has already produced prototypes of a hybrid silicon-based analog/digital network. In Japan, Fujitsu, Nippon Telephone & Telegraph Co. and others are working on neural chips. In the UK, University College London is working on neural network chip design procedures and hopes to produce a compiler which will be able to generate neural chips for any given architecture. Siemens is engaged in neural network chip design, as are Thomson-CSF and the Intel group.
Nevertheless, most currently available chips are still primarily digital or hybrid analog/digital devices. The ideal solution for neural systems can only be offered by analog techniques which, according to one authoritative source, can "reach 100 000 times the efficiency of digital computing" if employed in neural networks. But even then, such cost-effective neural networks may never reach the switching speed and efficiency of the now proposed optical networks, which transmit data by laser, store them in holographic form and process them by optical switches. But this seems to be only the dawn of yet another revolution. A team at the University of Stuttgart, for example, is working on molecular electronics, where the customary inorganic silicon is replaced by organic, in part superconducting, substances. If successful, this line of research may result in chips which are easy and cheap to produce and vastly faster than conventional silicon devices. From this point onwards the step towards an organic/chemical system (a bionic structure which not only mimics but also resembles the human brain in both its operating mode and architecture) would be conceivable. Concern is sure to be voiced sooner or later by fundamentalist philosophers, but just as tools are extensions and force multipliers of the human arm and hand, neurocomputers are bound eventually to become extensions of the human mind. This aspect should be borne in mind while looking at the attached table of potential applications of neurocomputing for military and commercial purposes.
As explained above, currently known neural network architectures can be divided into two basic groups: learners (i.e. adaptive) and optimizers. Their properties dictate their applications. Among the most obvious uses of self-organizing adaptive neural networks are real-time pattern-recognition tasks. The detection of target signatures buried in clutter and noise is possible with conventional computer systems, but due to the long search cycles it cannot be done in real time. The assistance of a neurocomputer will speed up the process no end and thus provide the few extra seconds needed for defense against an incoming missile, for example. The same process can enhance the effectiveness of ESM and ECM far beyond presently conceived systems. Applied to the sensor-image evaluation suites of robotic vehicles, for which they are now clearly a prerequisite, neurocomputers can offer a cross-country mobility which far outperforms current models, since a vehicle can quickly and automatically be taught to select the best path across any terrain. As an experiment, JPL (Jet Propulsion Laboratory) in the USA has designed an embedded adaptive neural network chip to control a robotic arm which may find an application in automated space vehicles. Neural networks can be employed for target identification and tracking, weapon allocation, missile guidance, intelligence-gathering and data-merging functions. They can greatly improve the operation of man-machine interfaces or handle resource allocation.
An excellent example of the capability of a learning neural network structure is an experimental simulation system designed by Hughes and currently under test. It has simulated the missile defense of a high-value target against incoming threats. The neural network initially "observed and registered" the actions of a human operator and learnt the necessary missile-launching procedures and correct launch times of an air defense battery. Command of the battery was turned over to the neurocomputer after it had observed 40 exercises in changing scenarios. It achieved a consistent intercept success rate of 85 to 90% against multiple targets in rapidly changing scenarios. The success rate against single targets averaged more than 95%. It should be noted, however, that the above results were not achieved with dedicated chips, and that considerable improvements may be expected when a specifically designed VLSI chip, now under development at Hughes, becomes available.
Also at Hughes, under the direction of Dr. Castelaz, an experiment in optimization processing achieved really striking results in multi-sensor passive tracking initiation in an air defense scenario. The problem involved the rapid location of a small number of true targets among a very large number of false ones. The difficulty of solving this problem increases exponentially with the number of targets. The impact of this fact on conventional processing is significant. For example, in a typical scenario involving 15 true targets hidden among numerous ghosts, a digital VAX 11/780 computer can solve the problem within 10 seconds. But if the true targets number more than 40, the computing time required is estimated in years. In a simulated environment, and employing a neural network, Hughes scientists arrived at an average processing time of 15 microseconds for the discovery of 36 true targets among a total of thousands of ghosts. This represents processing six orders of magnitude faster than with conventional computer systems. Such experiments and their results are still preliminary, but they are already indicative of neural networks' capabilities.
The progress achieved so far is doubtless considerable, but numerous problems remain to be solved before neural networks can advance from the experimental to the operational stage in defense or civil systems. It was discovered, for example, that with increasing complexity neural networks tend to "forget" what they have learnt, a phenomenon which has yet to be fully explained. At least another decade will pass before neurocomputing processors reach the maturity required to function as clever assistants in standard computers and as essential subsystems in the robots of the future.
PHOTO : This amazing computer image, produced by Hughes, clearly shows the output of a neural
PHOTO : network hunting for true targets among thousands of ghost signals, as might be produced by
PHOTO : decoys during an intercontinental ballistic missile attack.
PHOTO : Submarine navigation and weapon control is a typical area in which neurocomputers and
PHOTO : artificial intelligence could be put to work together.
PHOTO : Neural network processors like this Hughes prototype are relatively small and can be
PHOTO : fitted easily to a standard extension board of a conventional computer.
PHOTO : Neural networks can be used as "terrain classifiers" for civil and military purposes. In
PHOTO : the latter case, they could be used in missiles for TERCOM navigation.