
The Influence of Errors in Visualization Systems on the Level of Safety Threat in Air Traffic.

1. Introduction

Air traffic in controlled airspace is managed by air traffic controllers. The airspace is divided into smaller volumes called sectors, each of which has a radar controller responsible for the safety of aircraft. This task is accomplished by issuing clearances for flight with certain parameters (like altitude, airspeed, and heading). These are selected in such a way that the flight path does not conflict with the paths of other aircraft or with parts of the airspace where flight is not permitted.

One of the most important determinants of the ability to issue appropriate clearances is precise information about the traffic situation. There are multiple sources of flight data, like surveillance radars, the automatic dependent surveillance-broadcast (ADS-B) system, flight plans, and others. Given the current traffic volume in European airspace, it is impossible to process this information efficiently enough to produce a traffic situation picture solely in the controller's mind [1]. This is why the controllers are supported by different air traffic control (ATC) systems of various levels of complexity. A subsystem of key importance is the traffic situation visualization system (TSVS), which is responsible for a schematic representation of aircraft positions in relation to each other, different airspace structures, and ground objects.

ATC systems are constructed with awareness of their role in air traffic safety. Therefore, various protection mechanisms are employed: protection systems, hardware and functional redundancy, and advanced decision-support algorithms. Nonetheless, software bugs and hardware failures still occur from time to time. They may manifest differently, but one of the most severe effects is errors in the operation of visualization systems.

An error in the visualization system causes a partial loss of the controller's situational awareness [2]. Depending on the traffic volume, the complexity of the situation, and the error type, various risks to air traffic safety may arise. In this paper, different types of visualization system errors are analyzed, their impact on air traffic safety is assessed quantitatively, and, finally, some case studies are conducted, vulnerability is examined, and opportunities to reduce the risk are sought.

1.1. Literature Review. There are many studies examining the various types of factors affecting air traffic safety, as well as evaluating it. Wong et al. [3] presented an analysis of meteorological factors' influence on the probability of an air accident. Remawi et al. [4] analyzed safety management systems (SMS) and their impact on the possibility of dangerous behavior in aviation. A general model of air safety assessment was proposed by Shyur [5]. In turn, Leveson [6] suggested a systemic approach to safety analysis. A literature review on the relation between the complexity of the task and the operator's reliability can be found in [7]. Many authors suggest a quantitative approach to the analysis and assessment of safety and security of both the whole air transport system and its elements [8-17]. Netjasov and Janic [18] presented an interesting overview of risk and safety analysis methods in civil aviation.

In typical ATC systems, information about three-dimensional (3D) scenery is displayed with a two-dimensional representation. Mertz et al. [19] proposed extending the available controller-ATC system communication means. In modern implementations, many of the proposed solutions have already been applied. Bagassi et al. [20] presented an innovative concept based on a four-dimensional (4D = 3D space + time) visualization display. This issue is particularly important given the need for spatiotemporal analyses by the controller [21]. Using several different advanced controller support systems forces the application of special solutions in visualization systems. Callantine et al. [22] studied terminal-area controller-workstation interface variations for interoperability between three new capabilities being introduced by the FAA: Terminal Sequencing and Spacing (TSAS), Automated Terminal Proximity Alert (ATPA), and Wake Turbulence Separation Recategorization (RECAT). A new working environment containing special information was proposed by Rohacs et al. [23].

Many papers have been devoted to analyzing controllers' information perception from ATC systems. Specific requirements for visualization systems and controllers' work are imposed by the recently introduced virtual tower concept, where one air traffic controller serves more than one aerodrome [24]. ATC systems' functional problems from the perspective of the human air traffic controller were analyzed in [25]. Ahlstrom [26] conducted an analysis of the effects of improper construction of visualization systems, especially the display of redundant information, on the probability of causing a threat to air traffic safety. A similar analysis, but one made using different methods, was presented by Giraudet et al. [27]. Kesseler and Knapen [28] highlighted the need to consider the interactions between the controllers, ATC systems, and functions offered by individual systems and proposed the use of a new human-centered approach which contrasts with the traditional technology-centered approach that is mainly driven by the capabilities of the technology employed.

Many papers suggest using fuzzy logic methods and tools in the area of air traffic management. One of the most interesting papers [29] describes an expert system that was created to assist in take-off and landing risk assessment. The problem of planning flight crews' duty hours was undertaken by Teodorovic and Lucic [30]. Fuzzy sets were also used to analyze traffic incidents in aerodrome traffic [31]. Psychological aspects of pilots' behavior were analyzed by Wanyan et al. [32]. The research by Lu and Huang [33] and Skorupski and Uchronski [34, 35] includes attempts at airport security assessment where the human factor was partly taken into consideration. Other examples of fuzzy methods utilization in air traffic management can be found in [36, 37]. Other technology sectors also use fuzzy methods in the risk assessment of damage to key elements [38], as well as of the human factor's influence on the reliability of systems [39].

1.2. Concept of the Study. The literature review indicates the need to analyze the visualization system errors that occur in air traffic controllers' practice. This analysis will be the basis for assessing the risk caused by different types of errors. This assessment has a significant degree of subjectivity and cannot be quantified unequivocally. There is, however, practical knowledge that may be gained from air traffic controllers. In such situations, methods which deal well with uncertain and imprecise information expressed in natural language are applied. In our paper, fuzzy logic, more precisely fuzzy reasoning systems, is used to develop an expert advisory system that will assign different error types to hazard classes.

It is also important to look for the kinds of errors that have the greatest impact on safety. Several distinct factors are considered, such as the ability to identify the error quickly or the availability of backup resources. Simulation studies are used for this purpose, to determine the sensitivity of hazard assessments to individual factors. This approach allows detecting those errors that require the greatest attention.

The rest of the paper is organized as follows. Section 2 outlines the essence of air traffic control systems, particularly emphasizing the visualization system errors encountered in the practice of air traffic controllers' work. Section 3 provides a brief introduction to the theory of fuzzy sets and fuzzy reasoning systems. Section 4 describes a fuzzy model for assessing the hazard caused by TSVS errors. The form of the linguistic variables and the knowledge base, represented as fuzzy inference rules, are also discussed. Section 5 shows the results of several simulation experiments using a computer tool created in the SciLab environment. Section 6 provides a summary and conclusions.

2. Air Traffic Control Systems

Air traffic control (ATC) systems are sophisticated computer systems. Their main task is to assist air traffic controllers in ensuring a safe and effective flow of air traffic, but their range of applications is much wider: they also support airspace management (ASM) and air traffic flow and capacity management. This is why they are often called air traffic management (ATM) systems.

The construction and functionality of ATM systems vary considerably depending on the needs and operating conditions in different airspaces. In many cases, such systems are built according to the specifications of a particular air navigation services provider (ANSP). Therefore, it is not possible to describe the construction and operation of ATM systems in a way that is appropriate for all of them. However, general principles for most typical modern ATM systems can be discussed.

In most air navigation services providers, the ATM system is used by many people at different operational positions at the same time. Usually, such systems are built using client-server architecture. Operational workstations are client applications, whereas data processing takes place on servers. The same data are available simultaneously on all workstations. The user interface at operational workstations varies according to the type of services provided at the position. For instance, a radar controller workstation, the main part of which is the traffic situation display, is different from the tower controller workstation or the ASM operator workstation.

2.1. General Structure of ATC Systems. The main part of an ATM system is usually a cluster of servers processing data from multiple sources, such as

(i) surveillance radar systems;

(ii) automatic dependent surveillance-broadcast (ADS-B) system, where aircraft determine their locations based on onboard navigation systems and then broadcast them so that they can be tracked;

(iii) flight data processing (FDP) systems, which are databases of all planned and performed flight operations.

Data coming from these sources are processed in such a way as to fulfill the functions expected by various recipients. The following main data processing modules can be listed:

(i) Surveillance data processing module: it collects data from different surveillance systems such as radars, ADS-B, or multilateration systems.

(ii) Tracker: its task is to follow objects based on surveillance data; positions of the aircraft obtained from different sources may vary slightly, so data fusion is necessary to create a so-called track, representing the position of the object.

(iii) Flight data processing module: it is responsible for receiving messages containing information about planned operations from aeronautical fixed telecommunication network (AFTN), online data interchange (OLDI) coordination messages, and current flight plans processing based on data obtained from system operators and the tracker.

(iv) Decision supporting modules: based on the current flight data and surveillance data, they perform auxiliary functions to assist air traffic controllers in their routine tasks and, if necessary, generate alerts about separation loss, inaccessible airspace incursion, collisions of aircraft trajectories, or dangerously low flight altitude.
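As an illustration of the data fusion performed by the tracker, the following sketch combines a radar plot and an ADS-B report into a single track position. It is not taken from any particular ATM system; the inverse-variance weighting and the accuracy figures are assumptions made for the example, and real trackers use considerably more elaborate filtering (e.g., Kalman-type estimators).

```python
from dataclasses import dataclass

@dataclass
class Measurement:
    x_nm: float       # east position, nautical miles
    y_nm: float       # north position, nautical miles
    sigma_nm: float   # assumed standard deviation of the source

def fuse(measurements):
    """Inverse-variance weighted fusion of position reports
    from several surveillance sources into one track position."""
    wx = wy = wsum = 0.0
    for m in measurements:
        w = 1.0 / (m.sigma_nm ** 2)   # more precise sources weigh more
        wx += w * m.x_nm
        wy += w * m.y_nm
        wsum += w
    return wx / wsum, wy / wsum

# Example: a radar plot (less precise) and an ADS-B report (more precise)
radar = Measurement(10.3, 20.1, 0.5)
adsb = Measurement(10.0, 20.0, 0.1)
x, y = fuse([radar, adsb])   # result lies close to the ADS-B report
```

The fused track position lies between the two reports, pulled strongly toward the more accurate source.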

2.2. Air Traffic Controllers' Work Technology. Actions performed by radar controllers vary and depend on many factors, like the type and distribution of traffic streams, traffic volume, or airspace availability. Nevertheless, some elementary activities that appear in their work independently of these factors can be distinguished. An air traffic controller in general

(1) issues clearances and instructions to the flight crews, mainly using radio communication;

(2) coordinates the traffic with other air traffic service units, primarily using telephone communications.

These actions are not directly related to the use of TSVS. However, to ensure safety and efficiency of air traffic, the controller needs information from various sources. The most important source of traffic information for the radar controller is the visualization system. It allows determining visually the position of the aircraft related to different airspace structures (such as prohibited areas where flying is forbidden by law or temporarily segregated and reserved areas for military needs), navigational aids and other ground objects, navigational points, and other aircraft. As a rule, prior identification of the aircraft is required. For this purpose, information transmitted by the aircraft's transponder in mode A, mode S, or SPI (Special Position Indication) is used. These data are received by the cooperating secondary radar. If none of these can be used to identify the aircraft, the controller compares the position reported by the crew with the position of the aircraft symbol on display or checks the changes of this symbol's position in detail to compare it with the aircraft's maneuvers.

To ensure that the distance between the aircraft or between the aircraft and the airspace structure is appropriate, the distance measuring tool implemented in the ATC system is used. This tool also allows determining the magnetic direction between two points, which is applicable in vectoring (i.e., instructing the crew to fly specific headings) and accurately defining the position of the aircraft in relation to navigational aids, other ground objects, or navigational points.

The separation between aircraft [40] needs to be continuously provided; therefore, anticipating future positions of the aircraft is an essential element of the controller's work. A function of displaying routes according to current flight plans and so-called vectors (predicted trajectories of the aircraft in a preset time if the flight parameters are unchanged) is used for this purpose. Vectors and route visualization functions are often used together with the distance measuring tool mentioned above. More advanced systems provide predicted minimum horizontal distance between two aircraft or an aircraft and a selected point directly and sometimes are even able to show predicted aircraft positions at the moment when the expected distance between them is the lowest.
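The prediction of the minimum horizontal distance mentioned above can be illustrated with a simple closest-point-of-approach calculation. This is a sketch under the assumption of straight-line flight at constant velocity within a flat local coordinate frame; operational systems additionally use flight plan and aircraft performance data.

```python
def min_horizontal_distance(p1, v1, p2, v2, horizon_min):
    """Predicted minimum horizontal distance between two aircraft
    within a given time horizon, assuming unchanged velocities.
    Positions in NM, velocities in NM/min. Returns (distance_nm, time_min)."""
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]   # relative position
    vx, vy = v2[0] - v1[0], v2[1] - v1[1]   # relative velocity
    vv = vx * vx + vy * vy
    if vv == 0.0:
        t = 0.0                              # no relative motion
    else:
        t = -(rx * vx + ry * vy) / vv        # time of closest approach
        t = min(max(t, 0.0), horizon_min)    # clamp to [0, horizon]
    dx, dy = rx + vx * t, ry + vy * t
    return (dx * dx + dy * dy) ** 0.5, t

# Two aircraft on perpendicular converging tracks, both at 480 kt = 8 NM/min
d, t = min_horizontal_distance((0, 0), (8, 0), (20, -15), (0, 8), 10)
```

Here the predicted minimum distance is about 3.5 NM after roughly 2.2 minutes, which is the kind of figure a controller would compare against the applicable separation minimum.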

Another action of the controller interacting with the TSVS is to read the predicted time necessary for the aircraft to arrive at a specific point. This is one of the elements of planning the landing sequence or the order of passing the point where aircraft's routes merge.

A major part of the radar controller's work is also monitoring the aircraft maneuvers to ensure that they are consistent with the expectations (including clearances) and do not jeopardize safety.

A cardinal part of the work is verifying the data displayed by the system, such as the aircraft's flight level, its speed (both the groundspeed and the airspeed), the heading, the altitude selected by the crew in the FMS (Flight Management System), and the mode A transponder code. These data are essential for making the right decisions.

Modern visualization systems also generate warnings of many kinds. For example, STCA (Short-Term Conflict Alert) works by analyzing the actual movement of the aircraft, alerting the controller about the possibility of separation minimums infringement in a short time, and informing when such a violation occurs. The controller, when encountering such a warning, must make a quick assessment of the situation, determine whether the threat exists and, if necessary, take immediate actions to ensure the safety of the aircraft. Another alerting function, MTCD (Medium-Term Conflict Detection), works based on current flight plan data and aircraft performance data. The warnings are produced in a time perspective much longer than the STCA and serve as a guide to plan the traffic situation. Other messages generated by the visualization systems are warnings about aircraft approaching or violating the airspace structure or maintaining an altitude that is dangerously low in a particular area.

2.3. The Role of the Visualization System. The elementary actions performed by the controller, which are presented in Section 2.2, clearly show that the visualization system is a fundamental component of the air traffic control system. It concentrates all information about the location and maneuvers of the aircraft. The quality of the visualization depends both on the quality and performance of the hardware used and on the processing algorithms used. Both aspects affect the time needed to calculate the position of the aircraft and present it on display, as well as the accuracy of its location. The aim is to minimize the latency resulting from data processing and to minimize the positioning error.

Functionally, the most important role of the TSVS is to assist the controller in creating an image of the current and future traffic situation. As this is the basis for decision-making, the visualization system is the controller's essential tool. Its role is further enhanced by integrating the TSVS with some executive features. For example, transfer of control to an adjacent sector may be preceded by the use of the visualization system for coordinating the flight level at the sector boundary, informing the next sector's controller about the transfer of communication, or informing the previous controller about establishing the communication on the new frequency.

2.4. Errors in ATC Systems. Experience in using TSVS shows that some errors may occur. This section contains their classification. All the described errors have been observed during operational work at the air traffic control position at approach (APP) and area control (ACC) units (one of this paper's coauthors is an active air traffic controller).

Common visualization errors in ATC systems are discussed below.

2.4.1. Incorrect Indication of Aircraft Position. This error consists in showing the aircraft's position symbol at a distance from its actual position. The most likely cause is a malfunction of the tracker algorithm due to an internal error or erroneous input. The significance of this error is determined by the number of incorrectly positioned symbols and the type of misrepresentation. The most dangerous situation occurs when the symbols on display are spaced apart while, in fact, the aircraft are close together. In case of significant divergences, the error is relatively easy to spot, but it generates stress and a high workload to clarify the situation. Additional coordination with other controllers and increased radio communication are often necessary. Furthermore, this situation is also dangerous because it distracts the controller from observing other traffic.

As a remedy, a switchover to a backup tracker is used. The effectiveness of this action depends on the error's source. If this is an internal error in the tracker algorithm, then such action should provide the correct position indication. However, if the cause of the error is incorrect input data, then wrong indications are also expected when using the backup tracker.

2.4.2. Incorrect Linking of the Flight Plan to the Track. This type of error may appear as a total or partial lack of flight plan data that should be available when the aircraft's position symbol is indicated. Another variant of this error is the inability to update the current flight plan. This function is integrated with the visualization system as standard. For example, a level change clearance issued by the controller should be recorded in the current flight plan by the so-called cleared level assignment. Failure to make such an update results in a situation where the information in the flight plan is incorrect.

The threat to safety depends on the number of aircraft affected by the error, the period for which the issue exists, and the number of flight plan parameters which are incorrect. As a remedy, manual annotation of the aircraft symbol can be used, as well as traditional paper flight progress strips. These provide constantly updated flight information sets. The controller completes them with all issued instructions and clearances, and their arrangement reflects the location of the aircraft in the area of responsibility. This allows for the detection of potential conflicts. Of course, this also generates additional workload and distracts attention from other traffic. Additionally, in such cases, many support systems, like MTCD, do not work.

This category also includes the incorrect assignment of procedures such as level constraints, SID (Standard Instrument Departure), and STAR (Standard Terminal Arrival Route) to the flight plan. The error is easy to fix by manual assignment, but this adds an extra workload. What is more, there is a risk that the manual correction of the flight plan will be done after another controller issues instructions for the procedure. Then, the current flight plan data is different from the crew intentions.

2.4.3. Disappearance of Aircraft Position Symbol. The threat level caused by this error depends on the time for which the symbol is not displayed. During a straight and level flight, a temporary disappearance of the symbol of a single aircraft causes relatively little trouble. However, when more complicated maneuvers are performed or when the disappearance is prolonged, the threat created by such a failure is much higher. Another factor affecting the nuisance level is the flight phase in which the error occurs. Vectoring an aircraft to final approach is impossible when its symbol disappears from the display. Also, the disappearance of aircraft symbols prevents the controller from monitoring the aircraft's maneuvers, which in some cases (such as unintentional deviation from the flight route or level) can substantially increase the safety risk.

2.4.4. Delayed Aircraft Position Update. It is evident that it takes some time from the moment of measuring the aircraft's position to the moment of displaying it in the TSVS. This time consists of the data transmission time, processing time, and hardware delay. As long as the delay is approximately constant, the error may be compensated. The problem occurs when this value is variable. In such a situation, an aircraft movement may be displayed unrealistically. For instance, we may observe that the aircraft makes a turn with a small radius, corrected later by a turn in the opposite direction. At that time, the aircraft makes a normal turn (without any corrections), but of a slightly bigger radius.

The error of this kind is easy to spot and is not a serious problem as long as the image is constantly visible and the deviations are not large. A problem, however, arises when the crew performs an incorrect maneuver or crosses the final approach track. If this error is common, the controller is convinced that it is a false image and any necessary corrective decisions may be delayed. Typically, in ATC systems, there is no way to correct this problem from the functionality available to the controller.

2.4.5. Conflict Warning Systems Malfunctions. As already mentioned, ATC systems are equipped with the functions of short- and medium-term conflict detection (STCA and MTCD). All of the above-mentioned positioning algorithms' shortcomings and errors such as information delay and incorrect input data may indirectly lead to conflict warning systems malfunctions. The most serious is the lack of necessary STCA or MTCD action or its too late activation. However, this type of error is extremely rare. The so-called false alarms are much more widespread. False alarms may also be caused by errors in STCA and MTCD algorithms. Whatever the cause, false alarms can pose a threat.

It may seem that drawing the controller's attention to a traffic situation is not a threat, even if the separation minimums are not violated. However, the opposite is true, especially in heavy and complex traffic. Such a situation causes a few seconds of distraction from the rest of the traffic, and this can be dangerous. The problem is compounded by the fact that an STCA alert means the possibility of a real threat to the aircraft in a very short time, so drawing the controller's attention is practically unconditional. Additionally, a general warning message (sometimes associated with an aural warning) often appears before the aircraft concerned are indicated by the system. In case of a false alarm, the controller concentrates fully on an insignificant situation for several seconds, which is a dangerous event. Furthermore, every such situation means a notable stress load for the controller. As a remedy, STCA might simply be disabled, which would eliminate false alerts but also remove the true ones, which obviously is also dangerous.

2.4.6. Total Loss of Image. An error of this type may be caused by a restart of the workstation resulting from an internal system error, a major technical failure, or even terrorist activity. Essentially, this error can be divided into two kinds: the image disappears entirely, or it merely stops refreshing. The latter situation is obviously better from a safety point of view, as it allows the controller to use historical data to build a picture of the traffic situation for some time. Depending on the number and type of protection systems (additional power sources, additional data lines), the period for which the image is lost may differ. In any case, however, it temporarily renders the air traffic control position unable to work, and as such it is a very dangerous occurrence. Lack of image on several workstations is even more critical, as there is no opportunity to use the picture at a nearby workstation.

3. Fuzzy Reasoning Systems

The problem discussed in this paper is distinguished by two major characteristics. On the one hand, we consider a sociotechnical system where the role of the human factor is crucial. The result is a strong subjectivity of opinions, because controllers are not equally prone to making mistakes arising from a faulty indication in the TSVS.

On the other hand, the problem is characterized by high ambiguity and lack of precision. The errors concerned, those in the visualization system, are unpredictable: not only can we not predict the time of their appearance, but we are also unable to define their type precisely. This is because errors can result from numerous causes. It is evident that ATM system designers are aware of the visualization system's importance to traffic safety. Therefore, it can be assumed that all common and straightforward errors have been eliminated in the preproduction test phase. In consequence, we may expect errors with complex causes, which makes their precise analysis difficult.

In such situations, the literature recommends using tools and methods suitable for problems of epistemic uncertainty, that is, ones in which full knowledge of the phenomenon is unavailable. In such cases, it is necessary to use expert opinions, which are very often formulated in a descriptive and imprecise way. We must, therefore, view the decision-making problem in the context of uncertainty related to decision-making [41]. All this locates the problem in an area described by, for example, the theory of fuzzy sets or rough sets [42].

Among the possible approaches, we have chosen to use the fuzzy logic, in particular, fuzzy reasoning systems. Zadeh [43] created the basis for modern applications of fuzzy logic.

A fuzzy set $A$ is defined as

$A = \{(x, \mu_A(x)) : x \in X,\ \mu_A(x) \in [0,1]\}$, (1)

where $\mu_A$ is the membership function of this set and $X$ is the universe of discourse.

A linguistic variable is a variable whose values are words or sentences in a natural or artificial language. These words or sentences will be called the linguistic values of a linguistic variable.

Our models will most often assume that the membership functions of linguistic variable values have a trapezoidal shape and that a standard membership function with the parameters $(a, b, c, d)$ is as follows:

$\mu(x; a, b, c, d) = \begin{cases} 0, & x \le a, \\ (x-a)/(b-a), & a < x < b, \\ 1, & b \le x \le c, \\ (d-x)/(d-c), & c < x < d, \\ 0, & x \ge d. \end{cases}$ (2)
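A minimal implementation of the standard trapezoidal membership function with parameters (a, b, c, d) might look as follows (assuming nondegenerate edges, i.e., a < b and c < d):

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function with parameters (a, b, c, d):
    0 below a, rising on (a, b), 1 on [b, c], falling on (c, d), 0 above d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)   # rising edge
    return (d - x) / (d - c)       # falling edge

# e.g. a "Medium" linguistic value on a 0-10 scale
assert trapezoid(5, 2, 4, 6, 8) == 1.0   # inside the plateau
assert trapezoid(3, 2, 4, 6, 8) == 0.5   # halfway up the rising edge
```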

Within the scope of the reasoning process, we will use the input value fuzzification block, a reasoning block using fuzzy rules, and the defuzzification block. The rule sets will be created using experts' opinions, in particular those of air traffic controllers. Such a set may contain classic nonfuzzy implications as well as fuzzy implications. In the second case, we will use the so-called compositional method of reasoning introduced by Zadeh [44], which uses a generalized "modus ponens" fuzzy reasoning rule. This results in the following reasoning scheme [45], where P, P', Q, Q', and S are fuzzy relations:

$\begin{aligned} I{:}\ & P \to Q \\ P{:}\ & P' \\ C{:}\ & Q' = P' \circ (P \to Q), \end{aligned}$ (3)

where $I$ denotes implication, $P$ denotes premise, and $C$ denotes conclusion, while "$\circ$" is a max-min composition, defined on the sets $X$, $Y$, and $Z$, whose result for fuzzy relations $A \subset X \times Y$ and $B \subset Y \times Z$ is a fuzzy relation $A \circ B \subset X \times Z$ with the membership function:

$\mu_{A \circ B}(x, z) = \bigvee_{y \in Y} [\mu_A(x, y) \wedge \mu_B(y, z)]$. (4)

The following notation has been used in formula (4) and subsequently:

$a \vee b = \max(a, b), \qquad a \wedge b = \min(a, b)$. (5)
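For finite universes, fuzzy relations reduce to matrices of membership degrees, and the max-min composition can be sketched directly (an illustrative implementation, not part of the system described in this paper):

```python
def max_min_composition(A, B):
    """Max-min composition of fuzzy relations A (on X x Y) and B (on Y x Z),
    given as nested lists: (A o B)[i][k] = max_j min(A[i][j], B[j][k])."""
    n, m, p = len(A), len(B), len(B[0])
    return [[max(min(A[i][j], B[j][k]) for j in range(m))
             for k in range(p)]
            for i in range(n)]

A = [[0.2, 0.8],
     [1.0, 0.4]]
B = [[0.6, 0.3],
     [0.9, 0.1]]
C = max_min_composition(A, B)   # C == [[0.8, 0.2], [0.6, 0.3]]
```

Note that, unlike matrix multiplication, the result only ever contains membership degrees already present in the operands, since min and max select rather than combine values.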

Relations P and P' are often constructed based on the AND operator. We will use implications in the form of fuzzy conditional sentences (rules), that is,

IF $x$ IS $P$ THEN $y$ IS $Q$ ELSE $y$ IS $S$. (6)
The conditional sentence is equal to a certain fuzzy relation $R \subset X \times Y$; we will use the max-min rule that has been selected from the numerous definitions of such fuzzy relations found in the literature:

$R = (P \text{ AND } Q) \text{ OR } (\text{NOT } P \text{ AND } S)$, (7)

which is expressed in the form of the following membership function:

$\mu_R(x, y) = [\mu_P(x) \wedge \mu_Q(y)] \vee [(1 - \mu_P(x)) \wedge \mu_S(y)]$. (8)

If we write down the reasoning scheme (3) in the form of a fuzzy reasoning system, we will reach the following form:

$\begin{aligned} I{:}\ & \text{IF } x \text{ IS } P \text{ THEN } y \text{ IS } Q \\ P{:}\ & x \text{ IS } P' \\ C{:}\ & y \text{ IS } Q', \end{aligned}$ (9)

where $P, P' \subset X$ and $Q, Q' \subset Y$.

The result of reasoning $Q'$ is specified according to the compositional rule of inference as

$Q' = P' \circ R$, (10)

where $R \subset X \times Y$ is the fuzzy relation specified by formulas (7)-(8). In such cases, the membership function of the result has the final form of

$\mu_{Q'}(y) = \bigvee_{x \in X} [\mu_{P'}(x) \wedge \mu_R(x, y)]$. (11)

A fuzzy reasoning system described in Section 4 uses the above inference reasoning method specified in formulas (9)-(11).

In the fuzzy inference systems, the parameters listed below have been chosen for the determination of linguistic variable values:

(i) s-norm of algebraic sum type:

$\mu_{A \cup B}(x) = \mu_A(x) + \mu_B(x) - \mu_A(x) \cdot \mu_B(x)$. (12)

(ii) t-norm of algebraic product type:

$\mu_{A \cap B}(x, y) = \mu_A(x) \cdot \mu_B(y)$. (13)

(iii) Implication of minimum type:

$\mu_{A \to B}(x, y) = \min(\mu_A(x), \mu_B(y))$. (14)

(iv) Rules aggregation of maximum type:

$\mu_{A \oplus B}(x) = \max(\mu_A(x), \mu_B(x))$. (15)

(v) Defuzzification of weighted average type.
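The listed operators can be combined into a minimal inference sketch. The two-rule base below is purely hypothetical (the actual knowledge base is described in Section 4), and, for brevity, the weighted-average defuzzification is applied directly to rule activations and representative points of the output terms, which approximates the full implication/aggregation pipeline:

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function with parameters (a, b, c, d)."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# Hypothetical rule base mapping two inputs on [0, 10] to a threat level
# on [0, 10]; each rule is (mu_input1, mu_input2, output term (a, b, c, d)).
RULES = [
    (lambda x: trapezoid(x, -1, 0, 2, 4), lambda y: trapezoid(y, -1, 0, 2, 4),
     (-1, 0, 2, 4)),    # IF in1 IS Low  AND in2 IS Low  THEN threat IS Low
    (lambda x: trapezoid(x, 6, 8, 10, 11), lambda y: trapezoid(y, 6, 8, 10, 11),
     (6, 8, 10, 11)),   # IF in1 IS High AND in2 IS High THEN threat IS High
]

def infer(x, y):
    """Weighted-average defuzzification: each rule's activation (algebraic
    product t-norm, eq. (13)) weights the centre of its output term."""
    num = den = 0.0
    for mu1, mu2, (a, b, c, d) in RULES:
        w = mu1(x) * mu2(y)        # rule activation via product t-norm
        centre = (b + c) / 2.0     # representative point of the output term
        num += w * centre
        den += w
    return num / den if den > 0 else 0.0
```

With both inputs low, only the first rule fires and a low threat level results; with both inputs high, only the second rule fires and a high level results; intermediate inputs blend the two.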

4. Fuzzy Reasoning System for Threat Level Assessment Caused by Errors in Visualization Systems

Assessing the degree of threat to air traffic safety caused by errors in the visualization system cannot be done in a strict quantitative sense. This is mainly due to the lack of an adequate measure of the level of safety of the traffic situation. Its dependence on many factors, mostly subjective, precludes finding functional relationships. However, such an assessment is essential for the proper management of available hardware and human resources, as well as for decisions on equipment modernization or additional training.

In this section, the fuzzy reasoning system and its computer implementation are described, which allow for an adequate assessment of the threat to air traffic caused by errors in the visualization system. First, the factors influencing the threat assessment are introduced, together with their representation by linguistic variables that serve as inputs to the fuzzy inference system. The knowledge base plays a critical role in this system. These data have been obtained from domain experts, in this case air traffic controllers working for the Polish Air Navigation Services Agency, including safety management experts and incident investigators participating on a voluntary basis; a number of consultations have been conducted to improve credibility. A computer application created in the SciLab environment allows for the assessment of specific breakdown situations.

4.1. Factors Influencing the Threat Level Assessment. The level of threat to air traffic safety is dependent on many factors. Some of them cannot be expressed as physical quantities, so it is hard to assess them clearly and precisely. Below, the most important of these factors are discussed.

4.1.1. Degree of Situational Awareness Loss. The concept of "situational awareness loss" generally describes all situations when an important air traffic participant (controller, pilot) is not entirely aware of the current traffic situation. Visualization errors cause safety risks only if they cause a loss of situational awareness for the air traffic controller. Obviously, the level of threat depends on the degree of this loss, that is, on how much the image of the traffic situation created in the controller's mind differs from reality. For example, if the controller receives information about the position of an aircraft that is actually at a slight distance from that location, the risk is relatively small, especially when this distance is less than the minimum separation in the given airspace. However, when the controller, due to a visualization system error, is completely unaware of the existence of several aircraft in his/her area of responsibility, the threat is very high.

4.1.2. Awareness of Errors in the Situation Image. The problem of the loss of situational awareness is inextricably linked with the controller's belief that the picture of the traffic situation he/she has created in mind is correct. In some cases, he/she may be convinced or even sure that the image is proper, while reality is far from that. This is the most dangerous case because the controller will continue to work based on the wrong image without taking any verification action. At the other extreme is a situation when the controller is perfectly aware of the almost complete loss of the correct image. He/she will then take steps to restore at least partial situational awareness, such as asking flight crews about their positions or flight levels.

Importantly, knowledge of the existence of a malfunction may appear only after some time, which depends on how obvious the error is. For example, in the event of total loss of visualization, the controller is aware of the problem immediately, while a slight deviation of the aircraft's displayed position from its actual location may remain unnoticed for a long time. However, over time, the controller will probably notice that the visualization system is not working properly. The period from the moment the error occurs until the controller learns about it depends not only on the error type but also on the level of training and experience of the controller and his/her condition on a given day, so the human factor plays a significant role here. The degree of safety threat depends on that time, so it will be used as the evaluation criterion for this factor.

4.1.3. Time of Situational Awareness Loss. Another factor affecting the degree of threat is the time interval in which the controller, because of a system error, does not have full situational awareness. It may range from a few seconds to dozens of minutes. The longer the time, the greater the safety threat. In the event of a major system failure that cannot be fixed quickly, actions are taken to cut off the flow of air traffic to the sector. For this factor, we will use a judgment based on the anticipated time of situational awareness loss.

4.1.4. Backup Resources Availability. When a controller is aware of the error, especially if the error persists for an extended period, he/she will try to use other available resources to stay aware of the situation. A straightforward and efficient option is to use

(i) an image in another workstation (in the event of a failure affecting only a part of an ATC system),

(ii) a backup system,

(iii) another data source of aircraft positions (e.g., data from another radar or ADS-B system instead of the radar generating wrong data or not providing the data at all).

An important remedy is the use of traditional flight progress strips, which can be employed when no traffic picture is available.

Such backup resources are not available in every situation. An ATC backup system is not available in every ATC unit, the air traffic control sector can only be covered by a single radar (i.e., there is no possibility of changing it), or a transmission link may be down. While it is possible and reasonable to use flight progress strips in the event of a problem with the flight plans processing system, there is no way to print them in case of power supply loss.

Backup resources availability, in the context of threat to traffic safety, should be considered in two ways. Firstly, it is necessary to determine whether backup resources are available at all and, secondly, to assess their quality. In the case of the availability of an image on another workstation, which is within a short distance from the faulty one, the threat is relatively small. On the other hand, if there is a need to use a backup system, often less functional and without the flight plans data, the risk is higher. Moreover, it is even worse when the only remedy is the flight progress strips without any visualization. The assessment of the impact of the use of backup resources on the degree of safety threat will consider both aspects.

4.1.5. Human Factor. A vital, perhaps even the most important, factor influencing the assessment of the safety threat in case of visualization system failure is the human factor. One of the key components of its evaluation is the level of training of the controller. Despite certain standards and requirements that must be met by all controllers, the differences between individuals may affect their ability to deal with an emergency.

An example of such a difference is the capacity to provide procedural control, which is carried out without the use of surveillance systems but only by flight progress strips and flight crews' position reports. Controllers trained in radar units today do not usually have this skill; it is not required. However, many of the older, experienced controllers have previously worked in procedural units, and therefore it will be easier for them to ensure air traffic safety without visualization system support.

In addition, the experience of the controller, which can be expressed both by the number of years of work and by the number of hours worked at the position, affects the level of threat. Firstly, a more experienced controller will realize sooner that he/she is dealing with a system error. Secondly, his/her actions will probably be better suited to the situation.

On the other hand, the perception abilities of a human being decline with age, so an older person is less able to perceive and remember, and this can have an adverse impact on actions in a particular situation. It will be harder to recall the call signs, positions, and altitudes of all aircraft in the sector after the loss of the image.

Another equally important component of the human factor that affects the level of threat is the psychophysical condition of the controller. A controller who is well rested and in good condition has a shorter response time and greater resistance to stress than one tired after a full day of work, who will be less efficient in the event of a visualization system failure.

4.2. General Structure of the Threat Level Fuzzy Inference System. Analysis of the factors influencing the assessment of the threat to the traffic situation safety resulting from errors in the imaging system indicates that there are two aspects of this assessment.

On the one hand, it is affected by the relevance (size) of the irregularities. This aspect is strictly technical, and its source lies in the nature of the error and the place of its appearance. It is closely related to the ability to quickly restore normal system operation or to use an alternative solution. This depends on the awareness of the possibility of an error at the managerial level of the air traffic control and the protective measures provided.

On the other hand, there is a group of psychological factors. In general, they can be defined as the ability to recognize the existence of a visualization system error and the resulting deficit of information needed for the safe air traffic control. The individual capabilities of a particular controller strongly condition them, resulting from their personality traits and psychophysical condition.

The scheme of the fuzzy model for assessing the level of threat caused by an error in the visualization system is shown in Figure 1. The output variable threat level (z_t) depends on five input variables: loss of situational awareness, time without knowledge of error (x_wk), time of situational awareness loss (x_lsa), quality of remedies (x_q), and human factor (y_h). The last of these is itself the output of a local fuzzy reasoning system with four input variables: experience (x_e), age (x_a), training, and psychophysical condition (x_p).
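The two-stage structure of Figure 1 can be sketched as a composition of two functions: a local system maps the four personal inputs to the human factor, which then enters the main threat-level system together with the four situational inputs. The function bodies below are placeholder averages standing in for the actual fuzzy rule bases, which this sketch does not reproduce; only the data flow is faithful.

```python
# Hierarchical structure: human_factor_system feeds threat_level_system.
# Inputs and outputs are on a 1..5 scale, as in the paper's output interval.

def human_factor_system(experience, age, training, condition):
    # Placeholder stub: a real implementation would run the 81-rule
    # human factor fuzzy reasoning system here.
    return (experience + age + training + condition) / 4.0

def threat_level_system(loss_sa, t_without_knowledge, t_sa_loss,
                        quality_of_remedies, human_factor):
    # Placeholder stub: a real implementation would run the 49-rule
    # threat level fuzzy reasoning system here.
    inputs = [loss_sa, t_without_knowledge, t_sa_loss,
              quality_of_remedies, human_factor]
    return sum(inputs) / len(inputs)

# An "average" controller (all personal inputs at 3.0) facing a failure.
y_h = human_factor_system(3.0, 3.0, 3.0, 3.0)
z_t = threat_level_system(4.0, 2.0, 3.0, 1.0, y_h)
print(y_h, z_t)  # 3.0 2.6
```

The point of the cascade is that the four personal characteristics are compressed into a single linguistic variable before entering the main system, which keeps the main rule base small (5 inputs instead of 8).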

The form of membership functions of linguistic variables values, the basis for their determination, and the knowledge bases of both fuzzy reasoning models will be presented in subsequent sections.

4.3. Input Linguistic Variables of Fuzzy Reasoning System. The set of influencing factors discussed in Section 4.1, the general scheme of the threat level fuzzy model, and the form of the fuzzy sets describing particular linguistic variables have been established after consulting domain experts: air traffic controllers. The experts made their assessments independently, without discussions with each other, to avoid cross-influencing their opinions. Differences between the judgments provided the basis for defining the shape of a particular linguistic variable. In general, the experts were asked about crisp membership of values in a given set. For values found to belong to the set by all the experts, a membership degree equal to 1 was adopted (core of the fuzzy set). For values found to belong to the set by at least some of the experts, a membership degree greater than 0 was adopted (support of the fuzzy set). In some cases, for example, when defining the loss of situational awareness linguistic variable, objective reference values defined by international regulations [40] have also been utilized.

4.3.1. Loss of Situational Awareness. The degree of divergence of the traffic situation picture in the controller's mind relative to reality can be assessed based on the difference between the actual position of the aircraft and the position indicated (incorrectly) by the visualization system. We compare this difference to the separation minimum, that is, the minimum distance between aircraft in a vertical or horizontal plane, which is obligatory in a given airspace [40]. Based on expert knowledge, it has been assumed that the linguistic variable loss of situational awareness will take one of the six values defined in the general way as follows:

(i) Slight: when the positioning (visualization) error concerns only one aircraft and it is not bigger than half of the vertical or radar (horizontal) separation minimum.

(ii) Small: when the positioning error concerns only one aircraft and is greater than half of the separation minimum but not greater than that minimum, or when more than one aircraft is mispositioned, each by no more than half of the separation minimum.

(iii) Significant: when the positioning error of one aircraft is greater than the separation minimum.

(iv) Serious: when more than one aircraft is located more than the separation minimum away from the position shown by the visualization system, or when the controller is unaware of the presence of one aircraft in his/her area of responsibility.

(v) Large: when there are substantial errors in determining positions of many aircraft, or if the controller is not aware of the presence of more than one aircraft in the area of responsibility.

(vi) Total: complete loss of air traffic situation image.

Such a description of the values of the linguistic variable loss of situational awareness requires, due to its generality, an expert in each case to determine which category the situation belongs to. This is particularly challenging in complex situations or when comparing different error types. An additional problem is that the required separation in the horizontal plane is much larger than in the vertical plane. Therefore, we propose the use of an integrated indicator of the following form to determine the value of the linguistic variable:

[mathematical expression not reproducible], (16)

where d is an integrated indicator determining the degree of situational awareness loss, n is the number of aircraft, whose visualized positions do not correspond to their actual locations, [[delta].sub.r] and [[delta].sub.v] are the amount of deviation of the imaged position from the actual position of the aircraft in the horizontal and vertical plane, respectively, and [s.sub.r] and [s.sub.v] are the separation minimum obligatory in the considered airspace in the horizontal and vertical plane, respectively.
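The exact form of indicator (16) is not reproduced in this text. As a hypothetical sketch consistent with the verbal description, one can let each mispositioned aircraft contribute its horizontal and vertical deviation normalized by the respective separation minimum; the function below is an assumption for illustration, not the paper's formula.

```python
# Hypothetical integrated indicator of situational awareness loss:
# each of the n mispositioned aircraft contributes delta_r / s_r
# (horizontal) plus delta_v / s_v (vertical). Normalizing by the
# separation minima makes the two planes comparable.

def awareness_loss_indicator(deviations, s_r, s_v):
    """deviations: list of (delta_r, delta_v) per mispositioned aircraft.
    s_r, s_v: horizontal and vertical separation minima."""
    return sum(d_r / s_r + d_v / s_v for d_r, d_v in deviations)

# One aircraft shown 3.5 NM off horizontally, correct vertically,
# with a 7 NM radar minimum and a 1000 ft vertical minimum.
print(awareness_loss_indicator([(3.5, 0.0)], 7.0, 1000.0))  # 0.5
```

Under this assumed form, a single aircraft off by half the radar minimum yields d = 0.5, which matches the boundary of the "slight" category described above.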

Based on the indicator d, trapezoidal membership functions of values of linguistic variable loss of situational awareness have been adopted and are shown in the logarithmic scale in Figure 2.
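Trapezoidal membership functions of the kind adopted here are easy to evaluate directly. The breakpoints (a, b, c, d) in the example call are hypothetical; the actual ones for each linguistic value come from Figure 2.

```python
# Trapezoidal membership function: 0 outside [a, d], 1 on the core [b, c],
# linear on the shoulders (a, b) and (c, d). Assumes a < b <= c < d.

def trapmf(x, a, b, c, d):
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)   # rising shoulder
    return (d - x) / (d - c)       # falling shoulder

# Hypothetical breakpoints for one linguistic value of the
# loss-of-situational-awareness variable.
print(trapmf(0.75, 0.4, 0.6, 1.0, 1.5))  # 1.0 (inside the core)
```

With overlapping shoulders between adjacent values, a crisp indicator value typically activates two neighboring linguistic values with complementary degrees, which is what drives the gradual transitions in the inference results.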

4.3.2. Time without Knowledge of Error. The level of safety threat depends on whether the controller is aware of the occurrence of an error or, more specifically, how long he/she is not. Accordingly, we have used the linguistic variable time without knowledge of error, which can take five values: very short, short, average, long, and very long. The trapezoidal membership functions of the values of this linguistic variable were adopted based on expert knowledge, and their logarithmic form is shown in Figure 3.

4.3.3. Time of Situational Awareness Loss. Depending on the type of error, the period in which the situational awareness of the controller is disturbed differs. In case of a short-term disappearance of the track of individual aircraft, the controller's situational awareness is usually maintained all the time. In case of major system malfunction, the time of situational awareness loss may be long and is not necessarily the same as the time when the system works incorrectly. For example, in the event of total loss of visualization, situational awareness may be preserved at an initial stage, especially when the traffic situation is not complicated.

What is more, when a backup system is available, it is possible to restore the controller's situational awareness much earlier before the main system resumes proper operation.

The linguistic variable time of situational awareness loss may take five values: very short, short, average, long, and very long. The trapezoidal membership functions of the values of this linguistic variable were adopted based on expert knowledge, and their logarithmic form is shown in Figure 4.

4.3.4. Quality of Remedies. A controller who is aware of the dysfunctional operation of the TSVS will seek to use other available means to ensure air traffic safety. As already mentioned, the possibility of using them in each situation will be considered within two categories: availability and quality of available remedies. Of course, in the absence of backup resources, there is no point in evaluating their quality, which means that only one linguistic variable, quality of remedies, can be used to assess this factor. It will take four values:

(i) None: no remedies are available or possible to use.

(ii) Low: low quality means are available to use, such as flight progress strips or other text information on selected flight parameters of the aircraft, no visualization.

(iii) Average: imaging is available, but there is no correlation between data from the surveillance system and data from flight plans.

(iv) High: visualization of radar data correlated with flight plan data is available, although the functionality is not necessarily the same as the main system.

4.3.5. Human Factor. For an assessment of the influence of the human factor on the degree of safety threat in the event of a TSVS failure, we will use an integrated indicator that combines information about the professional experience, age, the level of training, and psychophysical condition of the controller. The linguistic variable human factor, which describes the ability to cope with a failure of a visualization system in general, will take five values: very low, low, average, high, and very high. It will be determined by the result of the local fuzzy reasoning system with four inputs: experience, age, training, and psychophysical condition.

The linguistic variable experience will take three values--low, average, and high--and will be determined by the number of years of operation at the radar control position. The form of membership functions is shown in Figure 5.

The linguistic variable age will take three values: young, middle, and old. The form of accepted membership functions is shown in Figure 6.

The linguistic variable training will take one of three values--poor, average, and good--and will be determined by the controller's level of training in procedural control. As good training, we will identify the situation when the controller has been trained for procedural control in the air traffic control unit in which he/she is currently working. Average training occurs when the controller has the knowledge and skills of procedural control but in another unit. Poor training is a situation in which the controller has never held a procedural control license. In the sense of fuzzy logic, the individual values will be the fuzzy singletons determined by the allocation of the controller to the corresponding group.

The linguistic variable psychophysical condition will take one of three values: poor, average, and good. As the good state, we will consider the case when the controller does not feel any discomfort, is rested, and there are no problems that would distract his/her attention from work. The average state is defined as the case when the controller is slightly tired, after a slightly shorter than proper night's rest, or is experiencing minor ailments such as a mild headache. The poor state, in turn, is when a controller is fatigued or overwhelmed, feels pain of average intensity, or is affected by a situation that has a negative impact on his/her emotional state. Values of the linguistic variable psychophysical condition will also be fuzzy singletons, and the choice of the appropriate value will be based on the self-evaluation of the investigated controller.

4.4. Output Variables of the Fuzzy Reasoning Systems. It has been assumed that both models, human factor and threat level, are Takagi-Sugeno-Kang models with singleton output values. The values of the human factor variable are discussed in Section 4.3, since it is also an input variable of the threat level model. The output variable threat level, in turn, will take five values: very low, low, average, high, and very high. The form of the values of this linguistic variable is shown in Figure 7. The membership function of the linguistic variable human factor has been determined in the same way. As a result, at the output of the fuzzy reasoning system, we obtain the level of safety threat caused by the visualization system error as a real number from the interval [1,5].

4.5. Knowledge Base of the Fuzzy Reasoning System. As already indicated, the knowledge in inference systems describing complex sociotechnical systems is subjective to some degree and as such impossible to quantify precisely. Therefore, fuzzy inference rules based on expert knowledge have been applied. One of the authors of the paper is an active air traffic controller, but for greater credibility of the knowledge base, the rules have been verified with other field experts. The procedure for obtaining these data has been similar to the one described in Section 4.3. Inconsistencies between the rules have been eliminated using a method described by Skorupski [13].

In the human factor fuzzy reasoning system, 81 fuzzy inference rules are defined, some of which are shown in Table 1.

In the threat level fuzzy reasoning system, 49 rules have been defined, some of which are shown in Table 2.
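Evaluating such a rule base with singleton outputs reduces to computing firing strengths and taking their weighted average. The sketch below uses the algebraic product t-norm for the rule antecedents; the two rules shown are hypothetical examples, not entries of Tables 1 or 2.

```python
# Minimal TSK-style rule evaluation with singleton outputs:
# firing strength of each rule = product t-norm over its antecedent
# membership degrees; crisp output = weighted average of singletons.

def evaluate_rules(memberships_per_rule, singleton_outputs):
    strengths = []
    for antecedent in memberships_per_rule:
        w = 1.0
        for mu in antecedent:   # product t-norm over the antecedent
            w *= mu
        strengths.append(w)
    total = sum(strengths)
    return sum(w * s for w, s in zip(strengths, singleton_outputs)) / total

# Hypothetical rules:
#   R1: loss_sa IS serious (0.8) AND remedies ARE none (1.0) -> high (4.0)
#   R2: loss_sa IS significant (0.2) AND remedies ARE none (1.0) -> average (3.0)
z = evaluate_rules([[0.8, 1.0], [0.2, 1.0]], [4.0, 3.0])
print(round(z, 2))  # 3.8
```

Because the output lies between the fired singletons, partial activation of neighboring rules yields the intermediate threat values (e.g., "between average and high") seen in the experiments.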

4.6. Computer Implementation of the System. The fuzzy model for evaluation of the safety threat to air traffic caused by visualization system errors has been implemented in the SciLab 5.4 environment with the Fuzzy Logic Toolbox package. This software enables both defining and editing fuzzy reasoning systems as well as conducting simulation experiments, including sensitivity analysis. In the latter case, it is possible to trace the results over the entire range of values of selected input variables. It is also possible to perform additional calculations, which allows the integration of input data preparation with the main fuzzy inference system; for example, the value of the input linguistic variable loss of situational awareness can be obtained this way, which is very convenient.

5. Simulation Experiments

The developed model together with its computer implementation allows us to assess the influence of errors in the TSVS on traffic safety. Simulation experiments have been carried out to check the usability of this software tool for threat assessment and for the selection of the most critical system components that influence the safety of air traffic control. Some of these experiments are described in this section. The experiments consisted of calculating the level of threat for common errors in visualization systems and then analyzing the sensitivity of the output to changes in input parameters. The described emergency situations are hypothetical (they have not occurred in reality), but very similar situations, varying in the size and type of traffic under control at the time of failure, can be encountered in day-to-day air traffic controller practice. For all experiments, the basic version assumed that the controller who encountered the error is characterized by average values of the parameters related to his/her individual characteristics (age, experience, training, and psychophysical condition). The result of the human factor variable evaluation for such parameters is shown in Table 3.

5.1. Scenario S1: Minor Failure. As the first case, a relatively minor failure of the visualization system is analyzed, with a limited impact range, due to a small number of aircraft concerned.

The analyzed scenario (S1) can be described as follows. Due to an anomaly in the tracker subsystem, one of the tracks representing an aircraft stops and does not change its position. Because of heavy traffic in the airspace, the air traffic controller does not notice the error, and after one minute the system automatically switches to a backup tracker. Given the performance characteristics of the aircraft affected by the failure, it can be estimated that the maximum deviation of the displayed radar position from the actual aircraft location was 7 NM, which equals the minimum radar separation in that airspace. The aircraft involved was in level flight. Remedies were not available.

The input parameters of the fuzzy inference system and the results of the experiment for Scenario S1 are given in Table 4.

For the scenario being analyzed, the result obtained from the fuzzy inference system places the emergency in a low threat area with a slight shift towards the average rating (Figure 7). A situation in which the controller, when looking at the traffic visualization display, is informed about a position of an aircraft different from its real position may lead to wrong decisions or to a failure to take actions to resolve a potential conflict, especially if the controller is not aware of the failure for the time it exists. As a mitigating factor, it can be pointed out that the fault concerns only one aircraft and, in particular, that it is so short-lived that the deviation does not exceed the radar separation minimum.

5.2. Scenario S2: Major Failure. In this experiment, the case of a major breakdown involving a large number of aircraft, but limited in time, has been analyzed.

Scenario S2 can be described as follows. Because of a power failure, the controller completely loses indications from the TSVS and the monitor goes blank. At that time, there are 10 aircraft in the control sector. A backup display with the same functionality as the main system is available nearby, so after one minute the air traffic controller starts to work with it. This ends the emergency. The controller's characteristics are the same as in Scenario S1.

The input parameters of the fuzzy reasoning system and the results of the experiment for Scenario S2 are given in Table 5.

For the scenario being analyzed, the result obtained from the fuzzy reasoning system places the situation in a high threat area with a slight shift towards the average rating. Complete disappearance of the traffic situation visualization is very dangerous, especially when no remedial measures are available. An important mitigating factor is that such a failure is difficult to overlook, so the controller immediately becomes aware of it and takes actions limiting the duration of the failure.

5.3. Scenario S3: Complex Failure. In the third experiment, a more complicated case has been analyzed. Its most important feature is that several dangerous events occur at the same time.

Scenario S3 can be described as follows. Because of an error during radar data processing, one of the aircraft entering the controller's area of responsibility has not been shown on the TSVS for two minutes. The controller of the previous sector, which the aircraft has already flown through, works with the same ATC system, so the crew has not received instructions to establish radio communication on the new frequency. The aircraft in question is climbing at a vertical speed of 2000 ft/min. No effective remedies are available. After approximately 60 seconds, the controller becomes aware of the missing aircraft track on the display, but there are no indications of where the aircraft may be located at that time. Two minutes after the TSVS failure, the corresponding track appears again. The controller's characteristics are the same as in Scenario S1.

The input parameters of the fuzzy inference system and the results of the experiment for Scenario S3 are given in Table 6.

For this scenario, the result obtained from the fuzzy reasoning system places the emergency in a very high threat area. This assessment is justified because several severe factors accumulate here. Firstly, the malfunction lasts two minutes, which even in level flight causes the actual position to be approximately twice the radar separation minimum away from the displayed one; in the absence of a controller response, this can itself lead to a collision. Secondly, the aircraft is climbing: at 2000 ft/min, it gains 4000 ft during those two minutes, several times the vertical separation minimum, so there is a possibility of collision with aircraft flying at adjacent flight levels. The third essential factor deepening the danger is the quite long time without awareness of the failure.

5.4. Sensitivity Analysis in Scenario S1. As mentioned in Section 5.1, Scenario S1 is characterized by a relatively small deterioration of safety. The main reason for this is that TSVS quickly switches to a backup tracker. At this point, we will consider the analogous situation, but we assume that the switch to the backup tracker does not take place, and the track does not move for a few minutes. We will mark it as Scenario S1a. We assume that after about 90 seconds the controller notices the error and, using the available previous generation ATC system, continues the work after about two minutes. At that time, the difference between the displayed and the actual position of an aircraft is 15 NM.

The input parameters of the fuzzy inference system and the results of the experiment for Scenario S1a are set out in Table 7.

As we can see, extending the duration of the failure causes the threat level to fall into the high rating area. This clearly shows how dangerous such errors can be and how important it is to implement effective self-diagnostic mechanisms in the TSVS that are responsible for detecting, for example, a tracker error and switching to a backup. The lack of such a mechanism would further increase the duration of the error, which in this case could lead to a collision, with the air traffic controller helpless in that situation.

5.5. Possibility of Threat Reduction in Scenario S3. We will now consider the possibility of threat reduction for a complex error described in Scenario S3. To this end, we propose the introduction of a self-diagnostic feature in the ATM system, which would indicate a disappearance of a track in a way, which would be impossible for a controller to miss. This way, we would make the controller of the previous sector able to spot the error and inform the next sector controller about it as well as instruct the flight crew to switch to a new frequency. Thus, he/she can learn about the error, determine the aircraft position, altitude, and maneuvers, and eventually use a backup ATC system installed in the same room after one minute. We will mark this as Scenario S3a.

The input parameters of the fuzzy inference system and the results of the experiment for Scenario S3a are set out in Table 8.

As can be seen, for the analyzed type of TSVS error, the new self-diagnostic feature shifts the threat assessment from 1.0 (very high) to 2.5 (between average and high). We are still dealing with a dangerous situation, but this simple functionality improvement considerably reduces the threat. It is worth mentioning that such a feature could sometimes produce false alerts and distract the controller when an aircraft leaves the surveillance system coverage area.

5.6. Validation of the Results. To check the correctness of the outcomes, the opinions of four independent experts, air traffic controllers, have been used. This group had not previously been engaged in knowledge base creation and had no impact on the form of the linguistic variables' membership functions. The experts have been asked to assess the threat level in all scenarios analyzed in the simulation experiments. Five terms could be used to evaluate the safety threat (very high, high, average, low, and very low), which are the same as the possible linguistic values of the output variable of the model (Figure 7). Intermediate valuations, such as "between average and high," were also allowed. The classification has been made under the same conditions as in the experiments (i.e., without considering the traffic volume and complexity). The experts also rated the safety threat level not with respect to their own skills and experience, but assuming that a controller of average skills is working at the position, according to Table 3. The results of these analyses are presented in Table 9.

The "overall score" row shows a simplified average of the experts' judgments, using the numeric values from Figure 7. The comparison of the model results with the experts' ratings shows an evident convergence: each case has been evaluated with the same descriptive terms, differing by no more than half a grade.

It is worth noting that there is full agreement regarding the ordering of the scenarios by threat level. Both the model results and the experts' evaluations order the scenarios from the most to the least dangerous as follows:

(S3, S1a, S2, S3a, S1). (17)
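The "overall score" row of Table 9 can be reproduced with a simple sketch: map each linguistic term to its numeric position on the output scale (very high = 1 through very low = 5, intermediate ratings halfway between), average per scenario, and sort. The exact term positions are an assumption here; small decimal differences from the published row (e.g., for S2 and S3) presumably reflect the precise values the authors used, but the resulting ordering matches (17).

```python
# Numeric positions of the linguistic terms on the threat scale (after Figure 7):
# 1 = very high threat ... 5 = very low threat; a rating "A/B" is read as halfway.
SCALE = {"very high": 1, "high": 2, "average": 3, "low": 4, "very low": 5}

def score(term):
    parts = [SCALE[p.strip().lower()] for p in term.split("/")]
    return sum(parts) / len(parts)

# Expert ratings per scenario, transcribed from Table 9.
RATINGS = {
    "S1":  ["Low", "Average", "Average", "Low/Average"],
    "S2":  ["High", "Low", "Low", "High/Very high"],
    "S3":  ["High", "High", "High/Very high", "Very high"],
    "S1a": ["High", "High", "High", "High/Very high"],
    "S3a": ["Average", "Low", "Average", "High"],
}

overall = {s: sum(map(score, r)) / len(r) for s, r in RATINGS.items()}
ordering = sorted(overall, key=overall.get)   # most to least dangerous
print({s: round(v, 2) for s, v in overall.items()})
print(ordering)
```

Under this mapping the ordering comes out as S3, S1a, S2, S3a, S1, in full agreement with (17), and the averages for S1, S1a, and S3a match the published overall scores to one decimal place.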

6. Summary and Conclusions

Traffic situation visualization modules are essential elements of air traffic control systems. They constitute the basis on which air traffic controllers build situational awareness. At the same time, they are where all hardware failures and software errors become apparent, and despite the use of technology with a very high level of reliability, such faults can still happen in practice.

Regardless of the origin of system malfunctions, they can result in several typical situations that have been categorized in Section 2.4. The essence of this paper has been to analyze the level of threat to air traffic resulting from errors of each category, taking into account factors such as the controller's experience. Given the importance of the human factor and the subjective nature of the relationships between the determinants affecting the threat assessment, fuzzy set theory, namely, fuzzy reasoning systems, has been used to achieve the objective of the paper. A model has been created and implemented in the Scilab environment. These solutions are based on expert knowledge obtained from air traffic controllers. Since no objective criteria exist, the only practical way to validate the obtained results, and thus the proposed method, is to obtain independent opinions from domain experts, which has been done. The results of the simulation experiments have been discussed with several radar controllers and were rated as corresponding well to the real safety threat assessment in the given situations.

Experiments have shown that one of the most important factors influencing the threat assessment is the amount of time during which a controller does not have full knowledge of the traffic situation. This time depends, among other things, on the awareness that one is dealing with an abnormal TSVS image, which in turn depends on the type of error. The results of the experiments carried out with the created computer tool confirm these observations and, in addition, allow for a quantitative assessment. It is worth noting that the results indicate a crucial role of diagnostic modules built into ATC systems. Waiting for the controller to notice an error in the TSVS and take corrective action can significantly increase the time spent without complete knowledge of the traffic situation. A general recommendation is therefore to extend and further develop such systems, along with alerting functions that warn the controller that he/she may be dealing with an incorrect traffic situation image and that notify technicians of the failure. The former can then immediately use other available sources to restore and maintain situational awareness, while the latter can start looking for the cause of the failure and restoring proper system operation.

Self-diagnostic features can be even more important than the redundancy usually used to increase system reliability: with redundancy, the same error can be duplicated on all backup devices, especially when the backup devices are similar to the main ones, whereas self-diagnostic systems can restore system performance even without the controller being aware of the malfunction. Moreover, controllers should be made aware of possible TSVS errors during their training, and simulations of such events could be included in their recurrent training. Further formal and quantitative analysis of possible ways to prevent TSVS errors and to counteract their consequences will be the subject of future studies, especially regarding those errors that influence the safety of air transport the most.

We have analyzed the threat level that different categories of errors in the TSVS may represent. The study covers both the technical aspect and the threat associated with the air traffic controller using these systems. In future research, we plan to relate these results to a specific traffic situation, especially to the volume and complexity of the traffic. This will allow a more detailed quantitative assessment of the impact of TSVS errors on air traffic safety. We also plan to extend the research by applying rough set theory to formally determine the factors that have the greatest impact on the threat assessment.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.


References

[1] J. Rasmussen, Information Processing and Human-Machine Interaction: An Approach to Cognitive Engineering, Elsevier Science Inc., New York, NY, USA, 1986.

[2] M. R. Endsley and M. W. Smolensky, "Situation Awareness in Air Traffic Control," in Human Factors in Air Traffic Control, Academic Press, San Diego, 1998.

[3] D. K. Y. Wong, D. E. Pitfield, R. E. Caves, and A. J. Appleyard, "Quantifying and characterising aviation accident risk factors," Journal of Air Transport Management, vol. 12, no. 6, pp. 352-357, 2006.

[4] H. Remawi, P. Bates, and I. Dix, "The relationship between the implementation of a Safety Management System and the attitudes of employees towards unsafe acts in aviation," Safety Science, vol. 49, no. 5, pp. 625-632, 2011.

[5] H.-J. Shyur, "A quantitative model for aviation safety risk assessment," Computers & Industrial Engineering, vol. 54, no. 1, pp. 34-44, 2008.

[6] N. Leveson, "A new accident model for engineering safer systems," Safety Science, vol. 42, no. 4, pp. 237-270, 2004.

[7] M. Rasmussen, M. I. Standal, and K. Laumann, "Task complexity as a performance shaping factor: A review and recommendations in Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H) adaption," Safety Science, vol. 76, pp. 228-238, 2015.

[8] W.-K. Lee, "Risk assessment modeling in aviation safety management," Journal of Air Transport Management, vol. 12, no. 5, pp. 267-273, 2006.

[9] B. S. Ali, W. Y. Ochieng, W. Schuster, A. Majumdar, and T. K. Chiew, "A safety assessment framework for the Automatic Dependent Surveillance Broadcast (ADS-B) system," Safety Science, vol. 78, pp. 91-100, 2015.

[10] L. Flavio Vismari and J. B. Camargo Junior, "A safety assessment methodology applied to CNS/ATM-based air traffic control system," Reliability Engineering & System Safety, vol. 96, no. 7, pp. 727-738, 2011.

[11] Y. Tian, L. Wan, C.-H. Chen, and Y. Yang, "Safety assessment method of performance-based navigation airspace planning," Journal of Traffic and Transportation Engineering (English Edition), vol. 2, no. 5, pp. 338-345, 2015.

[12] J. Skorupski, "The risk of an air accident as a result of a serious incident of the hybrid type," Reliability Engineering & System Safety, vol. 140, article 5279, pp. 37-52, 2015.

[13] J. Skorupski, "Automatic verification of a knowledge base by using a multi-criteria group evaluation with application to security screening at an airport," Knowledge-Based Systems, vol. 85, pp. 170-180, 2015.

[14] J. Skorupski, "The simulation-fuzzy method of assessing the risk of air traffic accidents using the fuzzy risk matrix," Safety Science, vol. 88, pp. 76-87, 2016.

[15] A. Pasquini and S. Pozzi, "Evaluation of air traffic management procedures--Safety assessment in an experimental environment," Reliability Engineering & System Safety, vol. 89, no. 1, pp. 105-117, 2005.

[16] R. Patriarca, G. Di Gravio, and F. Costantino, "A Monte Carlo evolution of the Functional Resonance Analysis Method (FRAM) to assess performance variability in complex systems," Safety Science, vol. 91, pp. 49-60, 2017.

[17] S. Stroeve, B. Van Doorn, B. Bakker, and P. Som, "A risk-based framework for assessment of runway incursion events," in Proceedings of the 11th USA/Europe Air Traffic Management Research and Development Seminar, ATM '15, pp. 1-11, Portugal, June 2015.

[18] F. Netjasov and M. Janic, "A review of research on risk and safety modelling in civil aviation," Journal of Air Transport Management, vol. 14, no. 4, pp. 213-220, 2008.

[19] C. Mertz, S. Chatty, and J. Vinot, "Pushing the limits of ATC user interface design beyond S&M interaction: the DigiStrips experience," in Proceedings of the 3rd USA/Europe Air Traffic Management R&D Seminar, Napoli, vol. 19, pp. 1-9, 2000.

[20] S. Bagassi, F. De Crescenzio, and F. Persiani, "Design and evaluation of a four-dimensional interface for air traffic control," Proceedings of the Institution of Mechanical Engineers, Part G: Journal of Aerospace Engineering, vol. 224, no. 8, pp. 937-947, 2010.

[21] S. Rozzi, W. Wong, P. Woodward et al., "Developing visualizations to support spatial-temporal reasoning in ATC," in Proceedings of the International Conference for Research in Air Transportation, (ICRAT '06), pp. 1-10, Belgrade, 2006.

[22] T. J. Callantine, T. Prevot, N. Bienert et al., "Human-in-the-loop investigation of interoperability between terminal sequencing and spacing, automated terminal proximity alert, and wake-separation recategorization," in Proceedings of the 16th AIAA Aviation Technology, Integration, and Operations Conference, Virginia: American Institute of Aeronautics and Astronautics, Reston, USA, June 2016.

[23] J. Rohacs, D. Rohacs, and I. Jankovics, "Conceptual development of an advanced air traffic controller workstation based on objective workload monitoring and augmented reality," Proceedings of the Institution of Mechanical Engineers, Part G: Journal of Aerospace Engineering, vol. 230, no. 9, pp. 1747-1761, 2016.

[24] C. Moehlenbrink and A. Papenfuss, "ATC-monitoring when one controller operates two airports: Research for remote tower centres," in Proceedings of the 55th Human Factors and Ergonomics Society Annual Meeting, HFES '11, pp. 76-80, USA, September 2011.

[25] S. Inoue, K. Furuta, K. Nakata, T. Kanno, H. Aoyama, and M. Brown, "Cognitive process modelling of controllers in en route air traffic control," Ergonomics, vol. 55, no. 4, pp. 450-464, 2012.

[26] U. Ahlstrom, "Work domain analysis for air traffic controller weather displays," Journal of Safety Research, vol. 36, no. 2, pp. 159-169, 2005.

[27] L. Giraudet, J.-P. Imbert, M. Berenger, S. Tremblay, and M. Causse, "The neuroergonomic evaluation of human machine interface design in air traffic control using behavioral and EEG/ERP measures," Behavioural Brain Research, vol. 294, pp. 246-253, 2015.

[28] E. Kesseler and E. G. Knapen, "Interactions: advanced controller displays, an ATM essential," in Proceedings of the 3rd USA/Europe Air Traffic Management R&D Seminar, Napoli, pp. 1-15, 2000.

[29] M. Hadjimichael, "A fuzzy expert system for aviation risk assessment," Expert Systems with Applications, vol. 36, no. 3, pp. 6512-6519, 2009.

[30] D. Teodorovic and P. Lucic, "A fuzzy set theory approach to the aircrew rostering problem," Fuzzy Sets and Systems, vol. 95, no. 3, pp. 261-271, 1998.

[31] M. Lower, J. Magott, and J. Skorupski, "Analysis of Air Traffic Incidents using event trees with fuzzy probabilities," Fuzzy Sets and Systems, vol. 293, pp. 50-79, 2016.

[32] X. Wanyan, D. Zhuang, H. Wei, and J. Song, "Pilot attention allocation model based on fuzzy theory," Computers & Mathematics with Applications, vol. 62, no. 7, pp. 2727-2735, 2011.

[33] X. Lu and S. Huang, "Airport safety risk evaluation based on modification of quantitative safety management model," in Proceedings of the 2012 International Symposium on Safety Science and Engineering in China, ISSSE 2012, pp. 238-244, China, November 2012.

[34] J. Skorupski and P. Uchronski, "A fuzzy model for evaluating airport security screeners' work," Journal of Air Transport Management, vol. 48, pp. 42-51, 2015.

[35] J. Skorupski and P. Uchronski, "A fuzzy system to support the configuration of baggage screening devices at an airport," Expert Systems with Applications, vol. 44, pp. 114-125, 2016.

[36] O. Babic and T. Krstic, "Airspace daily operational sectorization by fuzzy logic," Fuzzy Sets and Systems, vol. 116, no. 1, pp. 49-64, 2000.

[37] F. Netjasov, "Fuzzy expert model for determination of runway in use case study: Airport Zurich," in Proceedings of the 1st International Conference on Research in Air Transportation ICRAT, pp. 59-64, Zilina, Slovakia, 2004.

[38] A. Doostparast Torshizi and J. Parvizian, "A hybrid approach to failure analysis using stochastic Petri Nets and ranking generalized fuzzy numbers," Advances in Fuzzy Systems, Article ID 957697, 2012.

[39] M. Bertolini, "Assessment of human reliability factors: a fuzzy cognitive maps approach," International Journal of Industrial Ergonomics, vol. 37, no. 5, pp. 405-413, 2007.

[40] ICAO International Civil Aviation Organization, Doc 4444 Procedures for Air Navigation Services-Air Traffic Management, PANS-ATM, Montreal, Canada, 16th edition, 2016.

[41] D. Dubois and H. Prade, "On the relevance of non-standard theories of uncertainty in modeling and pooling expert opinions," Reliability Engineering & System Safety, vol. 36, no. 2, pp. 95-107, 1992.

[42] S. Greco, B. Matarazzo, and R. Slowinski, "Rough sets theory for multicriteria decision analysis," European Journal of Operational Research, vol. 129, no. 1, pp. 1-47, 2001.

[43] L. A. Zadeh, "Fuzzy sets," Information and Control, vol. 8, no. 3, pp. 338-353, 1965.

[44] L. A. Zadeh, "Outline of a New Approach to the Analysis of Complex Systems and Decision Processes," IEEE Transactions on Systems, Man, and Cybernetics, vol. 3, no. 1, pp. 28-44, 1973.

[45] J. Kacprzyk, Fuzzy Sets in System Analysis, Panstwowe Wydawnictwo Naukowe, Warsaw, 1986.

Pawel Ferdula (1,2) and Jacek Skorupski (1)

(1) Faculty of Transport, Warsaw University of Technology, Warsaw, Poland

(2) Polish Air Navigation Services Agency, Warsaw, Poland

Correspondence should be addressed to Jacek Skorupski;

Received 29 May 2017; Revised 24 November 2017; Accepted 26 December 2017; Published 23 January 2018

Academic Editor: Juan C. Cano

Caption: Figure 1: General scheme of the threat level fuzzy model.

Caption: Figure 2: Membership functions of values of loss of situational awareness linguistic variable.

Caption: Figure 3: Membership functions of values of time without knowledge of error linguistic variable.

Caption: Figure 4: Membership functions of values of time of situational awareness loss linguistic variable.

Caption: Figure 5: Membership functions of values of experience linguistic variable.

Caption: Figure 6: Membership functions of values of age linguistic variable.

Caption: Figure 7: Membership functions of values of threat level linguistic variable.
Table 1: Fuzzy inference rules for the local model human factor.

Rule     Experience    Age           Training   Psychophysical   Human factor
number   ([x.sub.e])   ([x.sub.a])              condition        ([y.sub.h])
                                                ([x.sub.p])

8        Average       Old           Poor       Poor             Very low
27       High          Old           Good       Poor             Low
49       Low           Middle        Good       Average          Average
59       Average       Middle        Poor       Good             Good
78       High          Middle        Good       Good             Very good

Table 2: Fuzzy inference rules for the local model threat level.

Rule     Loss of       Time without     Time of          Quality of   Human factor   Threat level
number   situational   knowledge of     situational      remedies     ([y.sub.h])    ([z.sub.t])
         awareness     error            awareness loss
                       ([x.sub.wk])     ([x.sub.lsa])

1        Total         Any              Very short       None         Any            Very high
9        Large         Any              Short            None         Any            High
27       Significant   Very short       Average          None         Very high      Average
30       Significant   Average          Average          Average      Average        High
40       Small         Any              Average          Any          Very high      Low
45       Slight        Any              Very short       Any          Any            Very low

Table 3: The results of human factor fuzzy reasoning system for a
typical controller.

Parameter                   Value    Human factor

Experience (years)            4
Age (years)                  40          3.0
Training                   Average
Psychophysical condition   Average

Table 4: Results of the experiment for Scenario S1.

Parameter                                      Value   Threat level

Human factor                                    3.0
Loss of situational awareness (indicator d)     2.7
Time without knowledge of error (s)             60         3.8
Time of situational awareness loss (s)          60
Quality of remedies                            None

Table 5: Results of the experiment for Scenario S2.

Parameter                                      Value   Threat level

Human factor                                    3.0
Loss of situational awareness (indicator d)    27.2
Time without knowledge of error (s)              0         2.3
Time of situational awareness loss (s)          60
Quality of remedies                            None

Table 6: Results of the experiment for Scenario S3.

Parameter                                      Value   Threat level

Human factor                                    3.0
Loss of situational awareness (indicator d)    54.6
Time without knowledge of error (s)             60         1.0
Time of situational awareness loss (s)          120
Quality of remedies                            None

Table 7: Results of the experiment in Scenario S1a.

Parameter                                    Value   Threat level

Human factor                                  3.0
Loss of situational awareness                 8.5
Time without knowledge of error (s)           90         2.0
Time of loss of situational awareness (s)     120
Quality of remedies                          None

Table 8: Results of the experiment in Scenario S3a.

Parameter                                    Value   Threat level

Human factor                                  3.0
Loss of situational awareness                 74
Time without knowledge of error (s)            0         2.5
Time of loss of situational awareness (s)     60
Quality of remedies                          None

Table 9: Comparison of the results from the model and the experts.

Scenario        S1                  S2                   S3                     S1a              S3a

Expert 1        Low                 High                 High                   High             Average
Expert 2        Average             Low                  High                   High             Low
Expert 3        Average             Low                  High/Very high         High             Average
Expert 4        Low/Average         High/Very high       Very high              High/Very high   High
Overall score   3.4 (Low/Average)   2.8 (Average/High)   1.5 (High/Very high)   1.9 (>High)      3.0 (Average)
Model           3.8 (Low/Average)   2.3 (Average/High)   1.0 (Very high)        2.0 (High)       2.5 (Average/High)
COPYRIGHT 2018 Hindawi Limited

Title Annotation: Research Article
Author: Ferdula, Pawel; Skorupski, Jacek
Publication: Journal of Advanced Transportation
Date: Jan 1, 2018