A tale of two allied defence departments: new assurance initiatives for managing increasing system complexity, interconnectedness and vulnerability.

1. Introduction

This paper reports on the challenges for acquisition in modern complex systems-of-systems and families-of-systems and how assurance initiatives can be used to help ensure such capabilities remain sufficiently integrated, interoperable and information-assured (I3). The limitations in current assurance activities are drawn mainly from the Australian Department of Defence (DoD), while the comparative I3 assurance initiatives are drawn from the U.S. DoD. However, the technological and societal challenges apply to differing extents in many industries and as such, the means to systematically assure such acquisition should resonate more broadly.

The Australian DoD has successfully validated a number of challenging Australian complex platforms and systems, such as the E-7 Wedgetail aircraft, Jindalee Over-the-Horizon radar (JORN) upgrades and the Land 200 Army Battle-Management System. However, there are now significant complex adaptive forces acting on multiple domains in much of modern project management and systems engineering, as illustrated in Figure 1 from the work of Keating, Katina, and Bradley (2015) and Keating, Katina, Bradley, Jaradat, and Gheorghe (2017) on complex systems governance.

The possible factors shifting projects towards more complex adaptive behaviours are too numerous to cover here, hence the summary in Figure 1; however, four main trends in DoD acquisitions seriously challenge DoD acquisition practice to find fundamentally better I3 assurance:

* First, the systems are becoming so synthesised or fused, complex and interdependent that they can, even without taking human agency into account, have emergent properties or exhibit behaviours that vary in ways not easily predicted. Moreover, the number of permutations of modern software-intensive systems makes classical rigorous testing of them all but impractical (Cofer, 2015), such that there has to be a reliance on some modelling (Hecht, 2015) and continuous through-life monitoring (Normann, 2015), both of which challenge mission- and safety-critical assurance (Tutty, 2016).

* Second, as software-intensive systems enable higher order human-like functions (i.e. strategies and decision-making, not simply control), specifying what the system must do becomes harder and it is more crucial to include representative human agency and decision-making to adapt the systems during development.

* Third, the threat to weapon systems has adapted as a result of the push for information exploitation and dominance, so as to exploit the broader cyber-attack surfaces of such inter-connected systems, not just with malicious attacks but as part of multi-layered hybrid (1) or hyper (2) warfare--in short the threat is more complex and probably adaptive.

* Fourth, there is a requirements stasis during development and build of large complex systems, largely through an emphasis on achieving contracted project cost and schedule, which unfortunately extends to their processing and software. Such a requirements stasis soon creates an alternative reality that is too far out of alignment with the contemporary family-of-systems and strategic/operational reality into which that complex new system must go into service - more so today because of the three previous change factors.
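
The combinatorial claim in the first trend can be made concrete with a back-of-the-envelope sketch. The subsystem names and mode counts below are purely hypothetical, not drawn from any real weapon system:

```python
# Hypothetical illustration of configuration explosion in a
# software-intensive system: each subsystem exposes a few independent
# modes, so the distinct whole-of-system configurations multiply.
subsystem_modes = {
    "radar": 6,             # hypothetical operating modes
    "datalink": 4,          # hypothetical waveform/network settings
    "mission_computer": 8,  # hypothetical software configurations
    "ew_suite": 5,          # hypothetical jamming/listening modes
    "weapons": 7,           # hypothetical loadout states
}

total_configs = 1
for modes in subsystem_modes.values():
    total_configs *= modes

print(f"Distinct system configurations: {total_configs}")  # 6720
# Even one test event per configuration is beyond any realistic test
# programme, before interactions with other platforms in a
# system-of-systems are even considered -- hence the reliance on
# modelling and through-life monitoring noted in the first trend.
```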

Each of these four challenges will be examined in much more detail before outlining the initiatives the U.S. DoD has applied to meet these challenges and then comparing these U.S. initiatives with contemporary assurance in the Australian DoD so as to derive useful recommendations. While such recommendations are focused on the Australian DoD, they can be adapted for broader industry and government use in Australia and other countries facing similar challenges.

Before outlining I3 assurance initiatives, some key background and definitions are necessary.

2. Background - C4ISR and interoperability

Arguably the most significant technological revolution in warfare will continue to be in the information domain, (3) and in particular, in the degree of situational awareness made possible by the increasing number of interconnected communications and information systems supporting combat forces and their supporting infrastructure, both deployed and fixed (Ryan & Frater, 2007). Defence forces initially focused on specifying the high-level functions that this interconnectivity enabled, first command and control (C2, ca. 1970s), then adding communications and computers (C4, ca. 1980s) and finally adding intelligence, surveillance and reconnaissance (C4ISR, ca. 1990s). A useful elaboration of the impact of information technology is the concept of network-centric warfare, which was defined as
an information superiority-enabled concept of operations that generates
increased combat power by networking sensors, decision makers and
shooters to achieve shared situational awareness, increased speed of
command, higher tempo of operations, greater lethality, increased
survivability, and a degree of self-synchronization. (Alberts, Garstka,
& Stein, 2000). (4)

In modern warfare, therefore, the network is a considerable force multiplier. Consequently, C4ISR systems must be ubiquitous across the battlespace and must be fluid, flexible, robust, redundant, and real-time; have integrity and security; have access and capacity; and be joint- and coalition-capable. Early views of the architecture of such systems were provided in the late 1990s through the C4ISR Architecture Framework Version 2.0 (1997), which also recognised that the key to the desired properties of C4ISR systems was interoperability. The DoD C4ISR Architecture Working Group published the Levels of Information Systems Interoperability (LISI) framework, specifying how to measure the extent to which systems are interoperable. There are four LISI levels above isolated: connected, functional, domain, and enterprise as shown at Figure 2 (LISI, 1998). Another framework for interoperability was developed by the allied Combined Communications Electronic Board (CCEB) through an allied communications plan, with standards that defined the interoperability levels of: basic document exchange; full document exchange; network connection; basic intranet connection; web connection; organisational messaging; directory services; secure database access/exchange; and distributed applications (CCEB, 1999).
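
The LISI scale described above can be sketched as an ordered enumeration. This is a minimal illustration of the level ordering only, not an implementation of the LISI assessment method; the threshold helper reflects this paper's treatment of the Domain level or higher as integration 'by design':

```python
from enum import IntEnum

class LISILevel(IntEnum):
    """Levels of Information Systems Interoperability (LISI, 1998).

    IntEnum gives the levels their natural ordering: a higher value
    means a greater degree of interoperability.
    """
    ISOLATED = 0    # manual data re-entry; no electronic connection
    CONNECTED = 1   # peer-to-peer electronic exchange
    FUNCTIONAL = 2  # distributed exchange of heterogeneous data
    DOMAIN = 3      # shared data and applications within a domain
    ENTERPRISE = 4  # universally shared data and applications

def integrated_by_design(level: LISILevel) -> bool:
    """Illustrative threshold: integration 'by design' is taken here
    to correspond to the Domain LISI level or higher."""
    return level >= LISILevel.DOMAIN

print(integrated_by_design(LISILevel.FUNCTIONAL))  # False
print(integrated_by_design(LISILevel.ENTERPRISE))  # True
```

The ordering matters because assurance arguments compare achieved levels against a target, e.g. an operational workaround achieving only Connected-level cooperation falls below the Domain threshold.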

The issue of interoperability (5) is generally considered a technical one. Regardless of the ability to interoperate and any commercial or intellectual property issues, the major issue for allied defence forces is sovereign control; that is, whether information and databases can be accessed by other command systems is normally subordinate to national security issues. Therefore, the desired degree of interoperability must be described and justified with respect to: personnel involvement required; access and security aspects; and operational, procedural and technical aspects. (6) Additionally, coalition operations call for greater network interoperability and virtualisation, in which each nation needs virtualised controls for its portion to share information and to adjust access to that information (Ackerman, 2017).

3. Defining complexity and I3

According to Javorsek (2016, p. 327), 'In complexity, the interdependent behaviour of connected, heterogeneous, and adaptive agents frequently results in emergent macroscopic features not predictable from fundamental laws'. As this paper progresses, it is important to remember that the agents referred to in this definition are both human agents or operators and the hardware and software systems, often merged at multiple levels and on systems-of-systems working cooperatively. Such systems-of-systems are often simply referred to as complex adaptive systems and are more prevalent as software-intensive systems take on, or support, higher-order human heuristics.

In the context of the assurance of Defence systems, and as used colloquially in that practice, 'integration' refers to the characteristic of multiple systems, including their human interfaces and operators, being compatible and working synergistically 'by design' to achieve combined, desired, and emergent properties and effects. Formally, this would be at the LISI levels of Domain or higher (Figure 2). Therefore, operational workarounds, introduced after design to achieve control at the Functional or Connected LISI levels, should not necessarily be classified as integration, but rather as the lesser standard of LISI interoperability. Interoperability might therefore be described as 'capable of working together', whether by deliberate standard physical design features such as lugs or rails, or today by information protocols at the Domain or Enterprise level. Such assurance of interfaces means that with testing, operational workaround(s), and a certain degree of fortune, systems can be interchanged or cooperate for an effect.

To emphasise the distinction between integration and interoperability, consider the interfaces in the 1990s between a U.K. missile and a U.S. aircraft, designed both mechanically and electronically on different continents by two culturally very different companies to interoperate and work together. Yet, when the aircraft and missile came together in 2000, the software on both the missile and the aircraft had to go through three upgrades before they were accepted as integrated. The first software updates resulted from joint laboratory tests, the second from installed tests on a real aircraft and missile on the ground, while the final updates followed in-air flight test with aircrew. Interestingly, several emergent properties that the aircraft engineers wanted to 'excise out' were wanted by the aircrew as desired 'enhancing features', which the aircraft systems engineers then needed to ensure remained. Furthermore, the experience was repeated with a U.S. missile and a European helicopter later that decade. One common resort for interoperating Defence systems that were not integrated 'by design' has been to have an operator transfer information between systems that are 'air-gapped', or to seek command approval before 'arming' the system(s); however, the speed of many supersonic and hypersonic threats emerging today (Kemburi, 2016) will not allow such human-in-the-loop interoperability in the (real) time available.

When linking the terms integration and interoperability together in the field of Defence assurance, it means that two correlated aims are working together. First, there is a deliberate attempt to seek to integrate by design with as many of the known systems and system-of-systems that should cooperate as possible for training and in operations. Second, there is an aim to have an interoperability assurance level below that integration ideal, where there are interface design features and protocols that will maximise the chances of, as yet, unknown systems cooperating; that is, where these systems were not foreseen during design. This two-level overlapping approach in Defence assurance is necessary given the many bi-lateral and multilateral alliances and especially the unexpected coalitions necessary in modern hybrid or hyper warfare.

As an example of interoperability pressures, in the coalition of forces remaining in Iraq in 2008, established alliances would have meant that the U.S. and U.K. forces, as well as the few Australian forces, could have expected to be 'fully interoperable'; however, a comparatively large contingent of Romanians remained until the United Nations mandate expired on 30 December 2008. The increasing use of hybrid warfare only multiplies the number of unexpected military coalitions that inevitably need some degree of interoperability while maintaining national sovereignty via security protocols. When compounded with system integrators, intellectual property issues and national security, such coalition possibilities make multi-level security protocol provisions at the Enterprise-level system-of-systems vital.

It seems almost axiomatic this far into the Information Age to explain that information forms an indivisible part of integration and interoperability, yet it is necessary to dwell on it now because of the rising use of cyber-attacks in hybrid and hyper warfare. Information assurance now has two distinct sides. The first side is the pursuit of information dominance and exploitation by militaries, particularly the West, for targeting in network-centric warfare, where such militaries seek to obtain a distinct advantage by harvesting and synthesising more information through intelligence, surveillance and reconnaissance than the adversary can obtain (Jordan et al., 2016, p. 243). This side, if you like, is the known 'own information goals'. The second side is defending that information domain dependency from exploitation by an adversary through malicious cyber-attack. This side, if you like, is the unknown 'information vulnerabilities'. Whereas the first side of information dominance is costly for a military to establish confidently (and in a timely manner), the cost of entry for the second side of cyber-attack is comparatively low, which makes an offensive cyber program 'affordable for most states' (Heinl, 2016, p. 126). So in defining information in the context of Defence assurance, it is the sharing of information across systems, including embedded information, either by design intent or non-cooperative malicious access (i.e. cyber warfare).

The overall challenge is to align integration, interoperability and information assurance measures efficiently so the known(s) and unknown(s) are reasonably met and balanced for all systems that already exist, are being co-developed or that we are yet to contemplate. The later exposition of U.S. DoD initiatives shows, perhaps unsurprisingly, that this involves flexible and frequent on-going experimentation, including of adversaries' capabilities, using advanced test and experimentation networks.

4. Challenge 1 - growing system synthesis

The increasing complexity of systems and their synthesis is well documented in the literature, particularly where software functionality is taken to extremes not even contemplated a decade ago. The F-35 aircraft development is often cited as the most software-intensive aircraft or weapon system ever built, where the aircraft is intended to link far more seamlessly with other F-35 aircraft and ground processing centres at the Enterprise level. This aircraft development has, for at least the last five years, defied attempts to contain and then decrease the software deficiencies being found, despite heavy curtailment of any new capability requirements (i.e. Block 4) and several roll-overs to repeat software builds (i.e. Block 3h and 3i) (U.S. DoD, DOT&E, 2015, 2016, 2017). Either the prime systems designer, Lockheed Martin, has software developments that are fundamentally not under control, or, more likely from such a prime, the system is exhibiting emergent properties that will require continuous monitoring through life, like that outlined by Normann (2015). According to Cofer (2015, p. 314) a pragmatic approach is necessary where 'measurement of the reliability of software systems through testing alone is a practical impossibility' and therefore 'the objectives of software testing are to demonstrate a sampled compliance with requirements and to detect and eliminate as many software design errors as possible'. This validation challenge has led to new methods for validating U.S. DoD major automated information systems (Conley & Lenig-Schreffler, 2016) which would have applicability for Australian DoD projects like the network management system for Australia's satellite communication (Joint Project 2008 Phase 5B) and the 'One-Sky' air traffic renewal. (7)

The complexity of such systems is not confined to any one project, as it might have been to some extent once. Each new capability development, such as the U.S. and now Australian MQ-4C Triton aircraft, will critically link into the aforementioned satellite communications and air traffic infrastructures in order to fly on long-range missions from places as far apart as Perth, Christmas Island and Darwin, continuously using remote pilots and receiving and sending C4ISR data to Defence establishments in Adelaide, Bungendore and Williamtown. According to Cofer (2015, p. 313) in the U.S. such unmanned aircraft incorporate 'advanced control algorithms that will provide enhanced safety, autonomy and high-level decision-making functions normally performed by human pilots' and as a consequence of this complexity 'verification of airborne software has become the single most costly development activity' and 'testing alone cannot establish strict bounds on all the behaviors that may occur during operation of these software-intensive systems'. He argues that new approaches to verification are needed 'based on logic and mathematical analysis... to tame the complexity beast'.

This complexity assurance challenge is not confined to aviation: the new future submarine must deal with generational changes in safety standards since the Collins class, where advisory actions from platform management sensors and algorithms will be safety-critical and the software tested accordingly. Nor are the developments contained neatly within each fighting domain. For example, the Triton aircraft can be re-tasked in direct support of key naval vessels and stream data direct to them. To establish this integration, a Naval integration program office maintains a generic future ship-borne architecture on a test vessel at the U.S. Patuxent River test facility that develops the necessary ship-aircraft interfaces during aircraft testing. These interface and architectural changes then inform program offices for each U.S. ship type so they can progressively roll these out. Hence, when choosing new capabilities like the F-35 JSF and MQ-4C Triton aircraft, Australian legacy systems like the E-7 Wedgetail aircraft and the Hobart-class Air Warfare Destroyers require fundamental upgrades to synthesise, well beyond standard data-links, and that degree of interconnectedness is no longer a matter of discrete 'opt-in' or 'opt-out' upgrades but of continuous incremental developments across a family-of-systems. Importantly, this paper will show how the U.S. DoD captures and responds more continuously to change in any one capability area so as to influence change in the interconnected others.

5. Challenge 2 - higher order human functions

Software-intensive systems are enabling higher order human-like functions, such as strategies and decision-making, not simply control. The risk and validation complexity of software functions needs to be tailored to the extent software is used to heuristically take decisions that are skill-based, rule-based, or knowledge-based, and then to whether that decision-making is adaptive and continuous (Wickens, Lee, Liu, & Becker, 2014, Chapter 6). Where systems cooperate to achieve human agency, that is to say, several systems taking adaptive knowledge-based decisions while working in concert, then the possibility of emergent behaviour occurs in such systems-of-systems, to no less an extent than it would in a team of humans. A significant factor in the difficulty in developing software-intensive systems is that most such projects employ only software developers and not human-factors engineers. Whether a software-intensive system will replace human agency and decision-making or, more often than not, augment human agency, there is a need to iteratively do usability testing involving the humans being replaced or augmented against test metrics for effectiveness, efficiency and user satisfaction (Wickens et al., 2014, pp. 386-397). Usability tests are fundamental to identify specific problems with software design and usually only need five-to-six participants and three-to-five evaluation-design cycles. A lesson from Australian DoD major command-support environment projects like Joint Project 2030 (Phase 8) and Project Land 200 (Tranche 1) was that failure to do such usability testing makes the operational test and evaluation (T&E) and transitional work much harder, resulting in unwanted applications, cultural mismatch, missing critical features, refusal-to-use and inefficient processes. (8) From this lesson, the Australian Army included three usability test-evaluation-design cycles in the second tranche of Project Land 200 so as to better address efficiency, effectiveness and user satisfaction of the digital battle-management system.

In addition to remembering crucial usability tests in software development, a more recent move is to better account for human agency in modelling and verifying complex adaptive systems. According to Javorsek (2016, pp. 327-328), 'uncertainty and probability distribution functions play a key role in understanding any complex adaptive system to include an air vehicle with the aircrew and test team included' and a paradigm shift is necessary in flight test away 'from the predominantly deterministic view that dominated earlier epochs'. Javorsek (2016) goes on to outline how human agency in complex adaptive systems can be better modelled with heavy-tailed distributions (i.e. non-Gaussian) where 'extremes are not exponentially bounded' such that 'In flight test, "improbable" events based on a Gaussian may actually be "expected" if the variables of concern are actually Heavy-tailed'. The broader application of such work in helping to anticipate emergent behaviour in complex adaptive systems is perhaps best made by Javorsek's (2016, p. 329) extrapolation to project management. Here, he argues 'cost-benefit decisions are routinely performed based on the assumption of risk that follows from a method heavily dependent on the anticipated (or estimated) probabilities'. Therefore, for remote risks 'a fiscally constrained program manager might elect to permit or encourage an "optimized" design that is less robust to that factor' such that the program can 'unknowingly create an aircraft less resilient to human agency'.
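
Javorsek's contrast between Gaussian and heavy-tailed expectations can be made concrete with a toy calculation using only the standard library. The Pareto parameters below are illustrative assumptions, not taken from his analysis:

```python
import math

def gaussian_tail(k: float) -> float:
    """P(X > k) for a standard normal variable, via the complementary
    error function."""
    return 0.5 * math.erfc(k / math.sqrt(2))

def pareto_tail(k: float, alpha: float = 2.0, x_min: float = 1.0) -> float:
    """P(X > k) for a Pareto(alpha) variable -- a simple heavy-tailed
    model whose extremes are not exponentially bounded."""
    return (x_min / k) ** alpha if k > x_min else 1.0

k = 5.0  # a '5-sigma'-style excursion
g = gaussian_tail(k)
p = pareto_tail(k)
print(f"Gaussian tail beyond {k}: {g:.2e}")  # ~2.87e-07
print(f"Pareto tail beyond {k}: {p:.2e}")    # 4.00e-02
print(f"The heavy-tailed model makes the 'improbable' event "
      f"~{p / g:,.0f} times more likely")
```

Under the Gaussian model the excursion is a once-in-millions event; under even a modest heavy-tailed model it occurs a few times in a hundred trials, which is the essence of Javorsek's warning for risk estimation and for program managers 'optimising' against remote risks.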

The Australian DoD has been found to often inadequately and inconsistently describe risk and underestimate the complexity of systems (Australian Senate, 2012, pp. 34-35; Defence Materiel Organisation and Australian National Audit Office, 2014, p. 65). As these systems become more complex adaptive in nature, earlier preview T&E is crucial (Joiner, 2015a), but so too is a better understanding of probabilistic modelling to account for adaptive agents, as reflected in the U.S. DoD mandating such competencies for all its major T&E staffs since 2009 (Ahner, 2016; Chu, 2016). The Australian DoD has so far only experimented with such probabilistic competence and methods to help with complex adaptive systems (Joiner, Kiemele, & McAuliffe, 2016) and has not yet put to its project or test staffs the expected value analysis techniques necessary to introduce heavy-tailed confidence predictions for better risk estimation. Tutty (2016, 2017) proposes a comprehensive operationally focussed non-linear decision-making framework with a 10-level set of military capability preparedness levels (9) and commensurate operational confidence levels to aid in a decision-making framework tied to the experimentation and T&E activities, which if adopted would provide the objectives for such early risk analysis.

6. Challenge 3 - cyber threat complexity response to information dominance

The Western push for information dominance began in earnest after the first Gulf War, where it was thought to have proved decisive, and the concept has since continued to find reinforcement for the West in the counter-insurgency warfare of Iraq and Afghanistan (Jordan et al., 2016). Under network-centric policies, many of the enablers have been acquired or developed by Western forces to link information sources to analysts and all forces to their fire-support in as 'near-to-real-time' as possible. In the extreme, coalition forces can conduct cooperative engagement, which, when enabled, has systems respond, often automatically, in the defence of other systems. Such cooperative engagement developmental testing was described in U.S. Navy systems as early as 1997 (O'Driscoll & Krill, 1997) and aims to increase survivability and lethality. Such warfare capabilities require significant increases in the necessary integration and interoperability between defence systems, especially for coalition forces and in the face of the automated responses necessary to meet supersonic threats (i.e. ANZAC class anti-ship missile defence testing 2011-2012 (10)) and future hypersonic threats (Kemburi, 2016). The required integration can be such that it forces a narrowing of acquisition options. For example, some have attributed the specification of U.S. combat systems and weapons for Australia's French-designed future submarine (ANAO, 2016) to a need to integrate that submarine's targeting of underwater weapons and land-attack weapons with U.S. Naval assets. Similar integration issues are believed to be a factor in the often-speculated transition of the Australian Armed Reconnaissance helicopter away from the current European source.

Offensive cyber is one of the cheapest and fastest-growing counters to information dominance in hybrid warfare, or even an offensive deterrent in asymmetric warfare, when compared to the higher costs of other weapons, especially for the less-wealthy states in the Asia-Pacific region (Heinl, 2016, pp. 126-127). Heinl (2016) argues that non-state actors, 'cyber criminals, terrorists, hackers, hacktivists, and proxy actors engaged or supported by government, must be considered too' such that 'growing cybercrime in this region could cause further instability because of its connections to espionage and military activities'.

The more decentralised and distributed a network, the bigger the attack surface and the more cyberspace favours the attacker (McAfee Centre for Strategic & International Studies, 2017). The cyber threat to Australia was presciently captured in a journal article by Thompson (2012), who, following a Doctorate in cybersecurity, has now been appointed as the tri-service lead in the Australian DoD for cyber-warfare. The cyber threat has recently been identified officially by the Australian DoD (2016) as 'a direct threat to the ADF's warfighting ability given its reliance on information networks' and that 'State and non-state actors now have ready access to highly capable and technologically advanced tools...'. More recently, the Prime Minister has been reported talking up Australia's offensive cyber capabilities as a deterrent (Pearce, 2016).

Cyber-enabled warfare is no longer confined to ICT systems but can be employed against any software-intensive system, meaning that mobile military systems are not only penetrable and exploitable, but that adversaries can develop system-denial routines, test them in probing operations and then store them for later use. Such work is now entirely analogous to, and partially linked to, electronic warfare. Joiner (2017) has already overviewed U.S. DoD cybersecurity acquisition processes and support infrastructure and made recommendations as to how the Australian DoD can leverage U.S. cybersecurity developments in acquisition policy and T&E. Also, Joiner, Sitnikova, and Tutty (2016) have outlined in more systems-level detail how to leverage some of the U.S. DoD cybersecurity methods to develop more resilient Australian DoD systems, building substantially on the early calls for cybersecurity evaluation by Zhu, Staples, and Nguyen (2014). A recent paper by Fowler, Sweetman, Ravindran, Joiner, and Sitnikova (2017) has examined how acquisition projects in the Australian DoD can leverage policies and experience in system safety and system survivability to deal with cybersecurity requirements and T&E, hopefully generating, along with software usability, common test points in development for contractors and the Australian DoD to align and adjust developing systems, especially their software architectures.

7. Challenge 4 - requirements stasis

Unfortunately, during development and build of large complex systems, mostly through an over-emphasis on project management cost and schedule achievement in contracts, there is a stasis on capability requirements. Put another way, project managers often guard against 'requirements creep', which, if taken to an extreme, can see a developmental capability remain so forever. Perhaps the most public portrayal of this phenomenon is the book and movie The Pentagon Wars, concerning the Bradley Fighting Vehicle (Burton, 1993; Pentagon Wars, 1998). With the utmost respect to that problem, which was as much about independence in informing decisions, the pendulum in the Australian DoD may have swung too far towards 'overly fixed' requirements. Australia's landing helicopter dock (LHD) ships and soon the Air Warfare Destroyer (AWD) ships both had requirements set around 2004 and have had limited scope in the decade since to amend contracts for contemporary requirements like cybersecurity, generally only amending the contracts for contractor difficulties. (11) In the Air Force stable, the F-35 aircraft software has also had capability growth frozen for many years due to ongoing software deficiency resolution (i.e. Block 4 delayed for ongoing Block 3 versions), increasing the risk that when these are 'unfrozen' and cooperative vulnerability and penetration testing is undertaken, cybersecurity threat evolution will mean significant architectural changes that will dominate all available capability growth in the next block upgrade (Australian Senate, 2016; U.S. DoD DOT&E, 2015, 2016, 2017). In the Army, the delivery of a surveillance variant of the light armoured vehicle (ASLAV-S) in 2014 came after a decade of requirements stasis, delivering a vehicle with impressive capabilities but one that was not networked into the digital Army developed over that period of stasis. (12) The surveillance vehicle has workarounds but will also likely need retrofit.
Examples like these are meant to illustrate that projects increasingly need more active requirements assessment and contractual flexibility so that developed capabilities keep abreast of evolving threats and evolving interconnected systems.

Australia's acquisition lifecycle has recently been reworked as a result of the implementation of the First Principles Review (Peever et al., 2015). Joiner has compared the U.S. DoD lifecycle to the new Australian DoD one for aspects like cybersecurity (Joiner, 2017) and for developmental projects like the future submarine (Joiner & Atkinson, 2016). Arguably, the Australian DoD lifecycle has not appreciated the foundational meta-project analysis work by the U.S. General Accounting Office (GAO, 1998) on which the U.S. DoD lifecycle is based, as covered in Orlowski, Blessner, Blackburn, and Olson (2017, pp. 36-37, esp. Fig. 1). In essence, the U.S. GAO found three key knowledge points around which U.S. project plans, T&E and congressional legislation (i.e. Title 10) have since been constructed (see Brown, Christensen, McNeil, & Messerschmidt, 2015; Murphy, Leiby, Glaeser, & Freeman, 2015). The first knowledge point is technology readiness, and this point is broadly covered by the gates in the Australian cycle, albeit a title like the U.S. DoD's 'Technology Maturation and Risk Reduction' might better explain to the public the aim of this project phase. The second and third knowledge points are design maturity and then control of manufacturing processes. Service-level input, T&E and oversight are fundamentally key at these points, and yet in the Australian DoD, the lifecycle commencing at Gate 2 provides the 'one-stop approval' for a project to develop, produce, and accept through to in-service. It should not be left to each contract in the Australian DoD to decide whether, after a period of development, a system design is subject to testing on usability, system safety, survivability and cybersecurity, or again, after manufacturing processes are under control, whether production can begin.
Too many projects have proceeded to full-rate production based on claimed off-the-shelf maturity and have then been found deficient in basic acceptance and operational testing (Australian Senate, 2012, pp. 34-35). Since the Australian DoD's First Principles Review (Peever, Hill, Leahy, McDowell, & Tanner, 2015), most of the responsibility for usability, system safety, survivability and cybersecurity, and all the T&E therein, lies with the Australian Services and thus outside the Acquisition Group, so additional way-points or gates are clearly warranted in the Australian DoD lifecycle. These gates need not require government approval, but should require the Capability Manager (i.e. Service Chief) to concur (i.e. production readiness) and an adequate basis to be documented for these accountable decisions.

In summary, a requirements stasis in projects with long development and build periods risks creating an alternative reality that is too far out of alignment with the contemporary family-of-systems and operational reality into which complex new systems must enter service. An overly simplistic Australian DoD lifecycle, one that is front-end and back-end only, risks missing the key knowledge points for critical test opportunities around system safety, system survivability, software usability and cybersecurity, especially before projects commit to production.

8. Characterising current Australian DoD technical and operational assurance

Having covered the four key challenges in DoD acquisitions, it is now necessary to examine the assurance frameworks in the DoDs that are used to maintain I3 for their families of systems, starting with the Australian DoD. The current military assurance frameworks in the Australian DoD are both technical and operational. General characteristics of the technical framework are as follows:

* A platform approach of long-life systems, fused completely with other interdependent systems only in operational test. There is as yet no central unifying design philosophy or framework across programs or projects, although since the First Principles Review many are reportedly under development by the 40 new program management offices.

* Verification usually occurs against 'high-level' requirements sets that are constructed rather independently for each platform project/program, and unfortunately these are usually inflexible during development and build.

* Verification is usually underpinned by deep technical and regulatory reach-back to designers and manufacturers as authorised engineering organisations for performance and safety-critical systems (i.e. aerospace, safety-critical software, etc.), often extending overseas to major primes or U.S. DoD Foreign Military Sales, with differing legal frameworks and lengthy contracting timelines. Safety and mission-criticality criteria and acceptance frameworks are usually neither universally understood nor agreed at the project level, let alone at the program or combined-force level.

* There is heavy project reliance on original equipment manufacturers for technical expertise to do verification testing and reporting appropriately (and honestly) on a project.

* Technical performance measures are usually set by the development contractors and are deterministic and schedule-driven rather than probabilistic in nature.

* Acquisition contracts are incentivised around cost and schedule, not performance, and thus tend to discourage early T&E and risk exposure by contractors, who usually maintain a technical ascendancy.

* DoD project technical staffs are usually under-resourced to attend system safety or survivability assessments, end-to-end performance evaluations, usability assessments, and cybersecurity assessments.

The operational assurance frameworks are usually conducted in the Australian DoD in three parts:

* The first part checks the adequacy of operational training, procedures, tactics and the like on the debut of new or significantly modified platforms. This part usually ends with a declaration of an initial operational capability and approval to proceed with further operational roll-outs. Unfortunately, such milestones are usually not linked to production and delivery, which are, therefore, done regardless.

* The second part is a check of supportability in a longer operational period that checks reliability, availability and maintainability in initial service, leading to the setting of support regimes. Again, this milestone is not usually recognised contractually, unless the capability has been acquired with a long-period service support contract.

* The third part is a high-end force 'certification' as a work-up to warlike operations, usually involving joint forces melded as a fighting family-of-systems for a certain mission and only done when absolutely necessary (i.e. operational control not technical control). These exercises, usually called Mission-Specific Training and Mission-Rehearsal Exercises, are, critically, part of continuous operations and are focused more on the mission than on any new capability. When they are done, they are an excellent reference for the integration achieved and assure what a joint operational force must achieve competently before rotation into theatre and engagement of adversaries.

Such assurance activities should be prefaced during project development by operational staffs attending evaluations of system safety, survivability, end-to-end performance, usability T&E and cybersecurity T&E. However, like the technical staffs listed earlier, operational staffs are usually not resourced or made available to attend such early evaluations. All of these phases of operational assurance currently rely heavily on operational evaluators with limited competence in experimentation and test design, execution or analysis, who operate without the operational confidence criteria mandated by DOT&E in U.S. DoD project T&E Master Plans, or better still, a systematic and authoritative set of capability preparedness criteria like that proposed by Tutty (2016).

Collectively, the current limitations in military assurance frameworks lead to significant overall capability risks in the following:

* Poor articulation as to the interoperability being sought (i.e. Functional, Domain or Enterprise) and to what levels of safety and confidence for peacetime training and wartime operations.

* Slow and inflexible evolution rates with poor project management incentives similar to those documented by Smith, White, Ritschel, and Thal (2016).

* Low levels of integration at program or force-level, with difficulties porting information such that man-in-the-loop workarounds are usually necessary.

* Weak to non-existent testing of end-to-end weapons performance and effectiveness, cyber-attack surface resilience and cyber-defence capabilities.

* Increasing loss of the integration and information assurance needed to operate with U.S. and Five Eyes forces.

* Time-consuming certification of joint task forces through extensive training and rehearsal exercises caused by the low integration baseline of all debut capabilities with their legacy systems.

The consequences of poor integration create exploitable weaknesses against certain threats, such as:

* high-speed attack, where group defence must be automated;

* unexpected attack in such missions as counter-insurgency warfare, where threat directions are not obvious;

* cyber-threat, where legacy systems have not been designed to resist malicious intent; and

* the limited ability of humans to interpret system-generated warning, risk or hazard information, due to the lack of a common decision-making framework between systems.

9. Identifiable U.S. DoD I3 assurance initiatives

There are significant and obvious trends in the U.S. DoD since 2010 in dealing with complex adaptive systems (Keating et al., 2017), the increasing interconnectedness of Defence systems (Keating, 2017; Normann, 2015; Small et al., 2017), and cyber threats (Brown et al., 2015; Joiner, 2017; Nejib, Beyer, & Yakabovicz, 2017). Rather than a complete listing, this paper deals mainly with the measures that are not yet being adopted in the Australian DoD at anywhere near the rate required to keep Australian forces integrated, or in some instances interoperable, and certainly not information-assured, even when the equipment is deemed 'common'. In describing these U.S. initiatives, they are clearly advocated to some extent for the Australian DoD. Bear in mind that what is advocated is not to match U.S. scale, which is unobtainable for Australia, but rather to match the rigour with which it is being employed. Given that the pace of reform in the U.S. DoD in each of these areas has been almost frenetic for at least the last eight years, especially for such an institution, if the Australian DoD is to catch up, say in four years while its ally still evolves, then by comparison Australia's pace of reform would need to be revolutionary. It probably goes without saying that the Australian DoD can leverage U.S. expertise to catch up; in doing so, however, it should not reinvent what the U.S. has already refined. For example, the U.S. DoD's Cybersecurity T&E Guidance Manual (U.S. DoD, 2015a) could be adopted by the Australian DoD with only minor clarifications so as to achieve consistency and rigour. Similarly, Australian DoD developmental projects could follow the U.S. DoD lifecycle (U.S. DoD, 2015b).

9.1. Initiative 1 - augmenting operational exercises with formal experimentation

The U.S. DoD has developed experimentation exercises in each key capability area, in which developing capabilities are deliberately networked and evaluated with legacy systems. Such experimentation exercises have annual battle-rhythms, dedicated planning staffs and evaluation scientists, such that the project offices that look after updates of legacy systems and new developmental systems only have to cover the cost and effort of sending or linking their systems and occasionally providing representative users. The qualification events for the experimentation exercises start with paper-based nomination, followed by laboratory-level checks of basics like system safety, cybersecurity, survivability and general software usability, mainly just to the extent that the candidate systems do not spoil the experimentation for the other systems. Often the qualification exercises will develop a means to validly scale from one or two representative systems to many; this scaling is important because much of the integration and networking of distributed systems still exhibits unforeseen properties when scaled. These exercises are distinct from purely operational exercises and the qualification of forces in rehearsal exercises, yet neither are they purely scientific experiments. The experimentation exercises fill a void, helping ensure that integration and information assurance are not only maintained for capabilities already in service, but that new capabilities have every opportunity to do I3 assurance efficiently during their often long periods of technology maturation, design maturity and manufacture development, and importantly before they debut.

Two of the most established of these U.S. experimentation exercises are Bold Quest, in aviation, and the Network Integration Exercise, in conventional land forces. The latter will briefly be outlined as an example, since it has been attended or observed on occasions by the Australian DoD since 2010 and has developed rapidly since then. (13) The Network Integration Exercise is run by the recently formed Force Modernisation Brigade with oversight by Army T&E Command. While it is run in the field around May each year at White Sands Missile Range, the qualification of systems for the next year begins almost immediately the exercise concludes, while laboratory testing of candidates is facilitated by the many test networks outlined later. Cybersecurity is now an essential part of the experimentation exercise, especially at the laboratory level, but also involving a degree of real field use. Not all experimentation exercises are large; there are many niche ones, especially where the DoD requires joint outcomes and innovative use of technology, and where the threat evolves more rapidly (such as those countering Improvised Explosive Devices with electronic countermeasures, which include the Five Eyes in three campaigns per year at Naval Air Weapons Station China Lake). Another example is an industry demonstration experimentation for Chemical, Biological, Radiological, Nuclear and Explosive threat detection and neutralisation, which utilises specialist DoD infrastructure and legacy systems to demonstrate 'plug and play' innovations.

Such experimentation exercises represent some U.S. DoD 'take-back' of RDT&E responsibility from outsourced prime contractors, so as to enable earlier integration testing and evolving threat representation in testing without commercial disclosure, especially for cybersecurity T&E (i.e. cooperative vulnerability and penetration testing) and tactics disclosure in areas like special forces. Such take-back has faced the following challenges, which would also arise if it were adopted by the Australian DoD:

* Requires DoD ownership of integration test ranges, such as the National Cyber Range.

* Challenges standard contractual risk transference in systems architecture design to go back to more risk-sharing models.

* Is greatly facilitated when the DoD owns the design, or at least key aspects of the intellectual property, something rare in Australia.

* Can delay contractor developmental design progress.

* Requires DoD provision, above individual projects and contractors, of such models, the means to do distributed simulation, and test network backbones, such as those covered later in this paper.

Development of cost-effective experimentation exercises and developmental design critically involves mixing 'Live', 'Virtual' and 'Constructive' (LVC) simulation across the experimentation and T&E networks described later. LVC refers either to the three different types of simulation in a collective sense (that is, live simulation, virtual simulation and constructive simulation) or, more usually, to the capability to do all three at the same time, or in near-real time, within a family-of-systems. The usual capability context for the LVC concept and acronym refers to the necessary network infrastructure and exercise support personnel to make such distributed simulations occur. The main system-development advantage of LVC is being able to trial the collective effect of legacy and development systems that must integrate or interoperate in a family-of-systems, early enough to influence the design of developing systems. The main operational advantage of LVC distributed simulations is being able to simulate and experiment with using a family-of-systems in new and more effective ways (Harper, 2015; Kometer, Burkhart, McKee, & Polk, 2011; Tutty, 2016). The supporting terms of LVC are:

* Live Simulation. Exercises involving real people operating real systems, but where the enemy is simulated or role-played and the environment is likely to be constrained to a partially-instrumented range or exercise area.

* Virtual Simulation. Simulation involving real people operating simulated systems in a common shared synthetic environment that supports interactions with simulated entities (i.e. avatars, equipment).

* Constructive Simulation. Simulation involving simulated people operating simulated systems and where real people make limited inputs and most aspects of the forces involved are determined by computer modelling, including where necessary aspects of human decision-making.

LVC requires joint commitment to interoperable simulation, training and instrumentation during all acquisitions and for legacy systems, which is greatly facilitated by the U.S. Program Executive Office for Simulation, Training and Instrumentation (PEO-STRI). The Australian DoD's current Land Network Integration Centre lies within Army, fulfils some of these roles, and has an organisational analogy to early versions of PEO-STRI. A large Australian DoD simulation project was developed under General Hurley's leadership (14) in the 2000s that aimed to create an environment for LVC in Australia; however, in the 2010s the capability unfortunately reverted to some Service-led initiatives in simulation.

In summary, experimentation exercises like the Network Integration Exercise and Bold Quest:

* are inherently programme-level rather than project-level activities;

* form the nexus between project-level systems engineering T&E and operational exercises, especially to bridge long development and build periods for important validation and requirement update opportunities in areas like software usability, cyber-security, system safety and system survivability and effectiveness work;

* provide the structured means to evolve and ultimately validate programme-level measures of effectiveness involving both developed and developing capabilities; and

* critically depend on test network infrastructure to cost-effectively do LVC distributed simulation for candidate qualification serials and often to also do the actual exercises.

9.2. Initiative 2 - integration system program offices and new certifications

The U.S. DoD has developed program offices for each of the services focused on integration. These integration offices form a nucleus of specialist integration staff to advise other program offices on evolving architecture requirements and to acquire integrating capabilities. They have been the champions of many of the experimentation exercises, and their initial focus is usually on creating common operational training around LVC distributed simulation, since this creates a customer focused on integrating capabilities and effect. Australian DoD initiatives for a Joint Terminal Attack Controller capability (Hartigan, 2016) are an example of a key capability that could really benefit from attention by an integration office. Such an office could focus on end-to-end simulation and experimentation to back up the operational exercise Black Dagger, where such Australian controllers are qualified.

The other U.S. DoD endeavour to assure I3 lies in new certifications and accreditations. This is an area where the Australian DoD has had to comply in order to allow coalition work with the U.S., and it has created new authorities. Examples are joint terminal attack controller certification, tactical data link accreditation, information exchange accreditation, joint-fires and targeting certification, and so on. Just like LVC simulation and experimentation exercises, these authorities challenge the Australian DoD with questions about where they belong. Are they technical regulation, are they operational, or are they part of acquisition? Most were birthed in acquisition but have then led very active operational lives deploying to forces around Australia, because of the absence of connecting test networks to do accrediting remotely via distributed LVC simulation and T&E networks. In truth, these new accreditation agencies have aspects in all three functional areas and are essential to integrating and modernising forces. Also, most remain seriously under-resourced to ever reach full implementation.

9.3. Initiative 3 - enhanced T&E regime - earlier, evidence-based rigour and innovation: test smart not test often!

The most recent U.S. Director of Operational T&E (DOT&E), Dr Gilmore, (15) was appointed in 2009 and, until his departure this year, oversaw a dramatic improvement in the rigour, timeliness and joint networking of test sites and distributed simulation. In particular, he rolled out mandatory use of probabilistic test design and test analysis techniques, which are inherently and mathematically efficient, to all U.S. test centres, test staff and acquisition programs. The rollout included comprehensive education, training, scientific assistance and competency assessment, T&E planning and compliance checks (Joiner, Kiemele & McAuliffe, 2016). He required all major acquisition programs to have testable measures and well-designed probabilistic test measures in their T&E Master Plans at all acquisition gates, including policing the conduct of such test planning before the operational analysis that enables production sign-off (by congressional law) and before acceptance into service (Ahner, 2016; Chu, 2016). The efficiency and rigour of these measures has significantly assisted the adaptation of U.S. acquisition to meet cyber threats with comprehensive cybersecurity T&E (Brown et al., 2015; Joiner, 2017; Murphy et al., 2015).
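To give a sense of why probabilistic test design is 'inherently and mathematically efficient', consider the classical binomial zero-failure (success-run) demonstration test from standard reliability statistics: it yields the fewest trials needed to demonstrate a required reliability at a stated confidence, rather than an arbitrary schedule-driven number of test events. The sketch below is illustrative only; the function name and the reliability/confidence figures are the author's assumptions, not drawn from any DoD guide.

```python
import math

def zero_failure_trials(reliability: float, confidence: float) -> int:
    """Minimum number of trials, all of which must succeed, to demonstrate
    `reliability` at `confidence` (classical binomial zero-failure test).

    The demonstration holds when the chance that a system of exactly the
    required reliability passes every trial is no more than 1 - confidence:
        reliability ** n <= 1 - confidence
    Solving for n and rounding up gives the sample size.
    """
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

# e.g. to show 90% mission reliability at 80% confidence: 16 failure-free trials
print(zero_failure_trials(0.90, 0.80))  # 16
# tightening to 95% reliability at 90% confidence raises the bill to 45 trials
print(zero_failure_trials(0.95, 0.90))  # 45
```

The point of such designs is that the evidence obligation is explicit up front, so test scope cannot quietly shrink to fit cost and schedule without the confidence statement visibly degrading.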

Dr Gilmore's measures have been slow to be adopted in Australia, in part because of the fundamental lack of a champion and organisational equivalent, and despite 'bottom-up' education in the field being available since 2012. A recent Australian DoD 'training needs analysis' of T&E, arising from recommendations of the ANAO (2016), touched on the growing disparity between the T&E and acquisition staffs of the two DoDs in this key field; however, despite the ANAO focus, moves to follow the U.S. remain very slow. The University of New South Wales has partnered with a major U.S. DoD provider to continue to educate key staffs in these competencies; however, attendance remains largely voluntary, is nowhere near the scale necessary to catch up, and has only penetrated some T&E branches and not the acquisition project staffs. As documented by Joiner and Ryan (2016) when they reviewed Australian Army T&E, the U.S. Defense Acquisition University has over five per cent of all U.S. acquisition staff requiring mandatory T&E qualifications, whereas in Australia there is still, as yet, no tracking of T&E qualifications for acquisition staffs. This is despite such tracking of T&E qualifications in acquisition being called for by the ANAO in 2002 and again in 2016 (ANAO, 2002, 2016).

As with T&E more broadly, the use of mandatory test measures underpinned by rigorous probabilistic experimentation test design and test analysis techniques has removed much of the scope in the U.S. DoD for 'decision by conjecture and influence', or what is also commonly called 'paper-based analyses'. This tightening of decision-making to be factual has extended in the U.S. to the use of modelling. Models must now be verified, validated and accredited (VV&A) before they can be used for acquisition decisions and/or operational deployment. This assurance of models also underpins the availability and rigour for the use of such models in distributed LVC simulation (Elele et al., 2016). (16) The Australian DoD needs to take greater charge of the veracity of such models, as they fundamentally underpin operational software effects in weapons, electronic warfare and cyber performance (Tutty, 2016, 2017), and these are operational decisions to be made by operational commanders, not by original equipment manufacturers' software engineers via contractual arrangements. Where decision-making still occurs without testing (to include modelling on VV&A models), the 'name and shame' of independent annual reports to Congress by Dr Gilmore as DOT&E at least calls such practices out, (17) inviting Congress to help end them. For example, with respect to the F-35 aircraft, Dr Gilmore had the following to say (U.S. DOT&E, 2017, p. 57), and later in the same report he expressed public frustration at decisions being based on contractor modelling not appropriately subjected to accreditation:

... many questions remain on the prudence of committing to the multi-year procurement of a Block Buy scheme prior to the completion of IOT&E:

- Is the F-35 program sufficiently mature to commit to the Block Buy with the ongoing rate of discovery while in development?

- Is it appropriate to commit to a Block Buy given that essentially all the aircraft procured thus far require modifications to be used in combat?...

- Would committing to a Block Buy prior to the completion of IOT&E provide the contractor with needed incentives to fix the problems already discovered, as well as those certain to be discovered during IOT&E?

- Would the Block Buy be... consistent with the law?

No such legal or name and shame processes exist within the Australian DoD to call out acquisition practices that are not based on experimentation, test and accredited modelling, leaving it to the Parliament (Australian Parliament, 2016; Australian Senate, 2012, 2016), ANAO (2002, 2013, 2016) and academia (Joiner & Atkinson, 2016), who unfortunately can only cover headline problems after the event and usually only on the very large programs that warrant public attention. A lower-level, active, independent and public review mechanism is warranted for the Australian DoD, along the lines of the U.S. DoD (Joiner, 2015b).

9.4. Initiative 4 - T&E network infrastructure

The U.S. DoD test networks are now extensive, connecting every major design development facility and test range in the U.S., using three networks each with different levels of security and purpose; namely the Test Enabling Network Architecture, the Joint Mission Environment Test Capability network and the Joint Information Operations Range. The networks were developed by, and are run by, the Test Resource Management Center whose mission is to provide the necessary enterprise-wide architecture and the common software infrastructure to do the following (Hudgins, 2017):

* Enable interoperability among range, C4ISR, and simulation systems used across ranges, hardware-in-the-loop facilities, and development laboratories.

* Integrate distributed LVC simulation assets.

* Leverage range infrastructure investments across the DoD to keep pace with test and training range requirements.

* Foster reuse of range assets and reduce the cost of future developments.

While the first of the T&E networks is the open architecture for general range instrumentation at a protected-level and has been used for over 250 distributed test events in a decade, the second provides similar network functions and mission for T&E in joint systems-of-systems and cyber environments at a higher classification. This classified T&E network connects over 45 test sites with 76 laboratories across the U.S. DoD (Hudgins, 2017).

The main enabling cybersecurity T&E infrastructure within the U.S. DoD is the National Cyber Range which according to Arnwine (2015) was created between 2009 and 2012 to 'provide secure facilities, innovative technologies, repeatable processes, and the skilled workforce, and create hi-fidelity, mission representative cyberspace environments'. The National Cyber Range is linked to T&E establishments across the U.S. through the aforementioned classified T&E networks. In more detail, the networked cyber range enables the following (Arnwine, 2015; Tutty, McKee, & Sitnikova, 2016):

* conduct of testing that cannot or should not occur on open operational networks due to potential catastrophic consequences, for example full execution of extremely malicious threats on realistic representations of systems and networks (e.g. releasing self-propagating malware);

* test of advanced cyberspace tactics, techniques, and procedures that require isolated environments of complex networked systems (e.g. movement on the Internet);

* rapid and realistic representation of operational environments at different levels of security, fidelity, and/or scale; (18) and

* precise control of the test environment that allows for rapid reconstitution to a baseline checkpoint, reconfiguration, and repeat of complex test cases, so as to quickly evaluate hundreds of scenarios.

The Australian DoD can leverage these networks and their procedures to connect its far fewer test sites and laboratories, in many cases back to the U.S.-procured system houses, so as to conduct more effective and efficient distributed LVC simulations and experimentation exercises as part of gaining confidence in capability (Tutty, 2016). Such leveraging starts with a project agreement under the extant T&E memorandum between the two countries, while implementation starts with an audit of the most representative LVC simulation test sites that support each Australian DoD capability and where that site is.

9.5. Initiative 5 - cybersecurity protection plans and T&E

The U.S. DoD response to cyber threats began in earnest in 2008 with a Presidential Directive and has since leveraged the DoD's strengths in T&E and its T&E network infrastructure extremely well. Joiner (2017) outlines the three-phased approach taken by the U.S. DoD, which has ended with a completely updated acquisition policy with cybersecurity integrated into all acquisition lifecycle stages. Joiner argues that the U.S. DoD decision to begin cybersecurity reform with representative operational T&E, at the 'right' of the lifecycle, was fundamental to the DoD understanding the threat consequences and risks properly and then investing in the infrastructure, acquisition and T&E staff competencies and developmental design, followed by the subsequent two phases of 'shift-left' and 'fully integrated'. Joiner, Sitnikova, et al. (2016) further propose a series of operationally-focused allied cybersecurity trials to inform the Australian DoD.

Beyond the impressive cybersecurity support and T&E infrastructure outlined earlier, the acquisition policy with cybersecurity integrated is comprehensive (Brown et al., 2015; Murphy et al., 2015; U.S. DoD, 2015b) and is underpinned by a clear and comprehensive Cybersecurity T&E Guide that is readily available online (U.S. DoD, 2015a). The early heart of the process for developing projects or project proposals is the Program Protection Plan, which links the traditional efforts in security, requirements and T&E with the new cybersecurity assurance requirements and activities. Within developmental projects like Australia's future submarine, and in major acquisitions like the F-35 JSF and MQ-4C Triton aircraft that will critically depend on Australian DoD infrastructure, such program protection plans urgently need to be produced, so as to shape the architectural and contractual backbone of more resilient capabilities. There is no embarrassment, and a lot of common sense, in developing those protection plans using the U.S. Cybersecurity T&E Guide (U.S. DoD, 2015a). What is emerging from U.S. efforts with the acquisition policies that have cybersecurity fully integrated, that is, since early 2015, is that future cyber resilience fundamentally requires careful control of the supply chain of all aspects of software-intensive system architecture, especially early, and then overseen through-life against the evolving cyber threat (Alberts, Haller, Wallen, & Woody, 2017). The ramifications for large developmental projects in the Australian DoD, their associated support infrastructure and industry are profound. Allowing major primes to pick globally from the most competitive suppliers and then integrate these with very limited oversight by DoDs cannot occur in the future. Key Australian industries for manufacturing micro-chips, computer boards, operating systems and software architecture are arguably far more crucial than using Australian steel in future ships.
Only through strategic and rigorous program protection plans will such major changes be wrought.

Deeper into the U.S. lifecycle there are Cyber Security Assessment and Advisory Teams that undertake much of the heavy lifting of setting requirements and conducting cybersecurity cooperative vulnerability and penetration testing. Such work is steered by high-level guidance on the cybersecurity T&E measures required (U.S. DoD DOT&E, 2014), including the following:

* The 16 core cybersecurity compliance metrics to be verified during the vulnerability and penetration testing phase.

* The minimum core data to be collected during the vulnerability and penetration testing, to include:

* cybersecurity vulnerabilities discovered;

* intrusion, privilege escalation and exploitation techniques used in penetration testing;

* metrics for password strength;

* adversarial activities and the associated difficulty, time to execute and success;

* time for defenders to detect each adversarial intrusion or exploitation;

* time for defenders to mitigate intrusions or exploitations;

* time for restoration of mission capabilities after each degradation; and

* overall mission effects of each degradation.

* Cybersecurity content required in the T&E Master Plan, including: architecture, operational environment, evaluation structure, authority to operate, time and resources and coverage of the key cybersecurity T&E steps.

* Cybersecurity content required in the Operational Test Plan.

In calling for the above detail, it is important to appreciate that in the U.S. DoD, project proposals cannot proceed to development without a T&E Master Plan approved independently by DOT&E, and that cooperative vulnerability and penetration testing occurs before authority for full production. These demanding policies, backed up by serious protection plans and penetration testing, have had significant effects in the last few years that are not well appreciated outside the U.S. DoD. Once U.S. systems achieve this greater rigour and cyber-resilience, it will be much harder for allies to interoperate, let alone integrate, because the lesser information assurance of non-U.S. systems would fundamentally weaken the overall informational attack-surface.
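The minimum core data listed above lend themselves to simple aggregate resilience metrics. The following is a purely illustrative sketch, not part of the DOT&E guidance; the event records, field names and summary statistics are assumptions chosen to show how times to detect, mitigate and restore might be rolled up across a vulnerability and penetration testing phase:

```python
from statistics import median

# Hypothetical adversarial-event records from a cooperative vulnerability and
# penetration testing phase; the intrusion labels and field names are
# illustrative only.
events = [
    {"intrusion": "phishing", "detected": True, "t_detect_h": 2.0,
     "t_mitigate_h": 6.0, "t_restore_h": 12.0},
    {"intrusion": "priv-esc", "detected": True, "t_detect_h": 8.0,
     "t_mitigate_h": 20.0, "t_restore_h": 30.0},
    {"intrusion": "exfiltration", "detected": False, "t_detect_h": None,
     "t_mitigate_h": None, "t_restore_h": None},
]

def resilience_summary(events):
    """Aggregate per-event core data into simple summary metrics."""
    detected = [e for e in events if e["detected"]]
    return {
        # Fraction of adversarial intrusions the defenders detected at all.
        "detection_rate": len(detected) / len(events),
        # Median response times, computed only over detected events.
        "median_detect_h": median(e["t_detect_h"] for e in detected),
        "median_mitigate_h": median(e["t_mitigate_h"] for e in detected),
        "median_restore_h": median(e["t_restore_h"] for e in detected),
    }

summary = resilience_summary(events)
```

Even so minimal a roll-up makes the policy point concrete: undetected intrusions are carried as a rate rather than a time, which is why detection coverage and response times must be reported separately.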

9.6. Initiative 6 - permeating these U.S. initiatives into industry

The U.S. DoD initiatives outlined above have all permeated U.S. Defense industry. The Australian DoD purchases over 50% of its capabilities from either the U.S. DoD directly through foreign military sales or through direct commercial contracts with suppliers common to the U.S. DoD. As such, it is somewhat alarming that Australian DoD staffs are not being educated in these areas, except largely through their own endeavours. U.S. Defense industry can commercially and reasonably reduce costs when bidding in Australia compared to when bidding in the U.S., simply by not including the rigorous test methods, LVC simulation capabilities, cybersecurity checks and so forth, including putting their 'B Team' onto the Australian contracts. (19) Unless there are significant changes in the Australian DoD, most acquisition and test staffs would not recognise any attempt by U.S. industry to do this. Beyond simply ensuring Australian DoD staffs are smarter buyers in these areas, any education needs also to permeate such new rigorous policies and techniques through to Australian Defence industry, which would necessarily implement much of this work under contract. The best way to do that is to educate the Defence staff, because Defence industry usually seeks a deeper technical knowledge than its customer.

9.7. Future U.S. initiatives

Under funding from the National Centers for System of Systems Engineering (NCSOSE), one very new initiative developed in the U.S. and in very early trial use is a model of Complex System Governance. Nine essential system governance functions have been identified (Keating et al., 2015) that together provide control, communication, coordination and integration, as follows:

* Control establishes constraints necessary to ensure consistent performance and future trajectory.

* Communications provides for flow and processing of information necessary to support consistent decision, action, and interpretation throughout the system.

* Coordination provides for effective interaction to prevent unnecessary instabilities within and external to the system.

* Integration maintains system unity through common purpose, designed accountability, and maintenance of balance between system and constituent interests.

Governance using the new Reference Model (Keating & Bradley, 2015) and associated diagnostics is designed to be tailored to the complexity of the capability under development, rather than that complexity simply being assumed. As such, use of the new governance framework requires entry-level assessments and adjustments. As part of implementing the First Principles Review, the Australian DoD has restructured its capability acquisition and sustainment into forty programs to oversee all of the DoD's capability projects. Five of the forty programs will pilot the creation of formal Program Management Offices (PMOs) over the next 18 months, so as to determine the most appropriate structures, processes, metrics and so forth. As part of this pilot, the new governance framework has great potential to help develop guidance to tailor the governance performed by each Program to the complexity of the systems it has to acquire and sustain. Such tailoring will be essential to ensure the guidance frameworks used for the remaining 35 PMOs have efficient and yet effective governance rather than a 'one-size-fits-all' approach. A research paper by Bradley, Joiner, Efatmaneshnik, and Keating (2017) has proposed applying the new governance framework to Australia's inherently complex future submarine project.
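The entry-level assessments the governance framework requires could, in their simplest form, score each pilot program against the four governance meta-functions named above and flag weak functions as tailoring priorities. The sketch below is a hypothetical illustration only; the scoring scale, threshold and program names are assumptions and do not represent the published Reference Model diagnostics:

```python
# The four governance meta-functions from Keating et al. (2015).
META_FUNCTIONS = ("control", "communications", "coordination", "integration")

def diagnose(program_scores, threshold=3):
    """Flag meta-functions scoring below the threshold as tailoring priorities."""
    return {prog: [f for f in META_FUNCTIONS if scores[f] < threshold]
            for prog, scores in program_scores.items()}

# Hypothetical 1-5 self-assessment scores for two pilot programs.
pilot = {
    "Program A": {"control": 4, "communications": 2,
                  "coordination": 3, "integration": 5},
    "Program B": {"control": 3, "communications": 4,
                  "coordination": 2, "integration": 2},
}
gaps = diagnose(pilot)
```

The value of even such a crude diagnostic is that governance effort is then directed at each Program's actual weaknesses rather than applied uniformly, which is the tailoring argument made above.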

Other research from the U.S. DoD with applicability to the Australian DoD's pilot Programs is the work on incentives in DoD acquisition by Smith et al. (2016). For example, the eight recommendations from their meta-analysis of U.S. projects (their Table 2) could be leveraged to provide better project manager tenure, to base technological readiness assessments on test, and to appropriately segregate research and development from projects.

10. Leveraging such integration initiatives for Australia

The Australian DoD has to follow the U.S. integration, interoperability and information assurance lead or be forced to:

* Opt out of allied Defence exercises and operations where they would be the weakest link.

* Accept significant work-up delays (18 months or longer) to participate in such allied exercises and operations with little flexibility to add or subtract elements once committed due to the wide assurance differences.

* Participate in different ways, such as embedded personnel, rather than Australian force-level or platform contributions.

An analogy here is to consider a New Zealand (N.Z.) land force contribution to an Australian-led expeditionary force once Australia's land force digitisation is complete. How would the combined force maintain battlefield awareness (i.e. blue and red force de-confliction), conduct cooperative digital battle-management planning, invoke electronic warfare protections in software-definable radios, undertake counter-improvised explosive device operations, and avoid an untested hole in the cybersecurity attack surface from the N.Z. legacy equipment? The U.S. will increasingly find Australian DoD deficiencies in areas like cyber-resilience. Due to the evolving cyber-threats of potential adversaries and the increasing use of software-intensive systems, such weaknesses cross over into inabilities to trust cooperative engagement and joint fires, or to share intelligence, battlefield awareness and so forth. The Australian DoD needs experimentation exercises in: land force networks, air force networks, maritime force networks, joint fires, joint cooperative engagement, cybersecurity, non-traditional threats (20) and many others.

In evolving the I3 assurance options for the Australian DoD there are, as ever, options in the degree of sovereignty. Experimentation exercises will be costly to run autonomously with Australian-only systems, test networks and test infrastructure, especially in maintaining the diverse representative threat libraries to apply across those networks. For example, Australia maintains no autonomous air target capability. Even if the Australian DoD can mount autonomous indigenous experimentation exercises, participation in U.S. experimentation exercises would be essential for short-notice capabilities such as clearance diving, airhead preparation, counter-mine operations, non-traditional threats, defensive and offensive counter-air, strike and so forth. An option is to forego Australian indigenous experimentation exercises altogether, keeping only the capability to run autonomous project-level T&E (unique platforms) and operational exercises. Such a dependent approach would export Australian experimentation to the U.S. experimentation exercises and may curtail some sources of non-allied acquisition (i.e. European) where developmental design and cybersecurity T&E in U.S. experimentation exercises would not be possible. It would also require Australian liaison officers in U.S. and Five Eyes experimentation exercise planning staffs, and an ongoing commitment of strategic airlift assets for personnel and representative equipment to routinely attend U.S. experimentation exercises and all their qualification tests and events. Such attendance would need to be under the coverage of joint operations and trials, and not the mantra of international visits, which are always subject to short-term cost-saving purges. Whether indigenous or dependent, such experimentation exercises would:

* be inherently family-of-systems, complex, require mixed qualitative and quantitative methods and be tactically and commercially sensitive, when compared to current platform-level verification T&E;

* be less-suitable to outsource than most current Australian DoD T&E;

* need to embed more than platform people in the key strategic U.S. and allied weapons, electronic warfare and cyber programs, to ensure safety, technical integrity, and operational suitability and effectiveness are assured directly by the military in an informed way and not third-hand via contractor field service representatives;

* require more-qualified military experimentation and T&E staff rather than current liaison staff who often lack test design, execution and analysis skills; and

* form an essential insight into the nexus of capability development efforts and operational capability preparedness and the risks being resolved or accepted therein.

To that end, such experimentation exercise reports would be more professional, incisive and timely report cards for programmes and their projects than current activities produce, and reports which, to some extent, those programmes may currently fear (i.e. reports that have consequences and are tracked to agreed conclusions).

11. Conclusion

The increasing interconnectedness of DoD systems, and the growth in the software-intensive higher-order decision-making they contain, mean that acquired DoD capabilities are now more often complex adaptive in nature; they form systems-of-systems that are increasingly interconnected into families of systems-of-systems. Coupled with the rising sophistication and frequency of cyber threats, there is a need for more responsive and robust assurance mechanisms in DoD acquisition. The pursuit of information dominance is now being exploited by adversaries and requires new vigilance measures.

While the Australian DoD has successfully assured previous complex network-centric capabilities, for example by validating C4ISR functionality, such validation work alone is unlikely today to deal sufficiently with the rate-of-change of threats, or to fully account for the latest complex adaptive systems. This concern applies especially to long development periods, where it is no longer appropriate for requirements to remain in stasis or for the developed systems to be isolated from the other systems with which they must integrate.

The exposition in this paper of recent U.S. DoD initiatives to better synthesise their capabilities has identified six broad assurance themes where the U.S. DoD has outpaced the Australian DoD. The work has also identified how the Australian DoD can leverage these themes to catch up with its ally. If the Australian DoD does not follow the U.S. integration and information assurance lead, it risks being forced to opt out of allied exercises and operations, to accept significant work-up delays, or to participate with embedded personnel rather than Australian force-level or platform contributions.

The first of these U.S. DoD initiatives is structured experimentation exercises that occur annually in almost every military force area. These experimentation exercises have annual battle rhythms and qualification processes that bring developing systems together with in-service systems, which are themselves undergoing regular obsolescence changes, in such a way as to check and adjust integration and information assurance as necessary. These centrally-funded opportunities to experiment (to find what works or does not work) are cost-effective for individual projects and contractors whether large or small, in particular to check system safety, system survivability, software usability and cybersecurity, including with representative users. The second of the U.S. DoD initiatives is the creation of new project offices focused on integration, interoperability and information assurance, which support, and are informed by, the experimentation exercises. This initiative includes a number of new certifications in areas like tactical data links, cybersecurity, joint targeting and accreditation of simulations, many of which have begun to permeate the Australian DoD but not systematically across all warfare domains or acquisition programs.

The third U.S. DoD initiative is rigorous and timely T&E measures, planning, conduct and analyses, that includes innovative and highly efficient probabilistic test design and analysis techniques and 'name and shame' compliance reporting. When compared to the U.S. DoD and best practice, the Australian DoD's new project lifecycle has key knowledge points missing, where service-led T&E and decisions are required to appropriately inform whether to proceed with a design and then production.

The fourth U.S. DoD initiative is three sophisticated test network capabilities that link all test sites and laboratories to enable distributed live-virtual-constructive simulation and testing. This coalescing of systems on networks has allowed early experimentation testing of the integration and information assurance for any family-of-systems combination that joint force planners need, whether anticipated in each system's design or not. While these test networks were initially developed to drive cost savings in test, they have proved essential to the speed with which the U.S. DoD has improved its cybersecurity T&E and thus cyber-resilience.

The fifth U.S. DoD initiative covered is robust acquisition policy and processes to deliver cyber-resilient systems in the face of mounting cyber threats. The U.S. DoD began by characterising the operational effects of cyber threats, which then drove investment by the services. In the mature acquisition policy with integrated cybersecurity, the creation of program protection plans drives early supply chain and architectural security, followed by cooperative vulnerability and penetration testing before production. There are key Australian DoD projects that urgently need such protection plans, and these should be expeditiously developed using the U.S. DoD's Cybersecurity T&E Guide. With over 50% of Australian DoD acquisitions procured from the U.S., it is no longer appropriate or sensible for the Australian DoD to invent its own guidebooks or the majority of such planning processes. The sixth and final U.S. DoD initiative is to permeate these assurance initiatives into Defence industry and thus give them greater depth and assurance. Such work to ensure industry keeps up and is part of the assurance solution has been deliberate in the U.S. DoD and would need to be similarly considered in the Australian DoD.

This research has identified a multiplicity of ways for the Australian DoD to leverage U.S. DoD initiatives for dealing with complex adaptive systems, and these are listed in the recommendations. Catching up is crucial if Australian forces are to be able to operate with U.S. forces. It starts with admitting that the Australian DoD is falling behind in the rigour necessary in the broad assurance areas covered. Then acquisition and T&E staffs must receive urgent education and competency development in these areas, so they become smart buyers. Such education should not be left to individuals' initiative, but needs to be driven across all acquisition programmes.

12. Recommendations and future work

The first and most obvious recommendation from this expository research is for the Australian DoD to implement the six initiatives used by the U.S. DoD for greater strategic assurance of integration, interoperability and information. Second, a case is also made that the most effective and expeditious way for Australia to replicate these initiatives would be to leverage U.S. DoD technologies and allied support. Beyond these two high-level recommendations, this research has not scoped the implementation issues necessary to support lower-level recommendations. For example, the most significant of the initiatives for infrastructure would be the extension of U.S. DoD T&E networks to key Australian Defence simulation and exercise sites, especially to enable systematic cybersecurity T&E, where the cost and priority of such an infrastructure and expertise rollout need to be established before proceeding. Similarly, the most significant of the initiatives for operations and personnel would be the creation of Australian experimentation exercises; the necessary personnel and instrumentation to run these have not been scoped. The U.S. initiative to improve the rigour of T&E has multiple threads to it, like the competency of T&E personnel, which would also require scoping of Australian qualification frameworks. These initiatives are sufficiently complex, costly and inter-related to warrant a dedicated Defence improvement project, perhaps titled something like: 'Rebalancing the Alliance for decision-making in complex defence systems'. The direct reference to the alliance is crucial if the second recommendation to leverage U.S. initiatives is to be realised. The natural and desirable extension of that reference and direction would be for the new Defence project to have a dedicated U.S. DoD presence; that is, to be an allied project. This would recognise and leverage the fact that it is highly likely to be in the U.S. DoD's interest for its ally not to be allowed to fall further behind in the wherewithal for decision-making in complex defence systems.

Based on the authors' insights to the U.S. DoD initiatives and Australian DoD practices, if such a new Defence project was to begin scoping requirements, some of the key areas it should consider as future scoping work are summarised in Appendix 1.


(1.) Hybrid warfare is defined and explained at length by Jordan et al. (2016, pp 134-135) in multiple domains and from multiple sources, including a U.S. Army definition of: 'the diverse and dynamic combination of regular forces, irregular forces, terrorist forces, criminal elements, or a combination of these forces and elements all unified to achieve mutually benefitting effects'.

(2.) Hyper-war is a proposed new definition by Allen and Husain (2017) for 'a type of conflict where human decision-making is almost entirely absent from the observe-orient-decide-act (OODA) loop of Boyd (1976, 1987, 1996: cited in Grant & Kooter, 2005). As a consequence, the time associated with an OODA cycle will be reduced to near-instantaneous responses. What makes this new form of warfare unique is the unparalleled speed enabled by automating decision-making and the concurrency of action that will become possible by leveraging artificial intelligence and machine cognition'.

(3.) The other (military) domains are the physical ones which are generally agreed by Australia and the U.S. to be where the desired (military) effects are generated, for example: land (including sub-surface), air, space and sea (including sub-surface), not by who or where the effect is generated (Tutty, 2017).

(4.) Note that Alberts et al. (2000) at that time included Cognitive and Social 'domains' in addition to the Physical and Information domains: these are treated more as 'dimensions' similar as to how Time is currently treated.

(5.) In the broadest context, NATO defines this as: The ability of systems, units, or forces to provide the services to and accept services from other systems, units, or forces, and to use the services so exchanged to enable them to operate effectively together (North Atlantic Treaty Organisation (NATO) AAP 6, 2013).

(6.) NATO standardisation had focused in on materials and products until the advent of C4ISR, wherein processes and services being exchanged became more of the focus for information independent of the servers and hardware per se. That is, the interoperability of the hardware no longer needs to be common but rather interchangeable or at least compatible (NATO, 2013).

(7.) Australia's new jointly developed and managed civilian - military air traffic system.

(8.) Based on involvement of the authors in supporting these test programs.

(9.) These used the Technology Readiness Levels, which fail to recognise that, regardless of readiness, any such technology in isolation is not a capability that is 'prepared', i.e. ready and sustainable (Tutty, 2016).

(10.) Based on involvement of the authors in supporting these test programs.

(11.) For the LHD see ANAO (2015, Chapter 4) and Australian Parliament (2016). For AWD see ANAO (2013, pp. 24-32, esp. para. 24).

(12.) Based on involvement of an author in supporting these test programs.

(13.) Bold Quest has also been attended by Australian DoD elements.

(14.) General David Hurley AC, DSC was a pioneer in joint capability and a champion of joint projects as the Head of Capability Systems Division in 2001, inaugural Chief of Capability Development Group in the period 2003-2007, Chief of Joint Operations (2007) and then Vice Chief (2008-2011) and Chief of Defence Force (2011-2014).

(15.) 'Dr. J. Michael Gilmore was sworn in as Director of Operational Test and Evaluation on September 23, 2009. A Presidential appointee confirmed by the United States Senate, he served as the senior advisor to the Secretary of Defense on operational and live fire test and evaluation of Department of Defense weapon systems' until the end of 2016.

(16.) While validation and verification are widely known terms in Australian DoD, the addition of accreditation to the acronym list may require definition. According to Elele et al. (2016, p. 336), modelling and simulation accreditation is, 'The official certification [determination] that a model, simulation, or federation of models and simulations and its associated data are acceptable for use for a specific purpose'. Such accreditation should be made by the operational staff based on technical staff recommendation against agreed criteria.

(17.) The DOT&E is required by Congressional law (Title 10) to provide these annual independent reports to Congress.

(18.) For example: Blue (friendly) force, Red (adversary) force, and Grey (neutral) networks.

(19.) B-Team is colloquial for the second-best team, often anecdotally alleged to follow the team who achieve contract award.

(20.) Non-traditional threats refers to a collective of chemical, biological, nuclear, radiological, explosive and other threats possible in asymmetric and hybrid warfare.


This research was supported in the Capability Systems Centre of the University of New South Wales by the Australian Defence Organisation. Associate Professor Michael Ryan and others assisted and encouraged the work greatly. We also gratefully acknowledge the immense support over the last decade of Mr. David Duma of the U.S. DOT&E in tirelessly helping Australian officers understand U.S. DoD initiatives in I3 and T&E. Notwithstanding this support, the assessments herein are purely those of the listed authors and the work in no way reflects the views of either the U.S. or the Australian Defence Organisation.

Disclosure statement

No potential conflict of interest was reported by the authors.


This research was supported in the Capability Systems Centre of the University of New South Wales by the Australian Defence Organisation and by the Department of Defence, Australian Government [grant number RG171822].

Notes on contributors

Keith F. Joiner, CSC, joined the Air Force in 1985 and became an aeronautical engineer, project manager and teacher over a 30-year career before joining the University of New South Wales in 2015 as a senior lecturer in test and evaluation. From 2010 to 2014, he was the Director-General of Test and Evaluation for the Australian Defence Force, where he was awarded a Conspicuous Service Cross. He is a certified practising engineer and a certified practising project director. He served with multinational forces in Iraq where he was awarded a U.S. meritorious service medal.

Malcolm G. Tutty has served in the Air Force, Public Service and Industry in a multitude of test, operations, engineering, staff, project management and command roles. This includes being a flight test armament engineer at a research unit, an aircraft stores compatibility engineer while on exchange with the USAF during Gulf War I, the AP-3C Chief Engineer at Tenix, director of both Aircraft Stores Compatibility Engineering (ASCENG) and the Woomera Test Range, and being launch authority for two hypersonic firings into space. Recently, he deployed into Afghanistan to conduct trials and field several new high-end EW systems. He is currently serving as a research fellow at the Air Power Development Centre and he has been a fellow of both the Royal Aeronautical Society and the Institution of Engineers for over a decade.


Ackerman, R., 2017. "Innovation, Efficiency Drive Defense Information Systems." Signal, p. 16.

Ahner, D. K. 2016. "Better Buying Power, Developmental Testing, and Scientific Test and Analysis Techniques." ITEA Journal 37: 286-290.

Alberts, D. S., J. J. Garstka, and F. P. Stein. 2000. Network Centric Warfare: Developing and Leveraging Information Superiority. United States Defense Technical Information Center. Accession Number: ADA406255.

Alberts, C., J. Haller, C. Wallen, and C. Woody. 2017. "Assessing DoD System Acquisition Supply Chain Risk Management." CrossTalk 30 (3): 4-8.

Allen, J. R., and A. Husain. 2017. "On Hyperwar" U.S. Naval Institute Proceedings Magazine, July, 143/7/1, 37.

ANAO (Australian National Audit Office) 2002. Audit Report No. 30: 2001-02 Test and Evaluation of Major Defence Equipment Acquisitions. Canberra: ANAO

ANAO (Australian National Audit Office) 2013. Report No. 22 2013-14: Performance Audit, Air Warfare Destroyer Program. Canberra: ANAO.

ANAO (Australian National Audit Office) 2015. Report No. 9 2015-16: Test and Evaluation of Major Defence Equipment Acquisitions. Canberra: ANAO.

ANAO (Australian National Audit Office) 2016. Report No. 48 2016-17: Performance Audit, Future Submarine - Competitive Evaluation Process. Canberra: ANAO.

Arnwine, M. 2015. "Joint Mission Environment Test Capability (JMETC): Distributed Testing for Cyber Security." In Presentation to ITEA Cybersecurity Workshop: Test and Evaluation to Meet the Persistent Threat. Belcamp MD, February.

Australian Government DoD. 2016. 2016 Defence White Paper.

Australian Parliament. 2016. Joint Committee of Public Accounts and Audit (JCPAA) Hearing with Defence and the Australian National Audit Office. Accessed 3 March 2016.

Australian Senate. 2012. Senate Inquiry into Defence Procurement. Canberra: Australian Parliament House.

Australian Senate. 2016. Senate Inquiry into Planned acquisition of the F-35 Lightning II (Joint Strike Fighter), October.

Bradley, J. M., K. F. Joiner, M. Efatmaneshnik, C. B. Keating. 2017. "Evaluating Australia's Most Complex System-of-Systems, the Future Submarine: A Case for Using New Complex Systems Governance." In Proceedings 27th Annual INCOSE International Symposium (IS 2017), Adelaide, Australia, July 15-20.

Brown, C., P. Christensen, J. McNeil, and L. Messerschmidt. 2015. "Using the Developmental Evaluation Framework to Right Size Cyber T&E Test Data and Infrastructure Requirements." ITEA Journal 36: 26-34.

Burton, J., 1993. The Pentagon Wars: Reformers Challenge the Old Guard. Annapolis, MD: Naval Institute Press. ISBN 1-55750-081-9.

C4ISR Architecture Framework. Version 2.0, 18 December 1997.

CCEB (Combined Communications Electronics Branch) 1999. Combined Interoperability Technical Architecture (CITA) - ACP 140, May 3.

Chu, D. S. C. 2016. "Statistics in Defense: A Guardian at the Gate." ITEA Journal 37: 284-285.

Cofer, D. 2015. "Taming the Complexity Beast." ITEA Journal 36: 313-318.

Conley, S., and J. Lenig-Schreffler. 2016. "Management, Mechanics, and Math (M3): An Enhanced Methodology for the Future T&E of Complex Information Systems." ITEA Journal 37: 306-312.

Defence Materiel Organisation and Australian National Audit Office. 2014. Report No. 14 2014-15: 2013-14 Major Projects Report. Canberra: ANAO.

Elele, J. N., D. H. Hall, M. E. Davis, D. Turner, A. Faird, and J. Madry. 2016. "M&S Requirements and VV&A Requirements: What's the Relationship?" ITEA Journal 37: 333-341.

Fowler, S., C. Sweetman, S. Ravindran, K. F. Joiner, and E. Sitnikova. 2017. "Developing Cyber-Security Policies That Penetrate Australian Defence Acquisitions." Australian Defence Force Journal, no. 202, July.

GAO (U.S. Government Accounting Office). 1998. Best Practices: Successful Application to Weapon Acquisition Requires Changes in DoD's Environment. GAO/NSIAD-98-56. Washington, DC: GAO

Grant, T., and B. Kooter. 2005. "Comparing OODA & Other Models as Operational View C2 Architecture." In 10th International Command and Control Research and Technology Symposium: The Future of C2. CCRTS/CD/papers/196.pdf.

Harper, J. 2015. "Live, Virtual, Constructive Training Poised for Growth." National Defense: NDIA's Business and Technology Magazine.

Hartigan, B., 2016. "Ex Black Dagger Hatches New Brood of JTACs" Online Contact Air Land and Sea, April.

Hecht, M. 2015. "Verification of Software Intensive System Reliability and Availability through Testing and Modeling." ITEA Journal 36: 304-312.

Heinl, C. H. 2016. "The Potential Military Impact of Emerging Technologies in the Asia-Pacific Region: A Focus on Cyber Capabilities." In Emerging Critical Technologies and Security in the Asia-Pacific, edited by R. A. Bitzinger, 123-137. Hampshire: Palgrave Macmillan.

Hudgins, G. 2017. Successful Distributed and Cyber Testing with TENA and JMETC. U.S. DoD public briefing by the Test Resource Management Centre.

Javorsek, D. 2016. "Modernizing Flight Test Safety to Address Human Agency." ITEA Journal 37: 325-332.

Joiner, K. F. 2015a. "How New Test and Evaluation Policy is Being Used to De-Risk Project Approvals through Preview T&E." ITEA Journal 36: 288-297.

Joiner, K. F. 2015b. Implementing the Defence First Principles Review: Two Key Opportunities to Achieve Best Practice in Capability Development. Strategic Insights No. 102. Canberra: Australian Strategic Policy Institute.

Joiner, K. 2017. "How Australia Can Catch up to U.S. Cyber Resilience by Understanding That Cyber Survivability Test and Evaluation Drives Defense Investment." Information Security Journal: A Global Perspective 26 (2): 74-84.

Joiner, K. F., and S. R. Atkinson. 2016. "Australia's Future Submarine: Shaping Early Adaptive Designs through Test and Evaluation." Australian Journal of Multi-Disciplinary Engineering, Engineers Australia, 3-26.

Joiner, K. F., M. Kiemele, and M. McAuliffe. 2016. "Australia's First Official Use of Design of Experiments in T&E: User Trials to Select Rifle Enhancements." ITEA Journal 37: 141-152.

Joiner, K., E. Sitnikova, and M. G. Tutty. 2016. "Structuring Defence Cyber-Survivability T&E to Research Best Practice in Cyber-Resilient Systems." Paper Presented at Systems Engineering Test and Evaluation Conference, Melbourne.

Joiner, K. F., and M. J. Ryan. 2016. Study into Land Test and Evaluation for the Australian Army. Australian Army.

Jordan, D., J. D. Kiras, D. J. Lonsdale, I. Spellar, C. Tuck, and C. D. Walton. 2016. Understanding Modern Warfare. Cambridge, UK: Cambridge University Press.

Keating, C. B. 2017. "Complex Systems Problem Domain: Landscape of a Modern Project Management Practitioner." In Keynote Presentation to the Project Governance and Controls Symposium, University of New South Wales, Australian Defence Force Academy campus, Canberra, May 4.

Keating, C. B., and J. M. Bradley. 2015. "Complex System Governance Reference Model." International Journal of System of Systems Engineering 6 (1/2): 33-52.

Keating, C. B., P. F. Katina, and J. M. Bradley. 2015. "Challenges for Developing Complex System Governance." In Proceedings of the 2015 Industrial and Systems Engineering Research Conference, edited by S. Cetinkaya and J. K. Ryan. Nashville, TN, May 30-June 2.

Keating, C. B., P. F. Katina, J. M. Bradley, R. Jaradat, and A. V. Gheorghe. 2017. "Acquisition System Development: A Complex System Governance Perspective." In 27th Annual INCOSE International Symposium (IS 2017), Adelaide, Australia, July 15-20.

Kemburi, K. M. 2016. "From Subsonic to Hypersonic Cruise Missiles: Revolution or Evolution in Land Attack Capabilities." In Emerging Critical Technologies and Security in the Asia-Pacific, edited by R. A. Bitzinger, 107-122. Hampshire: Palgrave Macmillan.

Kometer, M. W., K. Burkhart, S. V. McKee, and D. W. Polk. 2011. "Operational Testing: From Basics to System-of-Systems Capabilities." ITEA Journal 32: 39-51.

LISI (Levels of Information Systems Interoperability). 1998. C4ISR Architecture Working Group, March 30.

McAfee and Center for Strategic and International Studies. 2017. Tilting the Playing Field: How Misaligned Incentives Work against Cybersecurity. Santa Clara, CA.

Murphy, T., L. D. Leiby, K. Glaeser, and L. Freeman. 2015. "How Scientific Test and Analysis Techniques Can Assist the Chief Developmental Tester." ITEA Journal 36: 96-101.

NATO. 2013. AAP-06: NATO Glossary of Terms and Definitions. 2013 edition.

Nejib, P., D. Beyer, and E. Yakabovicz. 2017. "Systems Security Engineering: What Every System Engineer Needs to Know." In 27th Annual INCOSE International Symposium (IS 2017), Adelaide, Australia, July 15-20.

Normann, B. 2015. "Continuous System Monitoring as a Test Tool for Complex Systems of Systems." ITEA Journal 36: 298-303.

O'Driscoll, M. J., and J. A. Krill. 1997. "Cooperative Engagement Capability." Naval Engineers Journal, March: 43-57.

Orlowski, C. T., P. Blessner, T. Blackburn, and B. A. Olson. 2017. "Systems Engineering Measurement as a Leading Indicator for Project Performance." ITEA Journal 38: 35-47.

Pearce, R. 2016. "Cyber Deterrent: PM Talks up Australia's Offensive Capabilities." Computerworld. https://www.computerworld.

Peever, D., R. Hill, P. Leahy, J. McDowell, and L. Tanner. 2015. First Principles Review: Creating One Defence. Canberra.

The Pentagon Wars. 1998. [Film].

Ryan, M. J., and M. R. Frater. 2007. Battlefield Communications Systems. Canberra: Argos Press.

Small, C., E. Pohl, B. Cottam, G. Parnell, S. R. Goerger, E. Specking, and Z. Wade. 2017. "Engineered Resilient Systems with Value Focused Thinking." In 27th Annual INCOSE International Symposium (IS 2017), Adelaide, Australia, July 15-20.

Smith, N. C., E. D. White, J. D. Ritschel, and A. E. Thal. 2016. "Counteracting Harmful Incentives in DoD Acquisition through Test and Evaluation and Oversight." ITEA Journal 37: 218-226.

Thompson, M. 2012. "The Cyber Threat to Australia." Australian Defence Force Journal 188: 57-70.

Tutty, M. G. 2016. The Profession of Arms in the Information Age: Operational Joint Fires Capability Preparedness in a Small-World. University of South Australia, 1 January 2016 [posted online 10 January 2016].

Tutty, M. G. 2017. The Profession of Arms in the Information Age. V2.2. Air Power Development Centre, Defence Estate Fairbairn, ACT, June 1.

Tutty, M. G., S. McKee, and E. Sitnikova. 2016. "Towards Joint Fires Superiority in Kinetic Weapons, Non-Kinetic Electronic and Cyber Warfare Operations." In SETE Symposium 2016, Melbourne, Victoria, Australia, May 18-19 [Posted online 10 May 2016].

U.S. DoD. 2015a. Cybersecurity T&E Guidebook, Version 1.0, July 1. Available online.

U.S. DoD. 2015b. Operations of the Defense Acquisition System. U.S. Department of Defense Instruction No. 5000.02 dated 7 January 2015.

U.S. DoD. 2017. "F35 Joint Strike Fighter: Financial Year 2016 DoD Programs." Annual Director Operational Test & Evaluation (DOT&E) Report to Congress, January.

U.S. DoD, Director of OT&E Memorandum. 2014. Procedures for Operational Test and Evaluation of Cybersecurity in Acquisition Programs, August 1.

U.S. DoD, Directorate of Operational Test and Evaluation. 2015, 2016, 2017. Annual Reports to Congress on DoD Programs, F35 Joint Strike Fighter.

Wickens, C. D., J. Lee, Y. Liu, and S. D. Becker. 2014. An Introduction to Human Factors Engineering. 2nd ed. New York: Pearson Prentice Hall.

Zhu, L., M. Staples, and T. Nguyen. 2014. The Need for Software Architecture Evaluation in the Acquisition of Software-Intensive Systems. Fishermans Bend: Aerospace Division, Defence Science and Technology Organisation.

Appendix 1.

Questions and issues posed by authors for a proposed Australian Defence Project

'Rebalancing the alliance for decision-making in complex defence systems'

(1) How to focus leadership on integration and information assurance above the system and platform levels that are archetypal today, so that leaders strategically set experimentation exercises and scrutinise integration and information-assurance outcomes to truly inform capability development with consequential decision-making?

(2) How to focus leadership on establishing the test networks from the three U.S. backbones at all Australian developmental and operational simulation and test sites and ensuring new and legacy capabilities have the necessary distributed live-virtual-constructive simulation capabilities and models to enable any force mix?

(3) A comprehensive audit of all possible live-virtual-constructive simulation sites and current simulation capabilities, supported by all acquisition programmes.

(4) Complete re-evaluation and reprogramming with the U.S. of the Australian DoD exercise program to make room for, and critically rely on, the dedicated experimentation exercises.

(5) Experimentation-exercise funding independent of programmes and projects that incentivises participation and threat representativeness.

(6) Dedicated liaison officers into U.S. DoD experimentation exercises.

(7) A technical and operational T&E competency program to educate staff to run such experimentation exercises.

(8) How to strategically shift acquisition incentives as outlined by Smith et al. (2016) to align with the eight recommendations in their Table 2, for example to give: better project manager tenure, have technological readiness assessment based on test, and appropriately segregate research and development from projects?

(9) How to improve the alignment of Australian DoD acquisition guidebooks to the U.S. DoD wherever possible so as to account for over 50% of Australian DoD acquisitions being procured from the U.S., especially in areas like cyber-security and T&E?

(10) How to revise the new Australian DoD acquisition lifecycle to ensure projects, especially developmental projects, conduct soliciting and contracting in the distinct phases used by the U.S. DoD: (1) analysis of alternatives; (2) technological maturation and risk reduction; (3) design maturity; and (4) engineering, manufacture and development?

(11) How to assure acquisition development phases include T&E focused on software usability, system safety, system survivability and cyber-security, as well as how to better assure these critical evaluations precede production approval?

(12) How to assure all software-intensive or ICT systems undergo usability testing, at least once prior to contract if they are off-the-shelf, and multiple times in iteration if any aspects are developmental?

(13) How to better qualify project or programme staffs who propose or acquire development of complex adaptive systems in the probabilistic experimental test methods and expected value analysis needed to better predict and deal with adaptive agents like artificial intelligence?

(14) How to better assure programmes and their projects are set up to use modelling and simulation through life with distributed live-virtual-constructive simulation of all systems?

(15) How to better assure acquisition contracts have flexibility to avoid requirements stasis, especially to support modelling, distributed live-virtual-constructive simulation, usability testing, cybersecurity assessments and experimentation exercises, which are so key to continued integration and information assurance?

(16) How to assure programmes and their projects use the new Australian or U.S. DoD experimentation exercises as early as possible during pre-contractual risk reduction, development, acceptance, operational fielding and operational upgrades?

(17) How to better assure that acquisition projects check claims of off-the-shelf maturity through offer-definition activities, using user trials and functional and physical configuration audits to confirm maturity, so as to avoid unnecessary re-specifying and re-verifying of the non-developmental aspects?

(18) How to better assure that acquisition projects check developmental components, even integrating software, with an engineering manufacture development and user trial before a production contract can be approved?

(19) Trial the new complex systems governance (Keating et al., 2017) and the new acquisition incentives of Smith et al. (2016) within Defence's new pilot programme offices.

(20) How to link all T&E facilities and laboratories to the three U.S. T&E networks with appropriate training and assurances so as to enable distributed live-virtual-constructive experimentation and cybersecurity T&E?

(21) How to better assure independent review by test professionals of test concept strategies and plans?

(22) Specific competency programs for project proposal and programme staffs on risk-focused preview T&E planning and experimentation exercises.

(23) How to assure the rigorous use of the U.S. DoD scientific test and analysis methods and competencies, including early T&E criteria?

(24) An annual independent T&E report on all major acquisition programs, focusing on the integration and information assurance aspects.

Keith F. Joiner (a) and Malcolm G. Tutty (b)

(a) Capability Systems Centre, University of New South Wales, Canberra, Australia; (b) Defence, Air Power Development Centre

CONTACT Keith F. Joiner


Received 10 July 2017

Accepted 6 January 2018
COPYRIGHT 2018 Taylor & Francis Group LLC

Author: Joiner, Keith F.; Tutty, Malcolm G.
Publication: Australian Journal of Multi-disciplinary Engineering
Date: Aug 1, 2018