
Current Approaches in HiL-Based ADAS Testing

ABSTRACT

The path to autonomous driving is closely connected to the capability of verifying and validating Advanced Driver Assistance Systems (ADAS), as this is one of the main challenges in achieving safe, reliable, and thereby socially accepted self-driving cars. Hardware-in-the-Loop (HiL) based testing methods offer the great advantage of validating components and systems at an early stage of the development cycle, and they are well established in the automotive industry.

When validating ADAS using HiL test benches, engineers face various barriers and conceptual difficulties: How can simulated signals be piped into multiple sensors, including radar, ultrasonic, video, or lidar? How can classical physical simulations, e.g. vehicle dynamics, be combined with sophisticated three-dimensional, GPU-based environmental simulations?

In this article, we present current approaches to mastering these challenges and provide guidance by showing the advantages and drawbacks of each approach. To this end, we discuss different ADAS setups and show ways to implement HiL test benches for them. We consider two categories: 1) Hardware level: we focus on the communication structure between the simulated plant model and the Unit under Test (UuT). We show possible interfaces into the sensor units and the bus systems involved. 2) Software level: we focus on how to provide the data the UuT expects. This results in rendering images, creating object lists, or providing ray-tracing based point clouds.

This article provides solutions for current and upcoming challenges in HiL-based validation of ADAS and presents an overview of current test approaches.

CITATION: Feilhauer, M., Haering, J., and Wyatt, S., "Current Approaches in HiL-Based ADAS Testing," SAE Int. J. Commer. Veh. 9(2):2016, doi: 10.4271/2016-01-8013.

INTRODUCTION

The development process in the automotive industry is dominated by the V-model. Here, every constructive stage on the left-hand side has a corresponding counterpart on the right-hand side that tests its outcomes. Different testing methods to verify and validate functionalities along all stages are applied; they are summarized as X-in-the-Loop (XiL) methods. These testing methods enable short feedback cycles, and Figure 1 visualizes their usage along the V-model.

Model-in-the-Loop (MiL) settings test a model of the functionality to be developed. MiL is applied in early stages of the development process to verify basic architectural decisions, design choices, or elementary implementations. In a later step, Software-in-the-Loop (SiL) setups test the functionality of the program code, mostly compiled for the target hardware, but without including real hardware. Hardware-in-the-Loop (HiL) methods verify and validate the software implemented on the physical target Electronic Control Units (ECUs). To this end, HiL test benches need to provide realistic real-time stimuli and load simulation to the ECUs so that they perform as if they were built into a real vehicle. Multiple ECUs are additionally tested in system and integration tests, e.g. network HiL setups. Thereby, tests of the complete electrical setup of a vehicle, including its bus systems, sensors, and actuators, are possible. The more recent approach of Vehicle-in-the-Loop (ViL) replaces the vehicle simulation (and therewith most of the virtual ECUs, too) with a physical car. This approach enables, for example, safety-critical ADAS tests by simulating and crashing into virtual obstacles while still driving the real vehicle. The simulated parts of the environment are injected into the vehicle's sensors or ECUs. [1]

All of the mentioned test methods have "in-the-Loop" in common, which means that the reaction of the UuT influences the plant simulation. With respect to ADAS, this feedback is necessary when testing, e.g., a Lane Change Assist (LCA): here, the influence of the ECU's control signals on the simulated vehicle dynamics is in focus.

Another way of testing ADAS is Open-Loop testing. In the following, the concept is illustrated using the example of regression testing. When a vehicle project is close to the start of production, meaning all hardware sensor layouts and their locations within the vehicle are fixed, testing the car in real-life scenarios is indispensable. Many aspects of these test runs are random, as e.g. weather conditions or the behavior of other traffic participants are neither foreseeable nor reproducible. Nevertheless, these tests are necessary in the scope of Advanced Driver Assistance Systems development. During such real-world test runs, all sensor data has to be recorded. If an unwanted behavior occurs, the involved algorithms can be adjusted and afterwards tested against the recorded data in regression tests. This is done by piping the recorded data through the algorithms again and observing the effects of the algorithm changes across all recorded scenarios. The Automotive Data and Time-Triggered Framework (ADTF, [7]) is a software framework often used within the automotive industry for such tasks. As an example, Figure 2 schematically visualizes these Open-Loop regression tests for a road-sign recognition function.

The illustration shows how sensor data is piped through Algorithm A, which does not recognize all road signs. Algorithm A might have been used within a test run in reality. With Algorithm B, the engineers want to improve the road sign recognition, mainly focusing on the unrecognized scenes. When the recorded raw sensor data is now used to test Algorithm B, it becomes obvious that it is not perfect either. In another iteration, the data processing is further improved, resulting in Algorithm C recognizing all road signs in the available data. In code, such a regression run can be outlined as shown below (a minimal sketch; the function names and the recording format are illustrative assumptions, not the ADTF API).
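# Minimal sketch of an Open-Loop regression run (illustrative only;
# function names and the recording format are assumptions, not the ADTF API).

def replay_recording(frames, algorithm):
    """Pipe every recorded sensor frame through a recognition algorithm
    and collect the road signs it reports."""
    return [set(algorithm(frame)) for frame in frames]

def regression_report(frames, ground_truth, algorithms):
    """Compare each algorithm version against the labeled recording."""
    for name, algorithm in algorithms.items():
        results = replay_recording(frames, algorithm)
        missed = sum(len(truth - found)
                     for truth, found in zip(ground_truth, results))
        print(f"{name}: {missed} road signs missed over {len(frames)} frames")

# Example: Algorithm C should report zero misses on the recorded data, e.g.
# regression_report(recorded_frames, labeled_signs,
#                   {"Algorithm A": algo_a, "Algorithm B": algo_b,
#                    "Algorithm C": algo_c})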

The main benefit of such an Open-Loop approach is the realistic data it is based on. Closed-Loop setups, on the other hand, are advantageous when investigating a function's feedback loop. The power of Closed-Loop setups lies in the possibility to modify the stimuli data in real time, thereby making it possible to evaluate how the function affects the overall ADAS system. For example, the impact of starting a lane change earlier can be investigated in a Closed-Loop setup, which is not possible when using immutable recorded data in an Open-Loop setup.

To realize Closed-Loop ADAS HiL test benches, the simulated plant model consists of three components:

* Vehicle Simulation including a Driver Simulation

This simulation component consists of a driver able to follow a road or a predefined trajectory, a powertrain simulation representing engine and drivetrain, and a three-dimensional vehicle dynamics simulation to characterize the vehicle's (truck and trailer) movement.

* Environment and Sensor Channel Simulation

The perception of the vehicle's environment is a core component of autonomous driving functionalities. Therefore, the environment model is part of the simulation. The model contains a three-dimensional representation of the vehicle's surroundings, including e.g. a road network, traffic participants, or buildings. This virtual world makes it possible to simulate the sensor channel and thereby provide inputs to the sensors, e.g. a video image to a video camera.

* Traffic Simulation

Many ADAS functions interact with other traffic participants, which is why these entities are part of the simulation. Here, simplified simulation models, e.g. for vehicle dynamics, can be used as long as a realistic and deterministic behavior of all entities is ensured.

When these plant models are integrated into an overall plant simulation, connected to the sensor interfaces (i.e. stimuli) of the ADAS hardware, and the ECU's reactions are fed back to the simulation, the loop is closed. Figure 3 visualizes this generic ADAS HiL architecture; the sketch below outlines the same structure in code.
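The following is a schematic sketch only; all class and function names are illustrative assumptions, and a real bench replaces the stubs with full vehicle-dynamics, traffic, and environment models plus I/O hardware.

# Schematic sketch of the generic Closed-Loop HiL architecture (Figure 3).

class PlantSimulation:
    """Vehicle/driver, traffic, and environment/sensor-channel models."""
    def __init__(self):
        self.ego_speed = 0.0  # m/s, trivial placeholder vehicle model

    def step(self, ecu_actuation, dt):
        # Advance the (placeholder) vehicle dynamics with the ECU's reaction,
        # then derive the stimuli the sensors would perceive.
        self.ego_speed += ecu_actuation.get("acceleration", 0.0) * dt
        return {"ego_speed": self.ego_speed}  # stand-in for sensor stimuli

def run_closed_loop(plant, uut_io, dt=0.01, steps=1000):
    """Real-time loop: stimulate the UuT, read back its reaction, repeat."""
    actuation = {"acceleration": 0.0}
    for _ in range(steps):
        stimuli = plant.step(actuation, dt)   # plant model update
        uut_io.inject(stimuli)                # e.g. bus messages, video frame
        actuation = uut_io.read_reaction()    # ECU feedback closes the loop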

This setting will be used in the next section to explain how to integrate ADAS hardware in various test setups.

ADAS HIL ARCHITECTURE

When evaluating ADAS within a HiL context, one of the first questions is what the Unit under Test and its interfaces are. There might be a single ECU covering the core functionality, e.g. a radar-based Emergency Brake Assist (EBA). Another setup might include a dedicated, centralized ADAS ECU that implements sensor fusion and control algorithms and receives sensor information from additional control units. Depending on the setup, multiple real ECUs have to be included in the test bench and connected via automotive network systems, like CAN, FlexRay, or Automotive Ethernet. Additional ECUs have to be available virtually to provide relevant supplementary information.

Once the basic test bench setup has been identified, a suitable simulation environment for the Unit under Test has to be defined. In addition, one of the main challenges when testing ADAS in the context of HiL has to be solved: how to inject the stimuli into the ADAS ECU. A core component of autonomous driving functionality is the perception of the vehicle's environment via its sensors. If the HiL test bench includes real sensor systems, the simulated environmental data, e.g. video images, has to be injected into the hardware components. There are many different points at which data can be inserted into the ECU, which results in testing different parts of the UuT.

Figure 4 visualizes a simplified high-level ADAS architecture ("ADAS pipeline") that can be found in many ADAS systems. This architecture defines potential interfaces for injecting data into the ECU.

The main interfaces to the environment are the vehicle's sensors. Stereo and mono video cameras and ultrasonic, radar, or lidar sensor systems are the most relevant sensor concepts used for ADAS. Combining different sensor concepts with different characteristics helps to capture as many aspects of the surroundings as required to safely analyze the current traffic scenario. The sensor captures information from its sensor channel. The raw data is then further processed by e.g. feature and object recognition algorithms, resulting in a processed data set, like a grid-based map representation of the environment or a list of recognized objects. The situation analysis combines this processed information and estimates the current traffic situation, forwarding the data to the ADAS application. The application finally decides which actions to take, e.g. triggering an emergency brake.
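Viewed as code, the pipeline is a chain of stages whose hand-over points are the potential injection interfaces (a schematic sketch with illustrative stage names):

# The "ADAS pipeline" of Figure 4 as composed stages (stage functions are
# illustrative stubs). Each hand-over point is a potential injection interface.

def adas_pipeline(sensor_channel_input,      # <- physical sensor channel layer
                  sense, process, analyze, application):
    raw_data = sense(sensor_channel_input)   # <- inject here: raw data layer
    objects = process(raw_data)              # <- inject here: processed data
    situation = analyze(objects)             # situation analysis
    return application(situation)            # e.g. trigger an emergency brake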

In a HiL test bench, the accessible layers for injecting virtual data into the Unit under Test may vary. Based on the presented pipeline, the following sections discuss different possibilities to inject the data.

Hardware Level

Injecting Data into the Sensor Channel Layer

The most complete and complex test bench includes multiple real sensors and their connection to a (centralized) ADAS ECU. This requires the simultaneous injection of multiple physical signals into each sensor channel. For a camera sensor, this means injecting light in the sensor's spectrum, i.e. a sequence of images of the actual scenery. Figure 5 visualizes the principle of injecting a synthetically generated image into a camera sensor.

A screen or projector emits light and injects it directly into the camera's optics. By using this approach, the sensor is stimulated via its physical sensor channel. This approach is already well established.

A similar setup can be realized for a radar sensor. The measurement principle of radar is based on the echo effect: within a HiL setup, the radar signal does not have to be generated from scratch; rather, the emitted signal must be modified. The available methods to modify these signals are not yet as sophisticated as in the spectrum of light, but the underlying technology is already available (see [2]). Figure 6 visualizes this approach.

The radar unit emits radar waves, which are modified in such a way that the reflected signal represents the simulated scenery, e.g. a moving target object. As the functionality of ultrasonic and lidar sensor systems is based on the echo effect, too, the same sensor-channel-based testing approach is conceivable.

Using the physical sensor channels in an ADAS HiL test bench has several advantages and drawbacks. One of the main benefits is that there is no need to modify the involved UuT for testing purposes at all. The injected signal is based on the same physical principles as the signal the vehicle's sensor system will be exposed to in reality. This is why the original housing and the standard data output of the ECU, e.g. via FlexRay, can be used for testing the complete unit. However, using the physical sensor channel to stimulate the UuT involves great challenges, too. A major problem is interference from the test surroundings, which is why such test benches must be set up within a sealed environment: no stray light must interfere with the signal between the emitting device and the video sensor, and no electromagnetic waves must affect the signals between the radar sensor and the modulating device. Another challenge is the quality of the injected signals: images emitted by a screen might not represent the dynamic range a video sensor faces in real-life situations, e.g. when being blinded by the sun. In terms of radar, the signal reflection in reality happens in three-dimensional space, which makes it necessary to set up multiple signal modulators around the sensor in a spherical arrangement. In conclusion, injecting data into the sensor channel layer requires the least modification of the Unit under Test, but representing scenarios within the physical sensor channel is one of the most demanding tasks in testing and, due to technical limitations, currently not possible for all systems.

Injecting Data into the Raw Data Layer

If data is not injected into the sensor via the physical layer, virtual data can be injected into the device just after signal conversion: the camera's imager might be stimulated electrically, or a signal can be injected at the radar's A/D converter. This requires the availability of an appropriate interface. These approaches are highly product specific, as they depend on the hardware layout of the involved sensor systems. Injecting synthetically generated raw sensor data into the unit eliminates the drawback of undesired interference from the test surroundings. Additionally, the main part of the ADAS system is still involved in the test loop, excluding only the sensor itself. A drawback of this approach is that it requires modification of the Unit under Test. Besides injecting synthetic raw data electrically, injecting raw data digitally is possible, too. Here, the main challenge is the creation of a test interface to inject raw data into a software layer. As there is currently no standard for storing or processing raw data of the various sensor systems, the injection of raw sensor data is highly product specific. Both hardware and software injection of raw data make it necessary to heavily modify the Unit under Test.

Raw sensor data amounts to large data volumes. An uncompressed video stream, for example (1,280 px × 720 px, 24 bit/px, 60 Hz), results in about 1.3 Gbit/s. When multiple cameras and additional sensor systems are involved, the raw data stream gets even larger. This is why transferring raw data directly to a centralized fusion control unit is currently not widely used within mass-production vehicles. Today, sensor control units process their raw data directly. The resulting information is afterwards transferred to other ECUs via common automotive bus systems, like CAN, FlexRay, or Automotive Ethernet (see [1, p. 363]). Their data rates range from 25 kbit/s (Low-Speed CAN) through 10 Mbit/s (FlexRay) to more than 1 Gbit/s using Automotive Ethernet systems (see [4, p. 8], [5] and [6]). With increasing transfer rates via Ethernet or specialized ADAS network solutions (as e.g. presented in [7]), it will become possible to transfer raw sensor data in future applications. The currently limited bandwidth is the main reason why a more common approach in ADAS systems is to transmit pre-processed sensor data to a central ECU, resulting e.g. in the transfer of object lists.
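A quick check of the video bandwidth figure quoted above:

# Worked check of the uncompressed video bit rate.
width, height = 1280, 720        # pixels
bits_per_pixel = 24
frame_rate = 60                  # Hz

bitrate = width * height * bits_per_pixel * frame_rate
print(f"{bitrate / 1e9:.2f} Gbit/s")   # -> 1.33 Gbit/s, matching ~1.3 Gbit/s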

Injecting Data into the Processed Data Layer

When testing only the ADAS control algorithms, it is sufficient to generate data synthetically on a phenomenological level. The focus is not on signal propagation but on the input of the control algorithm, which is processed sensor data. Such interfaces are e.g. object lists generated by a sensor control unit. The advantages of this approach are the limited complexity and the clear definition of the interface. The cost of setting up the HiL system is thereby mainly limited to bus communication and software.

These interfaces make it possible to test ADAS functionalities, like EBA or ACC, on HiL test benches by injecting simulated object lists into the UuT. Depending on the involved sensor systems, these object lists include information like object type (vehicle, cyclist, pedestrian, or static object) or object kinematics. Bernsteiner et al. discuss the implementation of such a phenomenological radar sensor model in detail (see [8]). The basic idea is to have a virtual environment including a sensor model. The output of this module is the processed data, e.g. an object list. As soon as a target vehicle gets into the sensor's field of view (FoV), it is added to the object list together with all necessary information. Figure 7 visualizes the principle of a virtual sensor model outputting an object list, and the following code snippet gives an exemplary impression of the generated output in XML format [9].
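Listing 1 (a sketch; element and attribute names are illustrative, as the exact schema is product specific):

<!-- Listing 1: time-stamped object list in the sensor's coordinate frame.
     Assumed convention: x = distance ahead in m, y = lateral offset in m
     (right-hand side negative), vx = relative velocity in m/s. -->
<ObjectList timestamp="12.340">
  <Object id="1" type="vehicle">
    <Position x="10.0" y="-2.0"/>
    <RelativeVelocity vx="-2.0"/>
  </Object>
  <Object id="2" type="vehicle">
    <Position x="25.0" y="1.5"/>
    <RelativeVelocity vx="0.5"/>
  </Object>
</ObjectList>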

Listing 1 shows a time-stamped object list including two targets. The target with identifier 1 is located 10 meters ahead and 2 meters to the right and has a relative velocity of -2 m/s. All values are represented in the sensor's coordinate frame. In a HiL setup, this object list gets permanently updated on the time schedule the Unit under Test expects.
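The core of such a phenomenological sensor model can be sketched as follows, assuming a simple range-and-azimuth field-of-view check (all names are illustrative, not the model from [8]):

import math

def update_object_list(timestamp, targets, max_range=150.0, fov_deg=45.0):
    """Phenomenological sensor model: report every target inside a simple
    range/azimuth field of view, in the sensor's coordinate frame."""
    objects = []
    for target in targets:                  # targets: dicts with id, x, y, vx
        distance = math.hypot(target["x"], target["y"])
        azimuth = math.degrees(math.atan2(target["y"], target["x"]))
        if distance <= max_range and abs(azimuth) <= fov_deg / 2:
            objects.append({"id": target["id"], "x": target["x"],
                            "y": target["y"], "vx": target["vx"]})
    return {"timestamp": timestamp, "objects": objects}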

Supplementary Vehicle Data

Besides the actual sensor information on any layer, the ECU requires supplementary vehicle data provided by other ECUs. This information is usually passed to the Unit under Test via the automotive CAN bus system and includes details like yaw rate, acceleration, engine speed, steering angle, or GPS data. All ADAS systems require at least parts of this information to perform their data processing and to check the plausibility of their situation analysis. The safety-critical intervention of the ADAS system only happens if all information is consistent. This is why the supplementary vehicle data is required to get the Unit under Test into a fully operational state in order to verify and validate its functionality.
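As a sketch, providing such data could look as follows, assuming the python-can library on a virtual bus; the message ID and signal encodings are placeholders, as the real layout is defined by the vehicle's communication matrix:

import can  # python-can; the ID and encodings below are placeholders

def send_vehicle_status(bus, speed_mps, steering_angle_deg):
    """Cyclically provide simulated supplementary vehicle data to the UuT."""
    speed_raw = int(speed_mps * 100) & 0xFFFF            # 0.01 m/s resolution
    angle_raw = int((steering_angle_deg + 720) * 10) & 0xFFFF  # offset-coded
    msg = can.Message(arbitration_id=0x123, is_extended_id=False,
                      data=[speed_raw >> 8, speed_raw & 0xFF,
                            angle_raw >> 8, angle_raw & 0xFF])
    bus.send(msg)

# e.g. bus = can.interface.Bus(channel="vcan0", bustype="socketcan"), then
# call send_vehicle_status(bus, 13.9, -4.2) on the simulation's time grid.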

Software Level

The required quality of the software-based environment simulation mainly depends on the layer of injection into the UuT. Nevertheless, there are basic parts that have to be available in every plant simulation: vehicle models for all traffic participants, including their vehicle dynamics and drivers, as well as a virtual environment in which the sensor simulations are embedded. As the models of the traffic participants are analogous to that of the main ADAS vehicle, the following sections focus on describing the main vehicle structure only.

For scenario-based tests, the car has to drive realistically on a predefined trajectory. To this end, it consists of a driver model and a complete vehicle model. The driver's task can be subdivided into the categories of navigation, guidance, and stabilization, as visualized in Figure 8 and discussed in detail in [1]. Depending on the actual ADAS HiL test cases, the navigation level might not always be included in the simulation. Given a predefined driving route within a test scenario, the driver follows this path and provides the actuating variables for the virtual vehicle.
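The stabilization level can be sketched, for example, as a simple pure-pursuit controller that steers toward a lookahead point on the predefined path (an illustrative sketch, not the driver model of any particular simulation tool):

import math

def pure_pursuit_steering(pose, path, lookahead=8.0, wheelbase=4.0):
    """Steer toward the first path point at least `lookahead` metres away;
    guidance provides the path, navigation the route."""
    x, y, heading = pose
    target = next((p for p in path
                   if math.hypot(p[0] - x, p[1] - y) >= lookahead),
                  path[-1])
    # lateral offset of the target point in the vehicle frame
    dx, dy = target[0] - x, target[1] - y
    local_y = -math.sin(heading) * dx + math.cos(heading) * dy
    curvature = 2.0 * local_y / lookahead ** 2
    return math.atan(wheelbase * curvature)  # steering angle in radians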

The environment simulation uses the vehicle's position to place it in the virtual world. This is where the physical simulation domain, used to realize the vehicle dynamics, is combined with the algorithms and data structures representing the virtual world. In the game industry and in scientific visualization, sophisticated 3D engines ([10], [11], or [12]) have been created, which can be used to create a virtual environment and as a framework to implement sensor models. Approaches like traditional GPU-based rasterization using OpenGL or, more recently, real-time ray tracing offer the possibility to work efficiently on large data sets. The layer of data injection into the UuT defines the level of detail required for the virtual sensor model: it has to generate the data required by the interface the UuT offers. When injecting the data via the physical sensor layer or as unprocessed raw data, the simulated data has to represent the real world in great detail: for a video camera, the rendered image has to match an image captured in reality as closely as possible. A bit-wise match of rendered and captured images is not possible, as the rendering stage always includes models which cannot represent every effect present in reality. Nevertheless, the ADAS system can be stimulated using rendered images for a video camera or point clouds representing reflection points for a radar sensor. To enable validation of the complete ADAS pipeline within a simulated environment, the simulated raw data itself has to be validated, covering a huge number of different objects and materials.
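As an impression of how reflection points can be generated from a virtual world, the following minimal two-dimensional ray-casting sketch intersects sensor rays with objects modeled as circles (purely illustrative; production sensor models use full 3D geometry and material properties):

import math

def raycast_point_cloud(origin, circles, fov_deg=60.0, rays=181):
    """Return the nearest reflection point per ray for objects modelled as
    circles (cx, cy, r) around a 2-D sensor at `origin`."""
    ox, oy = origin
    points = []
    for i in range(rays):
        angle = math.radians(-fov_deg / 2 + i * fov_deg / (rays - 1))
        dx, dy = math.cos(angle), math.sin(angle)
        best = None
        for cx, cy, r in circles:
            # solve |o + t*d - c|^2 = r^2 for the nearest positive t (|d| = 1)
            fx, fy = ox - cx, oy - cy
            b = 2 * (fx * dx + fy * dy)
            c = fx * fx + fy * fy - r * r
            disc = b * b - 4 * c
            if disc >= 0:
                t = (-b - math.sqrt(disc)) / 2
                if t > 0 and (best is None or t < best):
                    best = t
        if best is not None:
            points.append((ox + best * dx, oy + best * dy))
    return points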

The situation changes as soon as the virtual data is injected as processed sensor data. The interfaces are far simpler than e.g. a Full-HD video stream, and the validation of virtual sensor models based on preprocessed data is possible for certain situations, as shown in [8]. Even if the output of the sensor model is a simplified list of recognized objects, the creation of the virtual environment is still necessary. The only difference is the way of working with the three-dimensional data: instead of simulating every possible real-world effect, the virtual world is utilized in a more abstract way. As objects still have to be placed in three-dimensional space, the use of 3D engines as mentioned above is helpful, as they offer an optimized framework to create virtual worlds. This enables the implementation of detailed sensor models as well as simplified phenomenological sensor models using the same software framework. Use cases including multiple injection layers are thereby possible: a HiL setup might include a radar-based function, stimulated by processed sensor data, combined with a video-based function stimulated via the physical sensor channel. Figure 9 visualizes this setup. This exemplary mixed ADAS HiL setup generates processed radar sensor data as well as data to be injected into a video sensor's physical layer.

All elements visualized in blue are involved in the test loop, including the video sensor, the video data processing, the situation analysis, and the final emergency brake application.

SUMMARY

Validation of Advanced Driver Assistance Systems is one of the key aspects of releasing reliable and socially accepted self-driving vehicles. As these systems react to or interact with the environment, testing becomes highly complex, and validation cannot be realized using a single testing approach. Open-Loop setups offer great possibilities to continuously test new algorithms using a large set of real-life data. However, these Open-Loop approaches lack the possibility of evaluating different system reactions, as this requires adjusting the system's input. This is why Hardware-in-the-Loop (HiL) test benches are also a necessary part of developing and testing autonomous driving functionalities along the development cycle. As discussed, there are multiple possibilities for injecting synthetically generated stimuli into the Unit under Test (UuT) during HiL testing. Each of these setups offers different pros and cons, and Table 1 presents a summarizing overview.

When developing systems for autonomous driving, their testability has to be ensured by enabling data injection at the relevant layers. Using the physical sensor channel comprehensively for all systems still requires technical progress. It is therefore useful to focus on software interfaces and their standardization. As soon as standards for storing and processing raw sensor data are established, interchangeability of test systems and data sets can easily be realized. Currently, these ADAS HiL test systems are highly sensor-product specific, and compatibility among virtual sensor models is not given. As the validation of ADAS is complex due to the plurality of influencing effects and resulting test scenarios, future frameworks must focus on enabling the interchangeability of different modules. This decreases the testing effort, as it enables the reuse of stable and robust modules, such as validated sensor models.

REFERENCES

[1.] Winner H., Hakuli S., Lotz F., and Singer C., Eds., Handbuch Fahrerassistenzsysteme: Grundlagen, Komponenten und Systeme für aktive Sicherheit und Komfort, 3rd ed., Wiesbaden: Springer Vieweg.

[2.] Reil U., Fischer L., and Bach V., Better than real life: radar echoes from a target simulator. Available: https://cdn.rohde-schwarz.com/pws/dl_downloads/dl_common_library/dl_news_from_rs/213/NEWS_213_ARTS9510_english.pdf (2016, Mar. 10).

[3.] Zimmermann W. and Schmidgall R., Bussysteme in der Fahrzeugtechnik: Protokolle, Standards und Softwarearchitektur, 5th ed. Wiesbaden: Springer Vieweg, 2014.

[4.] Ixia, Automotive Ethernet: An Overview: White paper. Available: https://www.ixiacom.com/sites/default/files/resources/whitepaper/ixia-automotive-ethernet-primer-whitepaper_1.pdf (2016, Mar. 10).

[5.] Ethernet Alliance, 2015 Ethernet Roadmap: Ethernet Ecosystem (2015, Oct. 05).

[6.] Ross P. E., Nvidia Wants to Build the Robocar's Brain. Available: http://spectrum.ieee.org/cars-that-think/transportation/self-driving/nvidia-wants-to-build-the-robocars-brain (2015, Aug. 24).

[7.] EB Assist ADTF - Elektrobit Automotive. Available: https://www.elektrobit.com/products/eb-assist/adtf/ (2015, Sep. 22).

[8.] Bernsteiner S., Magosi Z., Lindvai-Soos D., and Eichberger A., "Radarsensormodell für den virtuellen Entwicklungsprozess," ATZelektronik, no. 2, pp. 72-79, 2015.

[9.] W3C, XML Technology: Standards. Available: https://www.w3.org/standards/xml/ (2016, Apr. 03).

[10.] OpenSceneGraph: Home. Available: http://www.openscenegraph.org/ (2015, Jul. 06).

[11.] Crytek GmbH, CRYENGINE: The complete solution for next generation game development by Crytek. Available: https://www.cryengine.com/ (2016, Apr. 03).

[12.] Amazon Web Services, Inc., Amazon Lumberyard: Free AAA Game Engine. Available: https://aws.amazon.com/lumberyard/ (2016, Apr. 03).

Marius Feilhauer and Juergen Haering

ETAS GmbH

Sean Wyatt

ETAS Inc.

CONTACT INFORMATION

Dr. Juergen Haering

ETAS GmbH

Product Manager (ETAS/ETS-P)

Borsigstraße 14

70469 Stuttgart

Germany

Phone: +49 (0) 711-3423-2845

juergen.haering@etas.com

www.etas.com

DEFINITIONS/ABBREVIATIONS

ACC - Adaptive Cruise Control

ADAS - Advanced Driver Assistance System

ADTF - Automotive Data and Time-Triggered Framework

EBA - Emergency Brake Assist

ECU - Electronic Control Unit

FoV - Field of View

HiL - Hardware-in-the-Loop

LCA - Lane Change Assist

MiL - Model-in-the-Loop

SiL - Software-in-the-Loop

UuT - Unit under Test

ViL - Vehicle-in-the-Loop

XiL - X-in-the-Loop

Table 1. An overview of the different ADAS HiL approaches and their advantages and drawbacks

Injecting data at           Physical Signal Layer        Raw Data Layer               Processed Data Layer

Hardware modification       Not necessary                Hardware injection:          Not necessary
of the UuT                                               highly sensor specific,
                                                         complex modifications

Software modification       Not necessary                Software injection:          Software bypass to inject
of the UuT                                               software bypass to inject    synthetic data necessary
                                                         synthetic data necessary

Additional hardware         Complex signal generation/   Specific hardware adaptor    Not necessary (mainly
                            modulation depending on      (custom product)             network communication)
                            the sensor type

Necessary quality of the    Depends on the possibility   High quality necessary       Simple simulation
environmental simulation    to modulate the physical                                  sufficient
                            layer

Involved part of the        Complete                     Excluding sensor             Excluding sensor and parts
ADAS pipeline                                                                         of the signal processing
                                                                                      software