Sensor data from different sensors flow via a standardized interface to the central control unit.
(Source: Hella)

Autonomous driving: Open interfaces for data from many sensors

Author / Editor: Hendrik Härter / Florian Richert

Sensor data play a decisive role in autonomous driving. However, a uniform interface is still lacking. A fusion platform is expected to change this.

With a near-production fusion platform, a German research consortium of universities, IT companies, and research institutions from the automotive industry wants to enable automobile manufacturers and their suppliers to implement highly and fully automated driving functions. The fusion platform features open interfaces and is therefore called the Open Fusion Platform (OFP).

The project, funded by the Federal Ministry of Education and Research with 4.4 million euros, was completed after just over three years with the integration of the first fully automated driving functions in three demonstration vehicles. In the implemented scenario, an electric vehicle drives fully automatically to a free charging station in a parking lot and positions itself above the charging plate. Once charging is complete, it searches fully automatically for a free parking space without a charging plate. "For such highly or fully automated scenarios, only prototypes that were nowhere near series production existed until now," says Dr. Michael Schilling, project manager for pre-development of automated driving at Hella and network coordinator of the OFP project. The OFP was developed by the network coordinator Hella together with the German Aerospace Center, Elektrobit Automotive, Infineon Technologies AG, InnoSent, Hella Aglaia Mobile Vision, Reutlingen University of Applied Sciences, the RWTH Aachen Electric Mobility Office, Streetscooter Research, and TWT Science and Innovation. Continental and Nvidia also provided support as associated partners.

Connecting sensors and the control unit

Driver assistance systems that merge data from two sensors, such as traffic jam assistants, are already in series production today. For fully automated driving, however, vehicles must be able to perceive their entire surroundings. This requires fusing the data from many sensors and cameras into a complete environmental model that maps the surroundings with the required accuracy and on which the driving function can be implemented reliably. To date, there is no standardized interface between the individual sensors and the central control unit: the interfaces of current driver assistance systems are highly function-specific and depend on the respective supplier or automobile manufacturer. This is exactly where the research project comes in.
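
To illustrate in principle what such a standardized interface could look like, the following Python sketch defines a common detection record that any sensor, regardless of supplier, could fill before its data reaches the central control unit. All names and fields here are assumptions for illustration only and are not taken from the OFP specification.

from dataclasses import dataclass
from enum import Enum


class SensorType(Enum):
    CAMERA = "camera"
    RADAR = "radar"


@dataclass
class Detection:
    sensor_id: str          # which physical sensor produced the measurement
    sensor_type: SensorType
    timestamp_us: int       # common time base shared by all sensors
    x_m: float              # object position in the vehicle coordinate frame, metres
    y_m: float
    velocity_mps: float     # estimated speed of the detected object
    confidence: float       # detection confidence, 0.0 to 1.0


def to_common_format(raw: dict, sensor_id: str, sensor_type: SensorType) -> Detection:
    # Adapter each supplier would implement once for its own raw output format,
    # so the central control unit only ever sees one uniform record type.
    return Detection(
        sensor_id=sensor_id,
        sensor_type=sensor_type,
        timestamp_us=raw["t_us"],
        x_m=raw["x"],
        y_m=raw["y"],
        velocity_mps=raw.get("v", 0.0),
        confidence=raw.get("conf", 1.0),
    )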

Four cameras and eight 77 GHz radar sensors, providing 360° coverage around the vehicle, served as input for the OFP in the project. An additional vehicle-to-X communication module (V2X module) enabled communication between the vehicle and external infrastructure such as the charging plates. The project partners jointly disclosed the interface descriptions of the individual components in a freely available "Interface Specification". During the project, the research association, together with other leading automobile manufacturers and suppliers, also started an ISO standardization process for the sensor data interfaces.
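
Continuing the sketch above, a setup like the one in the project (four cameras and eight radar sensors) could be registered against that common record and read out as one uniform detection stream; the V2X module would contribute its own message type, which is omitted here. This too is a hypothetical illustration, not the published Interface Specification.

# Hypothetical sensor setup modelled on the project vehicle:
# four cameras and eight 77-GHz radar sensors.
SENSOR_SETUP = (
    [(f"camera_{i}", SensorType.CAMERA) for i in range(4)]
    + [(f"radar_{i}", SensorType.RADAR) for i in range(8)]
)


def collect_detections(raw_frames: dict) -> list:
    # Gather one synchronized set of detections from all registered sensors
    # and convert them into the common format defined above.
    detections = []
    for sensor_id, sensor_type in SENSOR_SETUP:
        for raw in raw_frames.get(sensor_id, []):
            detections.append(to_common_format(raw, sensor_id, sensor_type))
    return detections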

Unified interface planned

Upon completion of the project, an updated interface description will be published, which will also feed into the ongoing ISO process. For the first time, all automobile manufacturers and suppliers will have the opportunity to integrate their products quickly and easily into the fusion platform. With the environment model, Hella Aglaia Mobile Vision has developed the central component of the OFP. Using its visualization options, developers can see how the vehicle perceives its entire environment and decide on that basis which sensor data should be merged. Both complex driver assistance functions and fully automated driving functions can be programmed in this way.
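
The following, deliberately simplified sketch (building on the detection record above) hints at what such an environment model does conceptually: detections from all sensors are merged into one spatial model around the vehicle, which a driving function such as automated parking can then query. The actual OFP environment model from Hella Aglaia Mobile Vision is, of course, far more sophisticated.

from collections import defaultdict


def fuse_into_environment_model(detections, cell_size_m=0.5):
    # Merge detections from all sensors into a simple occupancy-style model:
    # detections falling into the same grid cell around the vehicle are
    # combined, keeping the highest confidence and the set of sensor types
    # (camera, radar, ...) that confirmed the cell.
    model = defaultdict(lambda: {"confidence": 0.0, "sources": set()})
    for det in detections:
        cell = (round(det.x_m / cell_size_m), round(det.y_m / cell_size_m))
        entry = model[cell]
        entry["confidence"] = max(entry["confidence"], det.confidence)
        entry["sources"].add(det.sensor_type)
    return model


def is_cell_free(model, x_m, y_m, cell_size_m=0.5, threshold=0.3):
    # A driving function (e.g. automated positioning above a charging plate)
    # could ask whether a target cell is free before planning a path to it.
    cell = (round(x_m / cell_size_m), round(y_m / cell_size_m))
    return model.get(cell, {"confidence": 0.0})["confidence"] < threshold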

Work on the OFP will continue after the project. Central questions are how the sensor data can be processed with machine learning to improve the functions and further accelerate development work. The parking scenario is also to be extended to urban driving situations and to speeds above 20 km/h, with the long-term aim of developing a new system for automated parking. These scenarios require interaction with additional sensors such as LiDAR, and it is precisely in this kind of multi-sensor data fusion that the OFP can play to its full potential. In addition, functional safety will play a major role in further development, to ensure that all functions developed are fail-safe.

This article was first published in German by Next Mobility.