SHARPEN – Scalable Highly Automated Vehicles with Robust Perception

SHARPEN aims to develop optimized real-time deep neural networks for 3D object and free-space detection that work in tough weather conditions. These networks are the core building blocks of the perception system for confined-area applications.

Summary:

The global automotive industry is rapidly adopting Deep Learning (DL) as a key technology, especially for perception in autonomous driving. This rapid shift poses an increasing competitive threat to the Swedish automotive industry, so it is of critical importance that Swedish automotive companies build world-class DL competence and excel at adopting the technology. The SHARPEN project addresses this need by developing novel concepts for scalable and robust perception systems, in particular for confined-area automation, through cost-effective and realistic data generation. These concepts can be reused by the Swedish automotive industry, and the project outputs, such as the developed Deep Neural Networks (DNNs) and data-generation tools, will also be used in the partners' future projects and products.

Swedish automotive OEMs have an increasing need for robust perception modules to build a world model in autonomous driving applications. State-of-the-art research has mostly focused on perception on public roads under clear daytime conditions, while research on tougher conditions, such as night operation, adverse weather, dirt on sensors, and the confined-area domain, remains very limited. SHARPEN will therefore go beyond the state of the art and develop optimized real-time DNNs for 3D object and free-space detection that work in these tougher conditions; these are the core building blocks of the perception system for confined-area applications. The core blocks will be developed using sensor fusion. Furthermore, sensor failures during the operation of automated vehicles will be compensated for by novel SHARPEN sensor-dropout and sensor-to-sensor mapping mechanisms, which will be integrated into the sensor fusion. The resulting DNNs will perform at night and during the day, and under changing weather conditions such as sun, rain, and fog, in confined areas.
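The sensor-dropout idea above can be sketched in a few lines: during training, one randomly chosen sensor's features are blanked so the fused network learns to cope with a missing modality at run time. This is a minimal illustrative sketch, not SHARPEN's actual implementation; all function and sensor names here are assumptions.

```python
import random

def sensor_dropout(features, p_drop=0.3, rng=None):
    """Simulate a sensor failure during training: with probability p_drop,
    zero out one randomly chosen sensor's feature vector.
    (Hypothetical sketch; not the project's actual code.)"""
    rng = rng or random.Random()
    out = {name: list(vec) for name, vec in features.items()}
    if rng.random() < p_drop:
        victim = rng.choice(sorted(out))        # pick one modality to "fail"
        out[victim] = [0.0] * len(out[victim])  # blank its features
    return out

def late_fuse(features):
    """Naive late fusion: element-wise mean over all sensor feature vectors."""
    vecs = list(features.values())
    n = len(vecs)
    return [sum(v[i] for v in vecs) / n for i in range(len(vecs[0]))]

# Example: three modalities feeding a fused detector
feats = {"camera": [1.0, 2.0], "lidar": [3.0, 4.0], "radar": [5.0, 6.0]}
fused = late_fuse(sensor_dropout(feats, p_drop=0.3))
```

In a real pipeline the dropout would act on learned feature maps inside the network rather than raw vectors, and the sensor-to-sensor mapping mentioned above would instead reconstruct the missing modality from the remaining ones.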

Project period:

April 1, 2019, to March 31, 2021

Financier:

Vinnova

Involved partners:

Volvo GTT, Volvo CE, Machine Intelligence Sweden AB and Halmstad University

Project leader:

Eren Erdal Aksoy

Department:

ISDD

Updated:

2019-11-11
