Continental is developing technologies that will make automated driving in cities possible in the first place. In the past five years, as part of the @CITY joint project funded by the Federal Ministry for Economic Affairs and Climate Protection, human-vehicle interaction models have been advanced, solutions for intelligent intersections have been developed and special driving functions for inner-city intersections and bottlenecks have been tested.
City traffic is the key discipline of automated driving. So far, the industry has focused primarily on highways. With the @CITY project, however, we also took on the challenges of urban traffic. To master them, prototypical technologies had to be merged and new simulators had to be developed. The result: automated driving is becoming possible in an urban environment – and we are already testing it on public roads.
External data helps with orientation
Front and surround-view cameras, lidar sensors, and long- and short-range radar: our environment sensors already enable many automated driving functions. As part of the @CITY project, we merged these systems with external data. After all, with digital maps as well as weather and traffic information, the vehicle can be given important clues about how to behave. This means, for example, that the vehicle electronics can always determine the car's exact position – independently of the GPS signal. This is an essential prerequisite for navigating narrow streets and roundabouts precisely and automatically.
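The article does not describe how the fusion works internally. As a minimal illustration of the general principle – combining a coarse GPS fix with a precise position derived from matching sensor observations against the digital map – here is a simple inverse-variance fusion sketch. The numbers, class, and function names are illustrative assumptions, not Continental's implementation.

```python
from dataclasses import dataclass

@dataclass
class PositionEstimate:
    """A position estimate (metres along a reference path) with its variance."""
    value: float
    variance: float

def fuse(a: PositionEstimate, b: PositionEstimate) -> PositionEstimate:
    """Inverse-variance weighted fusion of two independent estimates.

    The more precise source dominates the result, so a good map match
    can carry the localization even when the GPS signal is poor.
    """
    w_a = 1.0 / a.variance
    w_b = 1.0 / b.variance
    value = (w_a * a.value + w_b * b.value) / (w_a + w_b)
    return PositionEstimate(value, 1.0 / (w_a + w_b))

# Example: a noisy GPS fix fused with a precise landmark match against the map.
gps = PositionEstimate(value=105.0, variance=25.0)       # GPS: roughly +/- 5 m
landmark = PositionEstimate(value=100.4, variance=0.25)  # map match: +/- 0.5 m
fused = fuse(gps, landmark)
```

The fused estimate lands close to the precise map-based value and has a lower variance than either input, which is why map data lets the vehicle position itself independently of GPS quality.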
In addition, we have developed an infrastructure sensor concept that detects other vehicles as well as vulnerable road users, even when they are occluded by objects such as a parked truck or a billboard. For this purpose, additional sensors are installed at intersections; they send data from other perspectives to the automated vehicle via radio and thus point out road users that are "invisible" to its own sensors. These technologies are already being tested on public roads in Frankfurt.
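The core of such a concept is merging the vehicle's own object list with the one radioed in from the intersection, without double-counting targets both sides can see. A minimal sketch of that merge step, assuming objects are (x, y) positions in a shared map frame and a simple distance threshold for matching (both assumptions for illustration):

```python
import math

def merge_object_lists(onboard, roadside, match_radius=2.0):
    """Merge onboard detections with objects reported by roadside sensors.

    A roadside object within match_radius metres of an onboard object is
    treated as the same target; otherwise it is added to the picture as a
    road user the vehicle's own sensors cannot see (e.g. behind a truck).
    """
    merged = list(onboard)
    for obj in roadside:
        already_seen = any(math.dist(obj, own) <= match_radius for own in onboard)
        if not already_seen:
            merged.append(obj)
    return merged

onboard = [(10.0, 0.0)]               # vehicle ahead, seen by own sensors
roadside = [(10.5, 0.2), (4.0, 6.0)]  # intersection sensor also sees a pedestrian
objects = merge_object_lists(onboard, roadside)
```

Here the occluded pedestrian at (4.0, 6.0) is added to the vehicle's environment model, while the duplicate report of the vehicle ahead is discarded. Real systems match on tracked state (velocity, class, uncertainty) rather than raw positions.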
Interaction with cyclists and pedestrians on the road is particularly important – especially in confusing situations. For this purpose, we used artificial intelligence and neural networks to develop software that recognizes and interprets gestures such as a cyclist's outstretched arm, which signals an intention to turn.
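The actual system uses neural networks; as a much-simplified stand-in, the interpretation step can be illustrated with a geometric heuristic on pose keypoints (such as those a pose-estimation network would output). All coordinates, thresholds, and function names below are assumptions for illustration only:

```python
import math

def arm_extended(shoulder, wrist, min_reach=0.5, max_slope=0.3):
    """Heuristic: is the arm stretched out roughly horizontally?

    shoulder and wrist are (x, y) keypoints in metres. The arm counts as
    extended if the wrist is far enough from the shoulder and the
    shoulder-wrist line is close to horizontal.
    """
    dx = wrist[0] - shoulder[0]
    dy = wrist[1] - shoulder[1]
    reach = math.hypot(dx, dy)
    if reach < min_reach:
        return False
    return abs(dy) / reach <= max_slope

def turn_intention(left_shoulder, left_wrist, right_shoulder, right_wrist):
    """Map an outstretched arm to the signalled turning direction, else None."""
    if arm_extended(left_shoulder, left_wrist):
        return "left"
    if arm_extended(right_shoulder, right_wrist):
        return "right"
    return None
```

A cyclist with the left arm held out horizontally would yield `"left"`, while both arms hanging down yield `None`. The learned model replaces these hand-tuned thresholds and also handles motion over time, partial occlusion, and ambiguous poses.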
The vehicle communicates internally and externally
Automated vehicles must also send signals themselves – both to their occupants and to other road users. Internal as well as external human-machine interfaces are therefore becoming increasingly important. Our solution: a light strip on the outside of the body that uses light signals to indicate that the vehicle will stop for a pedestrian. At the same time, the occupants receive the same information so that they can understand the reason for braking. A cockpit adapted to automated driving was set up for this purpose. Two simulators developed by Continental demonstrate how people and cars will coexist harmoniously in the city of the future. Both simulators, as well as other innovations from Continental and further research results from the @CITY joint project, will be presented on June 22 and 23, 2022, at the Testing Center in Aldenhoven, Germany.
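The key design idea in the paragraph above is that the external light strip and the internal cockpit display are driven from the same driving decision, so occupants and pedestrians receive consistent information. A minimal sketch of that mapping, with event names and signal patterns that are illustrative assumptions rather than Continental's specification:

```python
def hmi_signals(event: str) -> dict:
    """Map one driving decision to simultaneous external and internal output.

    "light_strip" drives the strip on the vehicle body (visible to other
    road users); "cockpit" is the explanation shown to the occupants.
    Both channels are fed from the same event, so they never disagree.
    """
    table = {
        "yielding_to_pedestrian": {
            "light_strip": "sweep_towards_pedestrian",
            "cockpit": "Stopping: pedestrian crossing ahead",
        },
        "normal_driving": {
            "light_strip": "steady_low",
            "cockpit": "Automated driving active",
        },
    }
    return table.get(event, {"light_strip": "off", "cockpit": "Manual control"})
```

When the vehicle decides to yield, one call produces both the light signal for the pedestrian and the braking explanation for the occupants.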