Internationales Verkehrswesen (iv), ISSN 0020-9511
expert verlag Tübingen
DOI: 10.24053/IV-2023-0099
Volume 75, Collection 2023
Automotive vehicles need an all-round view to minimize traffic fatalities
Kobi Marenko
Chrysler introduced the first driver assistance system in the form of cruise control in 1958. Cruise control was able to automatically control longitudinal acceleration, but it was not able to “see” its surroundings. The automotive industry is now about to make a huge technological leap, with high-definition imaging radar playing a much larger role. However, in order to further support drivers and make road traffic even safer – moving toward Vision Zero – it is necessary for all data from all sensors and from every direction to be processed together. This all-round view creates a 360-degree image of the driving environment that identifies stationary objects and the size and shape of obstacles, thus providing information on whether an area is passable or not.
Keywords: Autonomous sensors, perception radar, safer driving

When automotive vehicles are on the road, they need to be aware of what is happening all around them and to process information from all directions simultaneously, without gaps and without making assumptions. This is a basic requirement for traffic safety. For human drivers, however, it is impossible to process information this way: no matter how skillful they are or how many mirrors are positioned around the car, they cannot see in all directions at the same time, let alone process all of that information at once. This is where autonomous sensors and their accompanying perception algorithms come in. They enable a complete, continuous, and unified understanding of the driving environment.

The challenge so far: cameras offer highly detailed information for perception algorithms, but they do not function well in conditions such as rain, fog, snow, darkness, or glaring lights. Radar operates well in all of those conditions, but until now its low resolution meant it could not deliver data detailed enough to support perception algorithms. Perception radar is the sensor that closes this gap, enabling 360-degree perception in all environments and road conditions for the first time, and it will be the key sensor to supplement optical sensors.

Toward safer driving by merging all-sensor data

In order to merge data from several sensors, the algorithms must first be adapted: tracking, ego-motion estimation, free space mapping, and classification must each be performed multiple times. The challenge is to support the perception algorithms by merging all data from the individual sensors into a continuous, uniform representation of the driving scenario that reliably captures the vehicle’s surroundings.

An example to illustrate: a dense, long traffic jam is difficult to master with a front radar alone. When a rear-facing radar is used in combination with side radars, however, the vehicle gains a complete understanding of the driving environment and of how it is likely to evolve over the course of the journey, including traffic merging into its lane, which matters both for driver assistance and for autonomous driving. The more directions are covered by sensors, the more data can be processed, greatly improving long-range detection, which is essential to reliably complementing optical sensors.
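To make the idea of a unified representation concrete, the sketch below shows the first step of such a merge: transforming detections from individually mounted radars into a common ego-vehicle frame, so that tracking, classification, and free space mapping can each run once on a single 360-degree view instead of once per sensor. This is a minimal illustration, not Arbe’s actual pipeline; all type names, mounting poses, and numbers are assumptions.

```python
import math
from dataclasses import dataclass

# Hypothetical minimal types; real radar stacks expose far richer data
# (Doppler, RCS, elevation, timestamps) than this sketch.

@dataclass
class Detection:
    """A single radar detection in the sensor's own polar frame."""
    r: float        # range in meters
    azimuth: float  # bearing in radians, relative to the sensor boresight

@dataclass
class SensorPose:
    """Mounting pose of one radar in the ego-vehicle frame (assumed values)."""
    x: float    # meters forward of the reference point
    y: float    # meters left of the vehicle centerline
    yaw: float  # boresight direction in radians (0 = straight ahead)

def to_ego_frame(det: Detection, pose: SensorPose) -> tuple[float, float]:
    """Convert one polar detection into Cartesian ego-vehicle coordinates."""
    # Polar -> Cartesian in the sensor frame, then rotate/translate into ego.
    ang = pose.yaw + det.azimuth
    return (pose.x + det.r * math.cos(ang),
            pose.y + det.r * math.sin(ang))

def fuse(frames: dict[str, list[Detection]],
         mounts: dict[str, SensorPose]) -> list[tuple[float, float]]:
    """Merge per-sensor detections into one unified 360-degree point set.

    Downstream perception (tracking, classification, free space mapping)
    then runs once on this fused view instead of once per sensor.
    """
    return [to_ego_frame(d, mounts[name])
            for name, dets in frames.items()
            for d in dets]

if __name__ == "__main__":
    mounts = {
        "front": SensorPose(x=3.8, y=0.0, yaw=0.0),
        "rear":  SensorPose(x=-1.0, y=0.0, yaw=math.pi),
        "left":  SensorPose(x=1.5, y=0.9, yaw=math.pi / 2),
    }
    frames = {
        "front": [Detection(r=38.0, azimuth=0.05)],   # e.g. car ahead
        "rear":  [Detection(r=18.0, azimuth=-0.10)],  # e.g. car approaching
        "left":  [Detection(r=3.5, azimuth=0.0)],     # e.g. adjacent lane
    }
    for x, y in fuse(frames, mounts):
        print(f"object in ego frame at x={x:+.1f} m, y={y:+.1f} m")
```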
A 360-degree view with new abilities

A comprehensive image of the surroundings, such as that provided by 360-degree perception leveraging Arbe’s Phoenix Perception Radar and Lynx Surround Imaging Radar, is critical for a variety of advanced autonomous and ADAS applications. It can detect stationary and moving objects in the following scenarios:

• Crossing traffic at intersections or junctions
• Merging onto a highway while detecting fast traffic approaching from behind and from the side
• Changing lanes on highways by mapping the adjacent lanes
• Backing out of a parking space onto the road

All of these situations require long-range, high-resolution detection with accurate object delineation, the ability to image the vehicle’s surroundings across a large field of view, and free space mapping, as the sketch below illustrates.
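To complement the fusion sketch above, the following is an equally minimal, hypothetical illustration of free space mapping: fused ego-frame detections are rasterized into a coarse grid, and cells observed empty along a radar’s line of sight are treated as candidate free space. The grid size, the resolution, and the simple ray casting are all simplifying assumptions; production systems use probabilistic occupancy grids.

```python
import math

CELL = 1.0   # grid resolution, meters per cell (illustrative)
HALF = 40.0  # the grid covers +/-40 m around the ego vehicle

def cell_of(x: float, y: float) -> tuple[int, int]:
    """Map ego-frame coordinates to grid indices."""
    return int((y + HALF) / CELL), int((x + HALF) / CELL)

def free_space_grid(points: list[tuple[float, float]]) -> list[list[str]]:
    """Build a coarse free-space map from fused ego-frame detections.

    Cells on the line of sight to a detection are marked free ('.'),
    the detection cell itself occupied ('#'); everything else stays
    unknown (' ').
    """
    n = int(2 * HALF / CELL)
    grid = [[" "] * n for _ in range(n)]
    for x, y in points:
        r = math.hypot(x, y)
        steps = max(1, int(r / CELL))
        for i in range(steps):  # cast a ray from the ego origin outward
            t = i / steps
            row, col = cell_of(x * t, y * t)
            if 0 <= row < n and 0 <= col < n:
                grid[row][col] = "."
        row, col = cell_of(x, y)
        if 0 <= row < n and 0 <= col < n:
            grid[row][col] = "#"
    return grid

if __name__ == "__main__":
    # Fused points as produced by the fusion sketch above (assumed values).
    detections = [(38.0, 2.0), (-18.9, 1.8), (1.5, 4.4)]
    grid = free_space_grid(detections)
    # A trivial "is this area passable?" query: count cells observed free.
    print(sum(row.count(".") for row in grid), "cells observed free")
```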
■ Kobi Marenko
CEO, Arbe Robotics, Tel Aviv-Yafo (IL)
info@arberobotics.com