As the number of vehicles worldwide rapidly grows, both traffic accidents and the fatalities they cause are increasing. According to the World Health Organization, more than 1.3 million people die in road accidents every year — roughly one death every 24 seconds.
An increase in the number of elderly drivers is also an emerging concern. In most developed countries, the proportion of drivers over the age of 65 began to rise steadily in the 2000s, increasing by more than 50 percent by 2019. In the U.S., for example, NHTSA counted more than 54.1 million drivers over 65 in 2019, a significant increase of 35% compared to 2010. We need to prepare for the aging of the driver population because of this group's high fatality rate in traffic accidents compared to other age groups. According to a RAND Corporation study of senior driver safety, drivers over 65 have a lower accident rate than the average adult driver but are 573% more likely to die in an accident.
Many countries are devising automobile safety policies to reduce traffic accidents and fatalities, and recommend that automakers reflect these policies in their driving technology. In response, OEMs are expanding the adoption of Advanced Driver Assistance Systems (ADAS) for the safety of drivers and pedestrians. According to data released in 2021 by Strategy Analytics, the global ADAS market already exceeded $20 billion in 2020 and is expected to reach $49.3 billion by 2025, a rapid annual growth rate of 17.7%.
The latest technology trends in driving safety
Advanced Driver Assistance Systems and Autonomous Driving technology consist of three stages: perception, planning, and control. Perception — detecting the vehicle's surrounding environment through sensors — is the most important step in the Autonomous Driving procedure. Vehicles based on ADAS or Autonomous Driving technology carry a variety of sensors, and improving their detection range and accuracy is of the utmost importance for safety.
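The three stages can be sketched as a simple loop. This is only a toy illustration of the perception → planning → control flow described above; the function names, thresholds, and data shapes are all invented for the example and are not part of any real ADAS stack.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str        # e.g. "pedestrian", "vehicle" (illustrative labels)
    distance_m: float

def perceive(sensor_frame):
    """Perception: turn raw sensor data into a list of detections.
    A real system would run a neural network here; we simply unpack tuples."""
    return [Detection(label, dist) for label, dist in sensor_frame]

def plan(detections, cruise_speed_kmh=60.0):
    """Planning: pick a target speed given the closest obstacle.
    Slows down linearly once an obstacle is within 50 m (arbitrary choice)."""
    if not detections:
        return cruise_speed_kmh
    closest = min(d.distance_m for d in detections)
    return cruise_speed_kmh * min(closest / 50.0, 1.0)

def control(target_kmh, current_kmh):
    """Control: a crude proportional throttle/brake command (positive = accelerate)."""
    return 0.1 * (target_kmh - current_kmh)

frame = [("pedestrian", 25.0), ("vehicle", 80.0)]
target = plan(perceive(frame))
print(round(target, 1), round(control(target, current_kmh=60.0), 2))  # -> 30.0 -3.0
```

The point of the sketch is the data flow: perception produces structured detections, planning reduces them to a decision, and control turns the decision into an actuation command.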
Take the camera sensor as an example: our team at StradVision, and the industry as a whole, is focused on increasing field of view and image resolution. Recent achievements include cameras with a 100-degree field of view — significantly improved from the previous 50 degrees horizontally — that can recognize vehicles or pedestrians on both sides of an intersection, and high-resolution cameras of 4 to 8 megapixels that can recognize objects at distances of 200 meters or more, increasing safety on the highway.
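Field of view and resolution trade off against each other: spreading the same pixels over a wider angle leaves fewer pixels on any distant object. A rough back-of-the-envelope calculation, assuming an idealized pinhole camera with pixels spread evenly across the field of view (real lenses differ), with made-up pixel counts:

```python
import math

def pixels_on_target(fov_deg, h_pixels, target_width_m, distance_m):
    """Approximate horizontal pixels covering a target at a given distance,
    under a simple pinhole model. A rough estimate, not a lens formula."""
    target_angle_deg = math.degrees(2 * math.atan(target_width_m / (2 * distance_m)))
    return h_pixels * target_angle_deg / fov_deg

# A 1.8 m-wide car at 200 m, seen by two hypothetical cameras:
narrow = pixels_on_target(fov_deg=50, h_pixels=1920, target_width_m=1.8, distance_m=200)
wide = pixels_on_target(fov_deg=100, h_pixels=3840, target_width_m=1.8, distance_m=200)
print(round(narrow, 1), round(wide, 1))  # -> 19.8 19.8
```

Doubling the field of view from 50 to 100 degrees requires doubling the horizontal pixel count just to keep the same number of pixels on a distant car — which is why wider-angle cameras and higher resolutions are advancing together.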
Technologies that integrate multiple sensors to provide 360-degree detection around the vehicle are also being developed, with the goal of minimizing, and ultimately eliminating, blind spots. Other new features are being introduced as well, such as systems that monitor the driver for drowsiness or inattention, and four-channel surround view monitoring that enables automatic parking and parking assistance.
Augmented reality technology, which overlays driving information onto actual video images acquired through a camera, is also rapidly expanding. In particular, the AR head-up display (AR HUD) places computer graphics for ADAS warnings and navigation information within the driver's field of vision; unlike legacy head-up displays, it overlays critical information directly onto the actual road. StradVision is cooperating with industry leaders on this latest feature to enhance the driver's situational awareness, preventing accidents and providing a safer, more user-friendly driving environment.
Because the overlay naturally extends the driver's field of vision, situational awareness improves greatly. AR HUDs can flag threats by displaying them directly within the driver's line of sight, and because AR graphics are drawn over the real objects themselves, the driver can immediately recognize a threat and react quickly, for example by braking for a road obstacle. Displaying ADAS warnings in this way significantly improves the driver's reaction time and situational awareness, especially during night driving or in low-visibility conditions.
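Anchoring a warning graphic to a real object comes down to projecting a 3-D point in front of the vehicle onto the display. A minimal sketch using a pinhole projection; the focal length, image center, and coordinate convention are illustrative assumptions, not parameters of any real HUD:

```python
def project(point_xyz, focal_px=1000.0, cx=960.0, cy=540.0):
    """Project a point (x right, y down, z forward, in metres) in the camera
    frame to pixel coordinates, using an idealized pinhole model."""
    x, y, z = point_xyz
    if z <= 0:
        raise ValueError("point is behind the camera")
    return (cx + focal_px * x / z, cy + focal_px * y / z)

# An obstacle 40 m ahead and 2 m to the right; the road surface sits 1.2 m
# below the camera, i.e. y = +1.2 in this y-down convention.
u, v = project((2.0, 1.2, 40.0))
print(u, v)  # -> 1010.0 570.0 — the warning graphic is drawn at this pixel
```

A production AR HUD additionally compensates for the driver's eye position and the windshield optics, but the core idea — mapping world coordinates to display coordinates every frame — is the same.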
Various new technologies are also being introduced to improve the recognition performance of sensors. One garnering much attention is deep learning, a technology central to StradVision's SVNet software. Introducing deep learning into the vehicle's object recognition software allows it to accurately detect the complex environmental information collected by sensors and to learn and adapt to driving environments that vary by country or region.
Sensor Fusion is another area where StradVision has been a pioneer in the Autonomous Vehicle industry, and it is one of the essential technologies for achieving the safe driving environment the automotive industry ultimately pursues. This technology combines the versatility of a camera — which identifies information about objects such as shape and color — with the precision of Lidar, which measures the distance to an object with a margin of error within a few millimeters. Combined, they recognize the surrounding environment more precisely than existing single-sensor solutions.
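At its simplest, this kind of fusion means attaching a Lidar-measured distance to a camera detection. The toy sketch below assumes the Lidar points have already been projected into the camera image (a calibration step omitted here), and all coordinates and ranges are invented for illustration:

```python
from statistics import median

def fuse(detection_box, lidar_points):
    """detection_box: (u_min, v_min, u_max, v_max) in pixels, from the camera.
    lidar_points: list of (u, v, range_m) already projected into the image.
    Returns the median range of points inside the box, or None if no points hit."""
    u0, v0, u1, v1 = detection_box
    ranges = [r for u, v, r in lidar_points if u0 <= u <= u1 and v0 <= v <= v1]
    return median(ranges) if ranges else None

box = (400, 300, 500, 380)                     # camera detected a vehicle here
points = [(410, 310, 42.1), (460, 350, 41.9),  # Lidar returns on the vehicle
          (455, 340, 42.0), (700, 200, 15.0)]  # last point falls elsewhere
print(fuse(box, points))  # -> 42.0
```

Taking the median makes the estimate robust to stray points, such as returns from the background that happen to fall inside the box. Real fusion pipelines go much further — temporal tracking, uncertainty modeling — but the complementary split is the same: the camera says *what* the object is, the Lidar says *how far* it is.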
At StradVision, we believe that Sensor Fusion can dramatically push past the limitations of existing perception technology, and we are working closely with the industry to prove our technology and provide the most advanced solutions.
Bringing innovative safety tech to the masses
The ultimate goal for governments and the automobile industry worldwide is to find policy and technical solutions that reduce traffic accidents and fatalities, especially among aging populations. Delivering the benefits of ADAS and Autonomous Driving technologies to a wider public will be key to achieving this goal.
First, governments should establish legal standards for introducing and expanding automotive safety technology. Frameworks such as the General Safety Regulation and the New Car Assessment Program, which set standards for the latest technologies from the stage of new car development, should be extended, and automotive manufacturers and their partners worldwide must meet them.
In addition, the automotive industry should strive to supply ADAS and Autonomous Driving technologies that guarantee human safety at lower prices, improving both the performance of the latest safety technology and its efficiency and unit cost. For example, centralization — connecting multiple sensors to a single Electronic Control Unit — can deliver improved performance and features at a lower unit cost.
In particular, as adopting advanced hardware and software technologies grows ever more important, an open development ecosystem among automobile manufacturers, semiconductor manufacturers, and software developers is needed to strengthen collaboration and lower development costs from the initial stage. A new system for follow-up services is also needed: by utilizing over-the-air (OTA) technology, common in mobile devices such as smartphones, vehicle software updates, new features, and bug fixes can be delivered without incurring large-scale costs.
Junhwan Kim is CEO of StradVision, an automotive vision processing software supplier.