How many senses do you need to drive a car?

The recent AutoSens conference in Detroit made me question whether I should surrender my driver’s license. The response OEMs in attendance gave to every discussion about the benefits of RGB cameras, ultrasound, radar, lidar, and thermal sensors was a unanimous “We probably need all of them in some combination” to make autonomy a reality in the automotive sector. Together, these sensors are so much better than my eyes and ears. Technological progress in advanced driver assistance systems (ADAS) and autonomous driving is rapid, but we aren’t there holistically yet. So I’ll keep my license a while longer.

I traveled to the Motor City to speak on a panel hosted by Ann Mutschler focusing on aspects of the design supply chain, alongside Siemens EDA and GlobalFoundries. Ann’s article “Automotive Relationships Changing With Chiplets” sums up the panel well. The conference was a great experience, as the networking let us talk to the entire design chain: from OEMs through tier-1 system vendors, tier-2 semiconductor companies and software developers, all the way down to tier-3 suppliers like us in semiconductor IP. Since the panel featured a foundry, an IP vendor, and an EDA vendor, our discussion quickly focused on chiplets.

On the detection side, Owl.Ai CEO and co-founder Chuck Gershman gave an excellent presentation summarizing the problem the industry is trying to solve: 700,000 annual pedestrian fatalities worldwide, a 59% rise in pedestrian deaths over the last decade in the United States, and 76% of fatal pedestrian accidents occurring at night. Government regulations for nighttime pedestrian safety are coming around the world. Owl.Ai and FLIR showcased thermal imaging technologies, motivated by the fact that only 1 in 23 vehicles passed all parts of a nighttime IIHS test using cameras and radar, and by RGB image sensors being unable to see in complete darkness (just like me, I should say, but I still keep my license).

Source: Owl.Ai, AutoSens 2023, Detroit

Chuck nicely introduced the four critical steps: “detect” (is something there?), “classify” (is it a person, a car, or a deer?), “distance estimation” (how many meters away is the object?), and “take action” (warn the driver or act automatically). I liked the Owl.Ai slide above, which shows the use cases and flaws of the various detection methods. And in the discussions I had at the conference, OEMs agreed that more sensors are needed.
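The four steps compose naturally into a processing pipeline. Here is a minimal sketch of that flow; the stage names follow the talk, but the function signatures, the dictionary-based frame, and the 20-meter braking threshold are illustrative assumptions of mine, not anything Owl.Ai presented:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    label: str         # e.g. "person", "car", "deer"
    distance_m: float  # estimated distance to the object in meters

def detect(frame: dict) -> bool:
    """Step 1: is something there? (stub for a sensor-fusion detector)"""
    return bool(frame)

def classify(frame: dict) -> str:
    """Step 2: what is it? (stub for a classifier)"""
    return frame.get("label", "unknown")

def estimate_distance(frame: dict) -> float:
    """Step 3: how many meters away is the object?"""
    return frame.get("distance_m", float("inf"))

def take_action(det: Detection, brake_threshold_m: float = 20.0) -> str:
    """Step 4: warn the driver or act automatically."""
    if det.label == "person" and det.distance_m < brake_threshold_m:
        return "automatic_brake"
    return "warn_driver"

def pipeline(frame: dict) -> Optional[str]:
    """Run detect -> classify -> estimate distance -> take action."""
    if not detect(frame):
        return None
    det = Detection(classify(frame), estimate_distance(frame))
    return take_action(det)

print(pipeline({"label": "person", "distance_m": 12.0}))  # prints: automatic_brake
```

In a real ADAS stack each stub would be a fused multi-sensor model, which is exactly why the slide’s comparison of detection methods matters: every stage fails differently for RGB, radar, lidar, and thermal inputs.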

Regarding the transition from L3 driving to L4 robotaxis, Abdullah Zaidi of Rivian showed the slide below outlining the different needs for cameras, radar, and lidar, as well as the computational requirements.

Source: Rivian, AutoSens 2023, Detroit

No wonder the automotive sector is such an attractive space for semiconductors. Data processing, tracking, and transport requirements are growing enormously. And keep in mind that the image above doesn’t even mention additional cameras for in-cabin monitoring.

Beyond compute requirements, data transport is central to my day-to-day work. In one of his slides, Konstantin Fichtner of Mercedes-Benz AG showed that the DrivePilot system records 33.73 GB of trigger measurements per minute, 281 times as much data as watching a 4K Netflix stream takes. That is a lot of data to transport across networks-on-chip (NoCs), between chips, and between chiplets. And, of course, it raises the question of internal versus external processing.
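The 281× comparison is easy to sanity-check. A quick back-of-the-envelope calculation, where the roughly 16 Mbit/s figure for a 4K stream is my assumption (in the range commonly quoted for 4K streaming), not a number from the talk:

```python
# Back-of-the-envelope check of the DrivePilot data-rate comparison.
# Assumption (mine, not from the talk): a 4K Netflix stream at ~16 Mbit/s.
drivepilot_gb_per_min = 33.73
netflix_mbit_per_s = 16.0

# Convert the stream rate to GB per minute (decimal units: 1 GB = 8000 Mbit).
netflix_gb_per_min = netflix_mbit_per_s * 60 / 8000  # 0.12 GB/min

ratio = drivepilot_gb_per_min / netflix_gb_per_min
print(f"{ratio:.0f}x")  # prints: 281x
```

With that assumed bitrate the ratio lands right on 281, so the slide’s comparison is internally consistent.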

Have we arrived? Not really, but we’re getting close. On the final day of the conference, Prof. Philip Koopman of Carnegie Mellon University treated the audience to his keynote address, “Defining Safety for the Shared Responsibility of Computer and Human Drivers.” His talk walked the audience through the liability dilemma when a highly automated vehicle crashes and offered some constructive suggestions for updating state laws. In their recently published essay “Winning the Imitation Game: Setting Safety Expectations for Automated Vehicles,” Prof. Koopman and William H. Widen of the University of Miami School of Law suggest that lawmakers modify existing laws to create a new legal category of “computer driver,” allowing a plaintiff to bring a negligence claim.

As if to underline that exact point, the universe served up the situation below the very next day, which I snapped a picture of. Can you spot what’s wrong here?

Source: Frank Schirrmeister, May 2023

Yes, a ghost pedestrian on the car’s display.

In the technology’s defense, a jaywalking pedestrian had crossed about 30 seconds earlier, so the system probably erred on the side of caution. Still, this was a good reminder that future sensors will hopefully be better than my eyes, and a thermal sensor would have helped here, too.

Kidding and glitches aside, let’s not forget the ultimate goal: reducing fatal traffic accidents. And as I joked in my latest ISO 26262 safety blog, “How Safe is Safe Enough”: if aliens came and figured out how to reduce traffic-related deaths, they would definitely take humans off the streets.

Brave new autonomous world, here we come. And I’ll keep my license. That 1997 Miata doesn’t drive itself!

Frank Schirrmeister


Frank Schirrmeister is Vice President of Solutions and Business Development at Arteris. He leads business in the automotive, data center, 5G/6G communications, mobile, and aerospace industries, and in the horizontal technologies of artificial intelligence, machine learning, and security. Prior to Arteris, Schirrmeister held various senior leadership positions at Cadence Design Systems, Synopsys, and Imperas, focusing on product management and marketing, solutions, strategic ecosystem partner initiatives, and customer engagement.

