
Experts Debate Optimal Sensor Integration for Next‑Gen ADAS and Autonomous Vehicles

Implementing sensing technology for assisted‑driving and fully autonomous vehicles is a complex, evolving challenge. Instead of a single “best” approach, experts identified six core considerations that will shape how manufacturers integrate sensors in tomorrow’s cars.

The panelists—Patrick Denny, senior expert for vision systems and ADAS at Valeo; Paul‑Henri Matha, technical leader at Volvo Car Corp.; Robert Stead, managing director at Sense Media Group; and Carsten Astheimer, director of design firm Astheimer Ltd.—discussed the right sensor mix and how design must never compromise safety, and vice versa, at the closing session of the AutoSens Brussels 2020 virtual conference. EE Times Europe also sought insights from Pierrick Boulay, technology and market analyst at Yole Développement, on the adoption and usage of various sensor types in automotive systems.

Getting the number right

Modern cars already carry an impressive array of sensors—ultrasonic units, radar, multiple cameras (for perception, imaging, and vision), and LiDAR. Yole’s Boulay estimates that a typical vehicle today carries between 10 and 20 sensors, with high‑end models packing more than mid‑tier or budget options.

As vehicles climb the automation ladder, the sensor count is projected to rise to roughly 35–40 units. Boulay explains that each automation level will demand distinct short‑, mid‑ and long‑range capabilities, making a single sensor type insufficient for all scenarios.

However, adding sensors multiplies the data volume, straining the vehicle’s computing stack. Boulay notes that while legacy ADAS chips deliver between 0.25 and 2.5 TOPS (on the order of ten times the performance of a high‑end laptop), robotic platforms are already operating beyond 250 TOPS. He envisions a shift from distributed architectures to centralized domain controllers that can fuse raw sensor data efficiently.

Cost and integration constraints will eventually cap the sensor count. Boulay observes that some OEMs, such as Tesla, are achieving high automation levels with fewer sensors by leveraging advanced AI and computing power. “Software and processing capability will become the differentiator,” he says.

Optimizing the mix

Weather and lighting fluctuations demand a diverse sensor portfolio. Denny stresses that redundancy is essential: “When you’re in complete darkness or heavy precipitation, you need multiple modalities working in concert.”

Cameras excel in daylight, whereas radar, LiDAR and ultrasonic sensors maintain performance in low‑light, fog, rain or snow. Boulay notes that cameras become “blind” under adverse conditions, whereas other sensors can still provide reliable perception.

Ensuring the right placement

Just like human senses, sensors must be strategically positioned to maintain continuous environmental awareness. Technical constraints—such as condensation in headlights, frost on windshields, or paint covering ultrasonic transducers—can degrade performance. Denny cautions that ultrasonic units are sensitive to surface coatings.

Volvo’s Matha highlights power consumption as a critical factor. “Each sensor draws between 1 and 10 W, so a full ADAS suite can reach 100–200 W, contributing up to 4 g of CO₂ per kilometer.” He suggests intermittent activation and advanced cooling to mitigate thermal limits, especially in high‑temperature zones like behind the windshield.
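Matha’s figures can be checked with back‑of‑the‑envelope arithmetic. The sketch below is purely illustrative: the per‑sensor draws and sensor counts are hypothetical values chosen within the article’s 1–10 W range, and the watts‑to‑CO₂ conversion factor is an assumption for illustration (it is consistent with the article’s quoted numbers, i.e. roughly 4 g/km at a 200 W load, but is not stated in the article).

```python
# Back-of-the-envelope ADAS power budget.
# Per-sensor draws and counts below are illustrative assumptions,
# chosen within the article's quoted 1-10 W per-sensor range.
SENSOR_DRAW_W = {
    "camera": 3.0,
    "radar": 6.0,
    "ultrasonic": 1.0,
    "lidar": 10.0,
}
SUITE = {"camera": 8, "radar": 6, "ultrasonic": 12, "lidar": 2}

# Assumed conversion: extra electrical load -> tailpipe CO2 in a
# combustion car. 0.02 g/km per watt reproduces ~4 g/km at 200 W,
# matching the article's figures; the factor itself is our assumption.
CO2_G_PER_KM_PER_W = 0.02

total_w = sum(SENSOR_DRAW_W[s] * n for s, n in SUITE.items())
co2_g_per_km = total_w * CO2_G_PER_KM_PER_W

print(f"Suite draw: {total_w:.0f} W")            # 92 W for this mix
print(f"Estimated CO2: {co2_g_per_km:.2f} g/km") # 1.84 g/km
```

With this hypothetical 28‑sensor mix the suite draws just under 100 W, the low end of Matha’s 100–200 W estimate; swapping in more LiDARs or higher‑draw radars quickly pushes the total toward the upper bound.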

Simulation and real‑world testing are essential to pinpoint optimal sensor locations. Boulay points out that highway‑level LiDARs are typically mounted centrally, aligning with long‑range radar and primary cameras, while city‑driving or parking sensors may be placed on the sides or corners.

Integrating aesthetically

Volvo’s XC90 currently houses some 20 sensors, many of which are concealed—forward cameras in the grille, side cameras in mirrors, and a rear camera above the registration plate. Matha emphasizes that seamless integration can enhance both form and function.

Astheimer argues that the most advanced cars should visibly reflect their intelligence. As autonomy advances, sensors like 360° LiDARs will need prominent, unobstructed placement. He cites the Deliver‑E electric delivery vehicle, where cameras flank the sides and a LiDAR crowns the rear, aligning with the vehicle’s identity.

Boulay discusses the Magneti Marelli Smart Corner pod, which can house LiDAR, radar, cameras, ultrasonics and LED lighting. While manufacturing integration is streamlined, repair costs after collisions could be high, underscoring the need for a balanced approach.

Reducing cognitive overload

The human‑machine interface must translate sensor data into clear, actionable information. Astheimer, working on Volta Trucks’ Zero electric delivery truck, notes that heavy vehicles—though only 4 % of traffic—cause over 50 % of vulnerable‑road‑user fatalities. Cognitive overload, alongside limited visibility, is a major contributor.

He stresses that ECUs and CAN systems should filter signals and present them via tactile, audio or visual cues to maintain driver vigilance.

Making safety cool

Stead posed a question: “Is making safety cool the key to selling connected cars?” Matha responds that safety is the core business, and the best way to communicate it is through elegant design that showcases sensor‑driven intelligence.

Astheimer warns that as vehicles assume more tasks, drivers may become complacent. The sensor suite must keep the driver informed and engaged, preventing a “cocooning” effect that reduces road awareness.


A new book, AspenCore Guide to Sensors in Automotive: Making Cars See and Think Ahead, brings together leading thinkers in safety and automotive engineering to chart progress and highlight remaining challenges. It’s available now at the EE Times bookstore.

>> This article was originally published on our sister site, EE Times.
