A new way of interpreting autonomous vehicle sensor data is on the horizon, and it promises to be faster than ever.
- Hyundai and IonQ, a Maryland-based leader in quantum computing technology, are working on new ways to interpret data from autonomous vehicle sensors.
- Quantum computers are being designed to perform object detection tasks on three-dimensional data from autonomous vehicles.
- This technology has the potential to greatly speed up the processing of real-world data gathered by the sensors in cars with advanced levels of autonomy.
A number of autonomous developers, with the notable exception of Tesla, have embraced the rapidly advancing Lidar (light detection and ranging) technology as part of SAE Level 2 driver-assist systems and higher levels.
The principle behind Lidar is relatively simple: A pulsed laser emitter scans its surrounding area, emitting millions of pulses per second to paint a three-dimensional picture for the driver-assist or higher-level autonomous system to interpret, detecting objects up to a few hundred yards away. Onboard hardware and software then decide what to actually do with that data when making automatic driving decisions.
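The time-of-flight principle described above can be sketched in a few lines. This is an idealized illustration, not any sensor vendor's actual API; the function name and the assumption of a clean single echo are ours.

```python
# Minimal sketch of the Lidar time-of-flight principle: a pulse travels to an
# object and back, so the distance is half the round trip at the speed of light.
C = 299_792_458.0  # speed of light in a vacuum, m/s

def range_from_echo(round_trip_seconds: float) -> float:
    """Distance (in meters) to a reflecting object, from a pulse's round-trip time."""
    return C * round_trip_seconds / 2.0

# An echo returning after ~1 microsecond came from an object roughly 150 m away,
# comfortably within the few-hundred-yard range cited above.
print(round(range_from_echo(1e-6), 1))  # ~149.9 m
```

A real sensor repeats this calculation for millions of pulses per second, each fired at a known angle, which is what builds the three-dimensional point cloud the driving software interprets.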
Half a decade ago these Lidar sensors looked like spinning aluminum cans mounted on vehicle roofs, but they have now gone solid-state and look like small sensor pods at the leading edges of vehicle roofs, just ahead of the windshield.
Lidar sensors and technology have made great strides over the past decade, to the point that every Level 3 system and higher currently on sale or about to arrive on the market relies on Lidar. Still, there is room for the technology to grow.
Hyundai and Maryland-based quantum computing developer IonQ have recently revealed the next stages of their partnership, applying IonQ’s quantum computers to image processing that is hoped to perform object detection tasks on 3D data from autonomous vehicles.
Specifically, the companies are looking to use IonQ's quantum computers for image processing: images of road signs are encoded into a quantum state for classification and object detection, with the goal of greatly speeding up the system's recognition of objects, cars, people, and buildings along the road.
“Quantum machine learning techniques being investigated at IonQ have shown the potential to learn faster, be more effective in recognizing edge cases, generalize better, learn from lower resolution or noisy data, and capture complex correlations with a far lower number of parameters,” the company says. “These deep technical advantages can ultimately lead to quicker, safer and more accurate decisions without user input.”
Hyundai is also working with IonQ on battery chemistry for EVs, using quantum computers to simulate the electrochemical reactions of different metal catalysts and to analyze the lithium compounds used in battery cells.
“Autonomous vehicles are still in their infancy, yet the quantum-derived algorithms we’re testing today have the potential to shape the commerciality, efficiency, and safety of such systems,” said Jungsang Kim, co-founder and CTO of IonQ.
At the moment, most of the focus among autonomous developers is on miniaturizing solid-state Lidar systems while also making them less expensive to manufacture. But it's refreshing to see companies rethinking how Lidar sensors perceive the world around them in the context of Level 3 and Level 4 systems, both of which are nearly ready for market launch.
Jay Ramey grew up around very strange European cars, and instead of seeking out something reliable and comfortable for his own personal use he has been drawn to the more adventurous side of the dependability spectrum.