Utah Business


These three companies are making self-driving cars safer

With new thermal imaging, AI, and quantum computing, you'll never have to wonder if self-driving cars are safe again.

Before America had seatbelts or the Interstate Highway System, before Elon Musk was even a twinkle in Silicon Valley’s eye, the dream of fully autonomous vehicles was revving its engine. General Motors unveiled its Futurama exhibit at the 1939 World’s Fair, offering visions of a utopian tomorrow. Tens of thousands of eager visitors lined up to ride around in a model of the city of the future, which imagined a world of 14-lane highways, flying vehicles, and, you guessed it—cars that drove themselves.

We’ve come a long way in the past eight decades when it comes to autonomous vehicles. Tesla Autopilot, Ford BlueCruise, and GM Super Cruise are commercially available systems that offer some autonomous features, such as steering, braking, and accelerating in limited conditions. Waymo, which grew out of Google’s self-driving car project, currently has self-driving taxis that can operate autonomously in certain circumstances. And TuSimple has a fleet of self-driving semi trucks in operation now along supply routes in some regions.

But those caveats—in limited conditions, in certain circumstances, in some regions—are signs that fully autonomous vehicles are not yet a reality. There’s still a long road to travel before consumers can nap or watch a blockbuster on their evening commute. Despite decades of advancement, autonomous vehicles still aren’t quite safe enough for the general public. Federal regulations designed specifically for autonomous passenger vehicles were only put in place this year. Waymo, Tesla, and TuSimple vehicles have all been involved in crashes that make consumers uneasy. While elements of automation promise to alleviate human errors that lead to injury or death on the road, the tech isn’t quite there yet.

That’s where companies like Quantum Computing, Inc. and Elphel come in to close the gap between the dream of autonomous vehicles and reality.

“The world is complex and constantly changing,” says Michael Kohen, founder and CEO of a New York-based company that grew out of Zoox, the autonomous vehicle company acquired by Amazon. “It’s really difficult to build an AI system that works well 100 percent of the time. You wouldn’t get into a self-driving car that was 95 percent accurate. That last five percent is ‘make or break’ for most autonomous applications.”

Because AI lacks cognition—that human ability to use common sense, intuition, previous experience, and learning to make split-second decisions—self-driving cars continue to come up against situations they don’t understand. This confusion can lead to stalls and delays at best and crashes or malfunctions at worst. Kohen’s company solves this problem by making human cognition available to AI in real time.

Say an autonomous vehicle encounters an unfamiliar section of the road. It detects orange flags and cones everywhere, people waving their arms in the middle of the street, and unfamiliar signage. The car is confused and doesn’t know how to operate in this situation. With the company’s technology, the vehicle can call a human to assess the situation. This mission specialist can identify that the car is in a construction zone in just a few seconds and decide on the safest path forward.

It’s not just self-driving cars that can benefit from occasionally putting humans back in the driver’s seat. Kohen believes that empowering artificial intelligence with human cognition can be a game-changer for any industry trying to overcome plateaus in their AI.

One of the areas where autonomous vehicles outshine human drivers is in their superior senses. These days, autonomous vehicles are equipped with an array of cameras, radar, and LiDAR (light detection and ranging) sensors, which allow them to navigate conditions that would be tricky for humans, such as driving through a fog bank or navigating a pitch-black highway. However, the number of sensors required for an autonomous vehicle to have total coverage of road conditions and complete situational awareness has been inefficient and cost-prohibitive.

“Now, sensor capabilities can go beyond what the average human can do.”

That is, until now. During BMW’s Quantum Computing Challenge, Virginia-based quantum computing company QCI reduced the number of sensors needed for nearly total situational awareness in autonomous vehicles from 400 to just 15. Fewer sensors enable autonomous car manufacturers to create safer vehicles that are also more efficient and cost-effective.

“Now, sensor capabilities can go beyond what the average human can do,” says Bob Liscouski, CEO of QCI. “And quantum sensing can be one of the factors that get us there. Quantum computers have the ability to unlock things that people only dreamed about. I remember, as a kid, driving along the highway and wondering if I could just put a sensor in the car and just let the car drive by itself. And just look at us now.”

One type of sensor that is key to safe driving in low-visibility conditions is the thermal camera. These cameras can detect heat signatures, which allows them to see in the dark and through fog, smoke, and dust. They’ve been instrumental in increasing the safety of autonomous vehicles, especially due to their skill at detecting humans and animals on the road.

However, thermal cameras have their limitations. They can be expensive, need a lot of power, and often produce low-resolution images. This is where Elphel, a Utah-based, open-source imaging solutions company, comes in.

Elphel invented a new kind of 3D thermal imaging using long-wave infrared (LWIR) sensors, for which it won a 2022 Utah Innovation Award in the hardware and electronics category. This new technology makes it possible for autonomous vehicles to perceive the road ahead more accurately and have more time to detect and respond to potential hazards.

Olga Filippova, VP of Elphel, believes this technology may help put autonomous vehicles on the road and ease our current supply chain issues. “One of the first applications would be for trucking,” she says. “Semis require a longer distance for braking. They are also constantly on the road—driving in all conditions, through snow, dust, fog, and darkness. Our technology could help put more autonomous trucks on the road with the kind of sophisticated vision that would feel safe for everyone.”

Filippova says it’s becoming easier to see a future in which autonomous cars will feel safe for all. Kohen and Liscouski share this vision, though they recognize the path may be winding.

“Advancements in AI and quantum computing are accelerating the autonomous vehicle industry,” Liscouski says. “We’re doing things that we never thought we could do, and it’s important to remember that this progress can have both positive and negative impacts.”

One trend Kohen is particularly excited about is that companies are more honest now about what it will take to get autonomous vehicles and other robotic applications to market. “They’re admitting the capabilities are not quite there, not yet,” he says.

Filippova is unsure when autonomous cars will take off and is already looking ahead to the next innovation in consumer transportation. She paints a picture of the future lifted from the 1939 World’s Fair: “We’re working with NASA, who predicts that in ten years, we all will be in flying cars.”