Uber’s self-driving cars at its offices in Pittsburgh, Pennsylvania. Image Credit: AFP

On Sunday night, a woman died after she was hit by a self-driving car operated by Uber in Tempe, Arizona.

Elaine Herzberg, 49, was walking her bicycle outside the crosswalk on a four-lane road in the Phoenix suburb of Tempe about 10pm local time on Sunday when she was struck by the Uber vehicle travelling at about 65 km per hour, police said. The Volvo XC90 SUV was in autonomous mode with an operator behind the wheel, according to police.

Uber is one of many companies testing this kind of vehicle in Arizona, California and other parts of the country. Waymo, the self-driving car company owned by Google’s parent company, Alphabet, has said it is operating autonomous cars on the outskirts of Phoenix without a safety driver behind the wheel. On Monday, Uber said it was halting tests in Tempe, Pittsburgh, Toronto and San Francisco.

Television footage shows investigators at the scene of the fatal accident involving a self-driving Uber car in Tempe, Arizona, on Monday. (AP)

Here is a brief guide to the way these cars operate.

How do these cars know where they are?

When designing these vehicles, companies like Uber and Waymo begin by building a three-dimensional map of a place. They equip ordinary automobiles with lidar sensors — “light detection and ranging” devices that measure distances using pulses of light — and as company workers drive these cars on local roads, these expensive devices collect the information needed to build the map.

Once the map is complete, cars can use it to navigate the roads on their own. As they do, they continue to track their surroundings using lidar, and they compare what they see with what the map shows. In this way, the car gains a good idea of where it is in the world.
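The map-matching idea described above can be illustrated with a toy sketch in Python. The landmark coordinates, the scan and the grid of candidate positions here are all invented for illustration; a real car matches millions of lidar points against a dense 3-D map, but the principle of scoring candidate positions by how well the live scan lines up with the map is the same:

```python
import math

# Toy "map": lidar landmarks recorded during the mapping drive (x, y in metres).
MAP_POINTS = [(0.0, 5.0), (4.0, 5.0), (8.0, 5.0), (8.0, 0.0)]

# A live lidar scan, expressed relative to the car. If the car were at (2, 0),
# the landmark at (4, 5) would appear at (2, 5) in the car's frame, and so on.
scan = [(-2.0, 5.0), (2.0, 5.0), (6.0, 5.0), (6.0, 0.0)]

def match_score(pose, scan, map_points):
    """Sum of distances from each scan point (shifted by the candidate
    pose) to its nearest map point. Lower means a better match."""
    total = 0.0
    for sx, sy in scan:
        gx, gy = sx + pose[0], sy + pose[1]
        total += min(math.hypot(gx - mx, gy - my) for mx, my in map_points)
    return total

# Try a grid of candidate positions and keep the one whose shifted scan
# best lines up with the map.
candidates = [(x * 0.5, y * 0.5) for x in range(9) for y in range(3)]
best = min(candidates, key=lambda p: match_score(p, scan, MAP_POINTS))
print(best)  # → (2.0, 0.0): the scan lines up perfectly at that position
```

Real systems also estimate the car's heading and refine the pose continuously rather than searching a coarse grid, but this captures why the prebuilt map matters: the scan alone says nothing about where the car is until it is compared against the map.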

Lidar also alerts the cars to nearby objects, including other cars, pedestrians and bicyclists.

Is that the only important technology?

Lidar works pretty well, but it can’t do everything. It provides information only about objects that are relatively close, which limits how fast cars can drive. Its measurements are not always sharp enough to distinguish one object from another. And when multiple autonomous vehicles drive the same road, their lidar signals can interfere with one another.

Even in situations where lidar works well, these companies want backup systems in place. So most driverless cars are also equipped with a variety of other sensors.

Like what?

Cameras, radar and global positioning system antennas, the kind of GPS hardware that tells your smartphone where it is.

With the GPS antennas, companies like Uber and Waymo are providing cars with even more information about where they are. With cameras and radar sensors, they can gather additional information about nearby pedestrians, bicyclists, cars and other objects.
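Combining these overlapping sources is a classic estimation problem. As a hedged illustration (the numbers are invented, and production systems use full Kalman or particle filters over many dimensions), here is the one-dimensional version of how a noisy GPS fix and a tighter lidar-based fix can be merged, each weighted by how much it is trusted:

```python
def fuse(est_a, var_a, est_b, var_b):
    """Combine two noisy position estimates, weighting each by the
    inverse of its variance (a one-dimensional Kalman-style update)."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# GPS says the car is 105.0 m along the road, but is noisy (variance 25).
# Lidar map matching says 100.4 m and is much tighter (variance 1).
pos, var = fuse(105.0, 25.0, 100.4, 1.0)
print(round(pos, 2), round(var, 2))  # → 100.58 0.96
```

The fused estimate leans heavily on the lidar fix, and its variance is smaller than either input's, which is the whole point of carrying redundant sensors.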

Cameras also provide a way to recognise traffic lights, street signs, road markings and other signals that cars need to take into account.

While technologies ranging from advanced sensors to machine learning are being applied to self-driving cars, engineers acknowledge that even the most sophisticated vehicles will not be infallible. Developers of self-driving cars from Nissan to Zoox say backup technology, such as remote human assistance, may be needed to address “edge cases” — the unique situations that software programs can’t anticipate. A fallen tree, a sinkhole, a string of strange pylons, a flash flood, a fire or some other obstruction on a lonely road could make an autonomous car stop safely, but then what?

How do the cars use all that information?

That is the hard part. Sifting through all that data and responding to it require a system of immense complexity.

In some cases, engineers will write specific rules that define how a car should respond in a particular situation. A Waymo car, for example, is programmed to stop if it detects a red light.
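A hand-written rule of this kind is, at heart, just a conditional. The sketch below is purely illustrative (the function name, states and thresholds are assumptions, not anything from Waymo's actual software), but it shows why this approach is easy for the simple cases and hopeless for the long tail of situations a car might meet:

```python
def traffic_light_rule(light_state, speed_mps):
    """A hand-written driving rule: stop for a red light, and for a
    yellow light only if the car is moving slowly enough to stop
    comfortably. Every other case must be covered by some other rule."""
    if light_state == "red":
        return "brake"
    if light_state == "yellow" and speed_mps < 5.0:
        return "brake"
    return "proceed"

print(traffic_light_rule("red", 10.0))    # → brake
print(traffic_light_rule("green", 10.0))  # → proceed
```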

But a team of engineers could never write rules for every situation a car could encounter. So companies like Waymo and Uber are beginning to rely on “machine learning” systems that can learn behaviour by analysing vast amounts of data describing the country’s roadways.

Waymo now uses a system that learns to identify pedestrians by analysing thousands of photos that contain people walking or running across or near roads.
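The learned alternative can be sketched in miniature. Waymo's real system is a deep neural network trained on images; the toy below swaps that for the simplest possible "learned" model, a nearest-neighbour classifier over a handful of invented examples, just to show the contrast with hand-written rules: nobody writes "a pedestrian is 1.7 m tall", the labelled data defines the categories.

```python
import math

# Toy labelled examples, a stand-in for the thousands of photos the
# article describes. Features: (height_m, width_m, speed_mps).
TRAINING = [
    ((1.7, 0.5, 1.4), "pedestrian"),
    ((1.6, 0.6, 2.0), "pedestrian"),
    ((1.5, 4.5, 13.0), "car"),
    ((1.4, 4.2, 0.0), "car"),
]

def classify(features):
    """Label an observed object after its nearest labelled example,
    a 1-nearest-neighbour classifier: behaviour comes from the data,
    not from hand-written rules."""
    _, label = min(TRAINING, key=lambda ex: math.dist(ex[0], features))
    return label

print(classify((1.65, 0.55, 1.2)))  # → pedestrian
print(classify((1.45, 4.4, 8.0)))   # → car
```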

Is that the kind of thing that broke down in Tempe?

It is unclear what happened in Tempe. But these cars are designed so that if one system fails, another will kick in. In all likelihood, the Uber cars used lidar and radar as well as cameras to detect and respond to nearby objects, including pedestrians.

Self-driving cars can have difficulty duplicating the subtle, non-verbal communication that goes on between pedestrians and drivers. An autonomous vehicle, after all, can’t make eye contact with someone at a crosswalk.

“It is still important to realise how hard these problems are,” said Ken Goldberg, a professor at the University of California, Berkeley, who specialises in robotics. “That is the thing that many don’t understand, just because these are things humans do so effortlessly.”

The crash occurred at night. Is that a problem?

These cars are designed to work at night, when some sensors can operate just as well as they can in the daytime. Some companies even argue that it is easier for these cars to operate at night.

But there are conditions that these cars are still struggling to master. They do not work as well in heavy precipitation. They can have trouble in tunnels and on bridges. And they may have difficulty dealing with heavy traffic.

What are the other technologies being developed?

1) Waymo, the self-driving vehicle unit of Google’s parent company Alphabet, is testing autonomous taxis — but with an observer in the back seat. It is focused on having the car make all the driving decisions, but there is a system for handling edge cases. If a Waymo vehicle becomes confused — by, say, a new set of cones or a police barricade in the road — it can request confirmation from a remote human specialist. Once it receives confirmation of what it is sensing, the car — not the remote operator — then decides how to proceed.

The Waymo approach ensures that latency — a delay in the communications traffic — doesn’t compromise the car’s driving behaviour by leaving a remote operator unable to react in real time.
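The reason latency matters is simple arithmetic: a car keeps moving while a remote operator's command crosses the network. The 250 ms round trip below is an assumed figure for illustration, not a measured one, but at the roughly 65 km/h reported in the Tempe crash even that short delay covers several metres:

```python
def distance_during_latency(speed_kmh, round_trip_ms):
    """Metres a car travels while waiting for a remote operator's
    command to make the round trip over the network."""
    speed_mps = speed_kmh / 3.6
    return speed_mps * (round_trip_ms / 1000.0)

print(round(distance_during_latency(65, 250), 1))  # → 4.5 metres
```

That is why Waymo keeps decision-making on the car and uses the remote specialist only for confirmation, where a fraction of a second's delay is harmless.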

2) Phantom Auto, in Mountain View, California, is working on remote control systems, often referred to as teleoperation, that many see as a necessary safety feature for the autonomous cars of the future. Phantom Auto uses a standard 4G cellular data connection and GPS information to link the car and its backup driver. The company has even been able to make it work in areas that may have less than perfect service.

To accomplish this, Phantom Auto mounted a computer the size of a hardcover book, spiked with antennas and four wireless modems, in the trunk. The car was outfitted with three video cameras, but no lidar or far infrared sensors. Phantom hopes its approach will also reassure passengers by letting them talk to and see the remote operator — within limits.

3) Nissan, one of the first automakers to publicly address situations in which a self-driving car may be flummoxed by its surroundings, has proposed using a system called Seamless Autonomous Mobility, or SAM. It’s partly based on the remote control technology Nasa uses to operate rovers on Mars.

“The current idea is to draw the new route command onto a screen to direct the car,” said Maarten Sierhuis, director of Nissan’s research centre in Silicon Valley.

Such an approach avoids any possible communications glitches, and Sierhuis said the company was working on a more advanced solution, SAM 2.0, but wasn’t ready to discuss it yet.

What happens when a self-driving car can’t help itself?

“We want to be the OnStar for the autonomous industry,” said Shai Magzimof, a co-founder and the chief executive of Phantom Auto. He pictures the technology his company has developed being used in fleet vehicles, robotaxis, trucks and even self-driving cars owned by individuals.

A car in need of help would automatically contact a Phantom Auto centre, where a remote operator could use the car’s cameras and sensors to see what was happening, then manoeuvre the vehicle out of trouble. The technology prefigures a time when most passengers wouldn’t be able to take control for the simple reason that they won’t know how to drive a car — or because the steering wheel and pedals have been removed.

It is a thorny problem other companies are also trying to solve.