
What maps will look like when we need cars to read them

Google Maps. This is the driving route Barack Obama would take, if he were driving from his house to our office.


The Google map on your smartphone is a map made for human consumption. It resembles, in simplistic strokes, the world right before your eyes: road straight ahead, side streets to the left and right, intersection approaching. It talks to you in terms you can appreciate: "turn right in 200 feet." It flags your destination with a big red dot, so even the most directionally challenged can't miss it.

"That information is really intended for you," says Ogi Redzic, senior vice president of "connected driving" for the Nokia-owned mapping company HERE. "The computer processes it, but it then passes you the information, and you actually do something with it. You turn the steering wheel, you go left, you go right."

This isn't a terribly novel observation — navigation maps are meant for people! — until you consider what will happen when people don't do the navigating any more. In a future when driving becomes more automated, maps will be read by cars.

That prospect raises a ton of technical challenges for the engineers already preparing for a world of driverless vehicles. For instance, you can supplement what Google Maps tells you with your own judgment and eyesight. A ballpark direction — "turn right in 200 feet" — is enough information for you to work with. Not so for a machine. Meanwhile, map errors that might be amusing to a human — say, when Apple routes you through a cornfield — could be devastating to a car.

"When we say 'turn right,' that decision needs to be made much more precisely. It’s likely going to be 201 feet, 199 feet," Redzic says. "The ultimate consumer of the experience is the human, but the ultimate consumer of the data is the car."

HERE is one of the companies already trying to build such maps meant for machines, so that they'll be ready when automakers roll out cars with more automation. Driverless cars will of course have exterior sensors, like the hardware atop the Google autonomous car that's gotten a lot of attention. But they'll also need these kinds of internal mapping brains, not just to navigate from point A to point B, but to know precisely where they are and what's coming up that their sensors can't see.

"A bad snow day — that’s also going to make sensor-based decision-making much more challenging," Redzic says. "We always view map data as an extended sensor, as allowing sensors to see much further into distance."

These maps will first and foremost have to be a lot more precise than what we're used to seeing on smartphone apps or in-car navigation screens. This picture, for instance, shows the difference between roads as they're mapped on traditional maps that may be precise down to the meter (in green), and roads as HERE is trying to map them down to the centimeter:


When cars are processing this information, each lane must be individually modeled. The car will need to know that you can't turn left from the right lane (or right from the left lane). Such maps would have to include the slope and precise curvature of each little bit of roadway, as well as the elevation (so that, for instance, a car knows where it is and what to do here). These maps would have to convey which roads travel one-way at rush hour, and exactly when, which intersections have stop signs instead of stop lights, which streets have bike lanes and on-street parking. They'll need to know the exact width of each lane, and every change in lane markings.
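To make that concrete, here is a rough Python sketch of what one lane-level record in such a machine-readable map might hold. The field names, units, and helper function are illustrative assumptions for the sake of the example, not HERE's actual data model:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

# Hypothetical sketch of one lane segment in a machine-readable HD map.
# Field names and units are illustrative assumptions, not HERE's real schema.

@dataclass
class LaneSegment:
    lane_id: str
    centerline: List[Tuple[float, float, float]]  # (lat, lon, elevation in m), centimeter-level points
    width_m: float                                 # exact lane width
    curvature_1_per_m: float                       # horizontal curvature of this stretch
    slope_percent: float                           # longitudinal grade
    left_marking: str                              # e.g. "solid_white", "dashed_yellow"
    right_marking: str
    allowed_turns: List[str]                       # e.g. ["straight", "right"]; no left turn from a right lane
    speed_limit_kmh: Optional[float] = None
    one_way_windows: List[Tuple[str, str]] = field(default_factory=list)  # rush-hour one-way periods, e.g. ("07:00", "09:30")
    has_bike_lane_adjacent: bool = False
    has_parking_adjacent: bool = False
    traffic_control_at_end: Optional[str] = None   # "stop_sign", "traffic_light", or None


def turn_is_allowed(lane: LaneSegment, maneuver: str) -> bool:
    """A car planning a maneuver checks the lane it occupies, not just the road."""
    return maneuver in lane.allowed_turns
```

The point of the sketch is that the basic unit of the map becomes the lane rather than the road: turn permissions, markings, width, slope and time-of-day rules all hang off each individual lane segment.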

Oh, and they'll have to be constantly updated, because an outdated map detail is a big safety problem.

That means whatever mapping data HERE or companies like it produce will rely on constant feedback from cars on the road. Maybe you'll buy a driverless car one day that uses sensors to find its way, but also to measure changes in the landscape to update these maps. This is, in fact, how HERE builds these maps today: It has vehicles — they look about as silly as the Google Street View car — that map roads with LiDAR sensors as they travel.

The result, for now, looks something like this, a high-definition, 3D map that gives a whole lot more information than the Google Map above:


This picture raises another question: What will happen when your driverless car drives to a place that hasn't been mapped yet? HERE won't have the entire world mapped in this way by the time the first fully autonomous car comes out.

"I do not envision a scenario where the first car will be able to drive anywhere in the world," Redzic says.

You'll ride to, say, the Mexican border, and then you'll have to stop: beyond that point, the world, to a car, will be literally off the map.

Video by HERE

HERE helps people navigate their lives and make sense of the world with intelligent and personal maps.

3 steps to automation: By HERE VP for Connected Driving, Ogi Redzic

Everyone is talking about autonomous cars, and the conversation raises a lot of questions and concerns. For example, the cars are often described as 'robotic'. We disagree and think they can do a lot of good. Adding a human element might also make some of those concerns go away and help people get used to them.

People wonder whether autonomous cars will have pedals or steering wheels; others argue that they will increase traffic and recently some have pointed out that they could be used as a weapon.

We strongly believe that autonomous cars will actually do the world good. They will mean less traffic, less pollution and more safety, and they will reduce the space needed for parking. We're not the only ones to think so: TED speaker Bran Ferren told us last May that autonomous cars are the first automotive revolution in 100 years and that they will profoundly transform the cities we live in.

How can autonomous cars become a reality then? What happens under the hood to bring you comfortably to your destination? Ogi Redzic, VP Connected Driving at HERE, explained this at last week’s Automated Vehicle Symposium in San Francisco.

Ogi says that an autonomous car needs three location ingredients to work, and we’re using these ingredients to develop solutions for autonomous cars like the HERE-powered Mercedes S500.

1. Where exactly am I?


First of all, an autonomous car needs to precisely position itself along the road and to know its lateral position, too.

In other words, traditional maps aren't enough anymore. That's why we're mapping with 10-20 cm accuracy: with our mapping cars we're collecting billions of 3D points to model the surface of the road. We also include lanes and their boundaries, mark the lane center, and record where the cars are supposed to drive. As you can see in the picture, the difference between traditional maps (green lines) and highly precise maps (red and blue lines) is stunning.
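As a toy illustration of what "knowing your lateral position" means in practice, here is a small Python sketch (my own, not HERE's algorithm) that computes a car's signed offset from a centimeter-accurate lane centerline:

```python
import math
from typing import List, Tuple

# Toy illustration: given a centimeter-accurate lane centerline from an HD map and a
# fused position estimate, compute how far the car sits from the lane center.

Point = Tuple[float, float]  # (x, y) in meters, in a local map frame

def lateral_offset(position: Point, centerline: List[Point]) -> float:
    """Signed distance from the lane center: positive = left of the centerline
    (in the direction of travel), negative = right."""
    best = None
    for (x1, y1), (x2, y2) in zip(centerline, centerline[1:]):
        dx, dy = x2 - x1, y2 - y1
        seg_len_sq = dx * dx + dy * dy
        if seg_len_sq == 0:
            continue
        # Project the car's position onto this centerline segment.
        t = max(0.0, min(1.0, ((position[0] - x1) * dx + (position[1] - y1) * dy) / seg_len_sq))
        px, py = x1 + t * dx, y1 + t * dy
        dist = math.hypot(position[0] - px, position[1] - py)
        # Which side of the segment the car is on (cross product sign).
        side = dx * (position[1] - y1) - dy * (position[0] - x1)
        signed = dist if side >= 0 else -dist
        if best is None or abs(signed) < abs(best):
            best = signed
    return best if best is not None else float("nan")

# An offset of +0.9 m in a 3.5 m-wide lane tells the car it is drifting toward
# the left lane boundary.
lane_center = [(0.0, 0.0), (10.0, 0.0), (20.0, 0.5)]
print(lateral_offset((5.0, 0.9), lane_center))  # -> 0.9
```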

2. What lies ahead?


Highly precise maps give autonomous cars the power to get to their destination. Then car sensors do another part of the job: making sure that the vehicle gets there safely, recognizing its position and its environment, most notably by avoiding collisions and by reacting to unforeseen events.

However, a sensor's view is limited, whether by curves in the road or by bigger vehicles ahead. A much smoother drive is possible if the vehicle is connected and can see around corners, planning maneuvers beyond sensor visibility.

We call this 'live road', and that's where our cloud services come into play. We are already working on this solution with Continental to bring autonomous cars to reality more quickly. When cars are connected with each other and to the cloud, it's much easier for them to know what lies ahead, whether that's wet roads, obstacles or incidents, and to react promptly.
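Here is a toy Python sketch of that "live road" idea: one car reports what its sensors observe, and a cloud service serves those hazards back to cars approaching the same stretch of road. The class names and message format below are invented for illustration and are not HERE's or Continental's actual interface:

```python
from dataclasses import dataclass
from typing import Dict, List

# Invented example of hazard sharing via the cloud, so a car can "see" beyond its sensors.

@dataclass
class RoadHazard:
    road_segment_id: str
    kind: str          # e.g. "wet_road", "obstacle", "incident"
    position_m: float  # distance along the segment where the hazard was observed
    reported_by: str

class LiveRoadCloud:
    """In-memory stand-in for a cloud service that pools hazard reports from many cars."""
    def __init__(self) -> None:
        self._hazards: Dict[str, List[RoadHazard]] = {}

    def report(self, hazard: RoadHazard) -> None:
        self._hazards.setdefault(hazard.road_segment_id, []).append(hazard)

    def hazards_ahead(self, route_segment_ids: List[str]) -> List[RoadHazard]:
        """What a following car asks for: everything reported on the segments it is about to drive."""
        found: List[RoadHazard] = []
        for seg in route_segment_ids:
            found.extend(self._hazards.get(seg, []))
        return found

# One car's sensors spot standing water beyond a blind curve...
cloud = LiveRoadCloud()
cloud.report(RoadHazard("segment-42", "wet_road", position_m=180.0, reported_by="car-A"))

# ...and a second car, still out of sensor range, learns about it from the cloud.
for hazard in cloud.hazards_ahead(["segment-41", "segment-42"]):
    print(f"Slow down: {hazard.kind} at {hazard.position_m} m on {hazard.road_segment_id}")
```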

3. How can I get there comfortably?


Even if the technology is ready, the big question is whether and how humans will accept autonomous cars. A self-driving car that works mainly with sensors makes for a robotic and hair-raising ride.

In fact, we believe that 'humanised driving' will be crucial to getting people to adopt autonomous cars. Personalising a car's driving behavior means that it will drive the way its owner expects it to.

Our studies show that everything from the curvature of the road to the perception of open fields, trees and houses can affect driver behavior. By incorporating this perception, we can recreate human driving behavior in autonomous cars and help people feel more comfortable inside them.

I'm ready to accept them! Where do I sign up?

Am I missing anything? This sounds way better than what we have today!

Me too. IMO self-driving cars are coming in the nick of time; way too many human styles of driving all mixed together is not good. I think people will look back at this time as barbarous, given how much human life was lost or damaged because of us driving these things (auto body repair businesses and ERs will lose a lot of business, probably cops too).

I can't wait to kick back and do what I want while my car drives me to my destination. Plus, no responsibility for wrecks: on the rare occasion when one happens, it will be the car manufacturer's fault.

Me too! Maybe even install a minibar in my car's glove compartment...

The other thing is that I'm trying to make it to the Technological Singularity, so I have a chance at living forever, and driving might be the single most dangerous thing I participate in that could knock me out before I get there. I consider myself lucky every time I make it home.

Yeah, I feel pretty lucky too.
