Google Presentation at ITS California


ITS California was held September 20 to 23 in Los Angeles. It was the best-attended show in our history, and many good ideas were exchanged. But the highlight for me was Tuesday’s lunch keynote speaker, Dmitri Dolgov. He is a principal engineer at Google and leads the software team for their autonomous vehicle project.

During his presentation it quickly became apparent that their approach to this challenge has broadened significantly. Their early efforts approached navigation solely through GPS and mapping, and that is still their primary focus. But they are now beginning to take a more holistic approach, including V2V and V2I as part of their efforts.

He said that DSRC deployment takes time, and they don’t want to wait for the population of transmitters to hit some critical mass before Google starts to work on this. There is plenty to do now using GPS and mapping.

Even with this narrower focus, they use pavement markings and signs to help orient the vehicle. The process starts with GPS, but that only gets you to a point near your actual location. That estimate is then compared against their maps to narrow it down further. Finally, the vehicle looks at the surrounding infrastructure to learn precisely where it is located.
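The coarse-to-fine process described above can be sketched in a few lines of code. This is purely illustrative: the function names, lane geometry, and accuracy figures are my own assumptions, not anything from Google's actual system.

```python
# Illustrative sketch of coarse-to-fine localization: GPS gets you close,
# the map narrows it to a lane, and observed infrastructure (a lane
# marking) refines it further. All numbers here are hypothetical.

def localize(gps_fix, lane_centers, marking_offset):
    """Refine a coarse GPS fix in three steps.

    gps_fix        -- (x, y) position in meters, accurate to a few meters
    lane_centers   -- lateral positions of lane centers, from the map
    marking_offset -- measured lateral distance from the nearest lane
                      marking, e.g. from a camera (centimeter-level)
    """
    x, y = gps_fix

    # Step 2: map matching -- snap the lateral coordinate to the nearest
    # mapped lane center, narrowing the estimate to lane level.
    lane_y = min(lane_centers, key=lambda c: abs(c - y))

    # Step 3: use an observed lane marking to correct the lane-level
    # estimate down to centimeter level.
    LANE_HALF_WIDTH = 1.8  # meters, assumed
    refined_y = lane_y - LANE_HALF_WIDTH + marking_offset

    return (x, refined_y)

# A GPS fix that is off laterally gets pulled to the nearest lane,
# then refined against the observed marking.
print(localize((100.0, 4.8), lane_centers=[0.0, 3.6, 7.2], marking_offset=1.75))
```

Real systems fuse these estimates probabilistically (e.g. with a particle or Kalman filter) rather than overwriting one with the next, but the ordering of the steps is the same.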

Greg Larson of Caltrans’ Division of Research & Innovation asked a question we discussed here in July (https://workzoneitsblog.com/2015/07/02/should-driverless-cars-make-life-or-death-decisions/): How will Google handle situations where the vehicle is faced with an unavoidable crash and must choose between something like running into a wall or over children on the sidewalk? Mr. Dolgov answered that they are unwilling to stipulate that those crashes are unavoidable. He firmly believes they will reach the point where those crashes are preventable. The key is getting drivers out of the equation. Once there are enough of these autonomous vehicles on the road, driver error, and with it most if not all crashes, will be eliminated.

Many in the audience asked about work zones and other changing conditions. First, I was glad to hear that the industry now recognizes this as a problem. Mr. Dolgov claimed their system now recognizes a lane closure and attempts to merge into the open lane right away. It even tells other autonomous vehicles in the area so they can plan to do the same.
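The behavior he described, merging early and notifying nearby vehicles, might look something like the sketch below. The function names and message format are hypothetical stand-ins, not Google's actual V2V protocol.

```python
# Hypothetical sketch: on detecting a lane closure, merge into the
# nearest open lane right away and broadcast the closure to nearby
# autonomous vehicles so they can plan the same early merge.

def plan_merge(current_lane, closed_lanes, open_lanes):
    """Return the lane to drive in, merging early if ours is closed."""
    if current_lane not in closed_lanes:
        return current_lane
    # Merge into the nearest open lane rather than waiting for the taper.
    return min(open_lanes, key=lambda lane: abs(lane - current_lane))

def broadcast_closure(closed_lanes, peers):
    """Notify nearby vehicles of the closure (stand-in for a V2V send)."""
    message = {"type": "lane_closure", "lanes": sorted(closed_lanes)}
    for peer in peers:
        peer.append(message)  # each peer's inbox receives the message
    return message

# A vehicle in lane 2 learns lane 2 is closed ahead; lanes 0 and 1 are open.
peers = [[], []]
print(plan_merge(2, closed_lanes={2}, open_lanes=[0, 1]))  # merges to lane 1
broadcast_closure({2}, peers)
```

In a deployed system the broadcast would ride on something like the SAE J2735 message set over DSRC, which is part of why standardized messages matter here.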

The Google system is also learning to recognize flagger and law-enforcement hand signals. You must assume this will make it even more important to perform those signals in a standard, recognizable manner.

He went on to say that even when something completely random occurs, the vehicle slows or stops until it finds a way to drive safely around it. He said they have a simulator they use to test their software. He mentioned an example where the car happened upon a woman in a wheelchair in the middle of the road chasing a chicken with a broomstick. Now that is random! I am not sure how most of us would react to that. But the car came to a stop until she moved out of the road.

I left the session thinking work zones won’t be as much of a problem as we all once thought. If anything, these may be the best-behaved vehicles we encounter while working in traffic. And the good news is that quality standards for our devices and workers will become even more important once machines are depending on them.