We have talked here in the past about the difficulties autonomous vehicle drivers (operators?) will have acclimating when control of their vehicles is handed back to them, such as when they approach a work zone. Simulator studies have shown that drivers need anywhere from 4 to 14 seconds to regain a full grasp of all the relevant external factors they must consider as they begin to drive.
A recent article in Axios Autonomous Vehicles points out that aviation has made use of automation for some time now. And they, too, understand the problem of moving from automated to human operators. In aviation, training focuses on that hand-off. Pilots are drilled in flight simulators on a variety of potential problems. So, when they encounter one of those problems during a real flight, muscle memory takes over and they react quickly and correctly.
The recent 737 Max 8 crash further underlines the importance of that training. That training was apparently not provided, which may have contributed to the pilot’s difficulty in regaining control.
The difference between aviation and autonomous vehicles is that training is mandatory for all pilots. If you fly a 767 you must stay current in all 767 training. However, for vehicles, a big selling point is that drivers no longer have to drive. They are told they can act more as passengers – gazing out the window, catching up on work, or watching an endless variety of streaming entertainment. Getting from that idea to one of mandatory training is a very long stretch!
Adding to the problem, each automaker is designing its machine-to-human hand-off differently.
In the Axios article, they quote Steve Casner of NASA: “We’re terrible at paying attention — and we think we’re awesome at it.” Mr. Casner argues that drivers will need training, along with continuous updates to that training, to learn how to deal with automation. Without initial user training and frequent refresher classes, drivers will quickly become complacent.
This is a new topic of discussion, but one we must take up to make CAVs safe for work zones and other segments of roadway with changing conditions.
One of the challenges the roadway safety infrastructure industry faces in regard to autonomous vehicles is understanding how those vehicles visualize the world they are passing through. Manufacturers have been restrained in their sharing of that information. The best we can get out of them is “Keep doing what you are doing to make striping, signing and traffic control devices easier to see.”
But a story published yesterday in The Verge by Andrew Hawkins details efforts by GM Cruise and Uber to make some of those visualization tools open source and free to use. They are even provided in a fairly simple, easy-to-use format that works on most any device.
This could be very useful for pavement marking manufacturers or contractors. It may be helpful for sign manufacturers. And it will definitely help traffic control device manufacturers understand what the vehicle “sees” and what it does not.
Now this is far from the ultimate testing platform, but it will help our industry begin to develop an understanding of the underlying issues and ways we may be able to address them. It may also help work zone ITS providers in that it offers a simple data formatting system that may be able to accommodate data feeds from smart work zones.
The GM Cruise tool is called “Worldview” and can be found HERE.
The Uber tool is called “Autonomous Visualization System” or AVS for short and can be found HERE.
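To give a feel for what these tools consume, here is a rough sketch of a single scene “frame” in the spirit of Uber’s AVS, which streams timestamped JSON describing the geometric primitives (points, polygons, paths) a vehicle perceives. The schema below is simplified and invented for illustration — consult the actual AVS documentation for the real format:

```python
import json

# A simplified, AVS-flavored frame: one timestamp plus the geometric
# primitives the vehicle perceives at that instant. Stream names and
# field names here are illustrative, not the actual AVS schema.
frame = {
    "timestamp": 1554400000.0,
    "primitives": {
        "/perception/traffic_devices": [
            {"type": "circle", "center": [10.2, 3.1, 0.0], "radius": 0.3,
             "label": "drum"},
            {"type": "polygon",
             "vertices": [[12.0, 2.8, 0.0], [12.4, 2.8, 0.0], [12.2, 3.9, 0.0]],
             "label": "sign"},
        ],
        "/vehicle/trajectory": [
            {"type": "polyline",
             "vertices": [[0, 0, 0], [5, 0.2, 0], [10, 0.8, 0]]},
        ],
    },
}
print(json.dumps(frame, indent=2))
```

Even a toy frame like this shows why the format matters to our industry: a drum or sign the vehicle fails to emit as a primitive is, for all practical purposes, a device it did not “see.”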
We haven’t spoken with anyone who has used these tools yet. So, please try them out and tell us what you think. Are they useful to our industry? And, if so, how? What can be improved? We look forward to hearing from you!
We have been talking for the past couple of years about “connected work zones” – that is, the automatic and real-time method of putting our work zones on the digital map that everyone is quickly coming to depend upon when choosing a route.
We have argued that traffic control workers don’t need more to do when they are setting up or tearing down a work zone. So, to arrive at a point where we have timely and accurate reporting of work zones, it must happen automatically.
Several companies are now providing solutions. Those solutions vary in their complexity and technologies involved. But in their simplest form they each include a device attached to existing traffic control devices. One of those is normally the arrow board. The beauty of this approach is that when the arrow board is turned on, the system immediately tells the digital map that a work zone just popped up on that route at that precise location. And when it is turned off, it tells the map that the work zone is now gone. It happens every time a “smart” arrow board is used and those are becoming more and more common.
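As a simple illustration of how small that automatic report can be — the vendor systems surely differ in their details, and every field name below is our own invention — the event a smart arrow board might send when it powers up could look like this:

```python
import json
from datetime import datetime, timezone

def arrow_board_event(board_id, lat, lon, active):
    """Build a minimal work-zone event for a digital map feed.

    Field names are illustrative only; real systems (and emerging
    specifications for work zone data feeds) define their own schemas.
    """
    return {
        "device_id": board_id,
        "event": "work_zone_active" if active else "work_zone_cleared",
        "location": {"lat": lat, "lon": lon},
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

# Switching the arrow board on sends one small message telling the
# digital map that a work zone just appeared at that precise location:
on_event = arrow_board_event("AB-1042", 44.9778, -93.2650, active=True)
print(json.dumps(on_event, indent=2))
```

The point of the sketch is that no worker has to do anything extra: the message rides along with an action (turning the board on or off) that the crew performs anyway.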
We all “get” this. But now the driving public is also recognizing the importance of these systems. An article by Tim Harlow in the January 27th Minneapolis Star-Tribune talks about a system supplied by Street Smart Rentals to Minnesota DOT in the Twin Cities.
He points out that the existing 511 system does a good job of informing the public about long-term projects, but that short-term and unplanned closures can cause just as much disruption yet are not included in their warnings to the public.
The system supplied by Mike Granger and Street Smart Rentals is changing that for the better. And with the arrival of autonomous vehicles, this will become even more important. In the article Brian Kary, MnDOT’s Director of Traffic Operations, “said the technology is not active now, but it could be this summer or fall. MnDOT is evaluating costs before making it a permanent 511 feature. The agency also is setting up a timeline [to] install the technology and figuring out how best to get information to other traffic information sources, such as Google, Waze and TomTom, since not everybody uses 511.”
We believe economies of scale will quickly and significantly reduce those costs. And the need for this information will bring down any barriers to those traffic information sources. We look forward to hearing more about this system and others like it in the exciting year to come.
Being the work zone data nerds that we are, we attended the National Dialogue on Highway Automation Workshop #2: Digital Infrastructure and Data, held August 1st and 2nd in Seattle. The first workshop covered planning and policy. Workshop #3 focuses on freight. #4 covers operations and will be held at the same time as the National Rural ITS meeting in Phoenix. The final workshop will be held late this year in Austin and will be more technical in nature as it covers infrastructure design and safety.
Each workshop includes a series of presentations followed by breakout groups where ideas are discussed and then shared with the larger group. The format works well and benefits from the input of a wide range of stakeholders.
You will be happy to hear that work zones came up early and often. In fact, the opening comments used work zones as an example of the need for some sort of standardization, as every agency now provides varying amounts of data, different types of data, different formats, and a very wide range of detail. Another speaker called work zones the “low hanging fruit” for highway automation in general and data collection and dissemination in particular.
There were about 200 in attendance and maybe 30 raised their hands when asked who attended the Automated Vehicle Symposium last month in San Francisco. So, this was an almost entirely new group.
You should also know the FHWA is seriously committed to this process. They had 20 or 30 of their own people at this event running it, moderating the breakout sessions, and asking lots of questions.
There were a number of themes that jumped out at us. One was data quality and verification. The consensus was that state DOTs will probably have the responsibility of verifying data accuracy. But what that process might be is unclear. It will likely vary by data type. In our case it will probably come as a quality check after it is already posted. Work zone activity must be reported in real time to be actionable, so they will weed the inaccurate reports (and reporters) out after the fact.
Remarkably, most in the room were well acquainted with the MUTCD. Multiple comments suggested that it needs to be revised to recognize automated vehicles. Some even suggested reducing the leeway states have in specifying sign formats, pavement marking details, etc. to create more consistent traffic control for CAVs. But others later pointed out that this is unlikely to happen and that the effort would be better spent working outside the MUTCD process, at least to begin with.
These two days were time well spent. If you are able, we strongly encourage you to participate in one of their future workshops, especially the event in Phoenix. It will be focused on traffic operations. But because it will be held in conjunction with the NRITS show, it will also spend more time on automated vehicles and rural roads. Learn more HERE.
In our last post we looked at the current state of the art in autonomous vehicle navigation. Another way in which the problem of navigation in unmapped or incorrectly mapped areas will be overcome is through artificial intelligence. We looked at the potential of this technology in our 4/10/17 post entitled, “Machine Learning and Work Zones”. Michael Reser published an article May 8th in Electronic Design entitled, “How AI Will Help Pave the Way to Autonomous Driving”.
Mr. Reser’s main point is that given the unfathomable quantity of data that must be digested and acted upon by autonomous vehicles (AVs) the technology will progress much faster and more accurately through machine learning. “Translating it all into a real-world challenge for AI-backed autonomous-driving systems, the expected outcome of such massive data processing is nothing short of getting the right answer in the shortest possible time to determine a proper action to avoid a traffic incident.”
“To put it differently, (a) large set of data in combination with realistic scenarios and nonlinear parameter sets enables systems and applications to fail safely and learn faster.”
He goes on to list the many challenges that must also be addressed, including how to tie images from multiple sensors of varying resolution and quality into one accurate picture. Another is how to validate different data sources and tie them together in time, which requires a consistent way of timestamping those sources.
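The time-labeling problem is easy to underestimate. Sensors sample at different rates, so fusing them means matching each reading from one stream with the nearest-in-time reading from another. A toy sketch of that alignment (our own simplification, not a production sensor-fusion pipeline):

```python
import bisect

def align_nearest(times_a, times_b):
    """For each timestamp in times_a (sorted), find the index of the
    nearest timestamp in times_b (sorted). This is the simplest form
    of cross-sensor time alignment."""
    matches = []
    for t in times_a:
        i = bisect.bisect_left(times_b, t)
        # Candidates: the neighbor on each side of the insertion point.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(times_b)]
        best = min(candidates, key=lambda j: abs(times_b[j] - t))
        matches.append(best)
    return matches

# Camera frames at 10 Hz vs. lidar sweeps at 4 Hz (times in seconds):
camera = [0.00, 0.10, 0.20, 0.30]
lidar = [0.00, 0.25, 0.50]
print(align_nearest(camera, lidar))  # -> [0, 0, 1, 1]
```

Real systems must also handle clock drift between sensors and interpolate between readings, which is exactly why a consistent timestamping convention matters so much.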
Mr. Reser goes on to say they are not there yet, but he sees the process as inevitable.
“For true enablement of Level 4 and Level 5 automated driving, the system should be functional in all weather and driving conditions, which is obviously a given requirement. Still, it’s a much bigger challenge than sometimes mentioned and admitted.”
Like most AV challenges, this one has serious implications for work zones. It will be interesting to watch as this process unfolds.
Much has been written about autonomous vehicles and their methods of navigation. But most of that writing is little more than science fiction. The systems described are usually just concepts that engineers are working toward. What is the current state of the art?
Dyllan Furness posted May 9th about emerging technology in Digital Trends magazine. His article, titled “Get lost: MIT’s self-driving car takes on unmarked roads,” examined the current capabilities of autonomous vehicles. He found that current AVs are only able to drive on well-mapped city streets. This deficiency would affect autonomous vehicles’ ability to navigate a work zone as well. As he wrote in his opening lines, “If you find yourself on a country road in a self-driving car, chances are you’re both pretty lost. Today’s most advanced autonomous driving systems rely on maps that have been carefully detailed and characterized in advance. That means the millions of miles of unpaved roads in the United States are effectively off-limits for autonomous vehicles.”
MIT is working to change that by developing a method of navigating using simple GPS, Google map data, and a variety of sensors. “We were realizing how limited today’s self-driving cars are in terms of where they can actually drive,” Teddy Ort, an MIT CSAIL graduate student who worked on the project, told Digital Trends. “Companies like Google only test in big cities where they’ve labeled the exact positions of things like lanes and stop signs. These same cars wouldn’t have success on roads that are unpaved, unlit, or unreliably marked. This is a problem.”
Certainly, work zones fall into this problem area, and MIT’s new system could address our issues as well. In particular, by using Google map data, this system would also pick up near real-time work zone data as we described in our 9/25/17 post. Then the sensors could identify traffic control devices and follow them safely through the work zone.
It is good to see that at least one organization understands the limits of current technology and is looking for a better, safer way for autonomous vehicles to find their way through rural roads and work zones.