Self-driving cars may no longer be confined to the scenes of science fiction movies.

The Next Generation Vehicle project at the University’s Perceptual Robotics Laboratory is envisioning a 21st century where car-related injuries and deaths are preventable using software tested on the streets of Ann Arbor.

The project is working on a variety of vehicle components that could enable autonomous driving. The Perceptual Robotics Laboratory supports projects that develop algorithms in image processing, systems engineering and other fields.

Though the vehicle remained under the control of a human driver during initial tests in Ann Arbor, the navigational system delivered accurate imaging data. When paired with other components of the NGV project, these advances are intended to pave the way for the first class of driverless vehicles.

Led by Ryan Eustice, associate professor of engineering, and Edwin Olson, associate professor of computer science and engineering, the NGV project has a team of graduate students developing vehicle technologies, such as new navigational and obstacle detection systems, designed to enhance driver safety, reduce vehicle production costs and ultimately enable a self-driving car.

Arash Ushani, an engineering graduate student, said the NGV project will have a long lifespan at the University even after it accomplishes its target goals.

“It’s like saying, ‘Do you think airplanes will be complete once you make something that flies?’ ” Ushani said. “There’s always ways you can improve it to make it better.”

Ryan Wolcott, an engineering graduate student, has been working on the NGV project’s navigational system since 2012. The system employs the same type of camera technology used in video gaming systems to produce three-dimensional mapping data.
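The sketch below is a rough illustration, not the NGV team’s code, of how a gaming-style depth camera can yield three-dimensional mapping data: each pixel’s depth reading is back-projected through assumed pinhole-camera intrinsics into a point in space. The focal lengths, image size and function name are all hypothetical.

    # Illustrative sketch: turn a depth image into a 3-D point cloud.
    # Intrinsics below are assumed values, not the NGV camera's.
    import numpy as np

    FX, FY = 525.0, 525.0   # assumed focal lengths (pixels)
    CX, CY = 319.5, 239.5   # assumed principal point (pixels)

    def depth_to_point_cloud(depth_m):
        """Convert an HxW depth image (meters) to an Nx3 array of 3-D points."""
        h, w = depth_m.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        z = depth_m
        x = (u - CX) * z / FX
        y = (v - CY) * z / FY
        points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
        return points[points[:, 2] > 0]   # drop pixels with no depth reading

    # Example: a synthetic 480x640 frame with everything 2 meters away.
    cloud = depth_to_point_cloud(np.full((480, 640), 2.0))
    print(cloud.shape)                    # (307200, 3)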

“The end goal is for driver safety,” Wolcott said. “It’s obvious that human drivers are not very good at driving. You know, people have gotten accustomed to fatalities with driving just because it’s such an important piece of technology. Technology can fill in the gaps where humans are distracted, or what have you. It’s ultimately technology that the user can then turn on.”

However, it will still be a while before driverless vehicles hit the market, as researchers continue to run into obstacles. The navigation system, for example, has trouble adapting to changing weather conditions and is difficult to implement in conventional gas-powered vehicles.

The vehicle’s software independently interprets the images it collects to make operating decisions. The transition from laser scanning technology to the NGV project’s camera-based unit is projected to significantly reduce the cost of vehicle navigation systems, setting the project apart from other, higher-cost efforts that rely on lasers.
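The software’s “operating decisions” can be pictured with a toy example. The sketch below is purely illustrative and not based on the NGV code: it maps perception output, here a list of obstacle distances, to a target speed. Every name and threshold in it is an assumption.

    # Illustrative sketch of a perception-to-decision step (not the NGV software).
    from dataclasses import dataclass

    @dataclass
    class Obstacle:
        distance_m: float   # distance ahead of the vehicle, in meters

    def choose_speed(obstacles, cruise_mps=13.0):
        """Return a target speed, slowing as the nearest obstacle gets closer."""
        if not obstacles:
            return cruise_mps
        nearest = min(o.distance_m for o in obstacles)
        if nearest < 5.0:        # hypothetical emergency-stop threshold
            return 0.0
        if nearest < 20.0:       # hypothetical caution zone: scale speed down
            return cruise_mps * (nearest - 5.0) / 15.0
        return cruise_mps

    print(choose_speed([Obstacle(12.0)]))   # about 6.1 m/s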

“A lot of the bigger companies like Google rely on (lasers) so heavily,” Wolcott said. “A key part of their infrastructure is these laser scanners. But it’d be nice long term to have the cameras as maybe the only sensor because they’re far easier to conceal. It will lead to a car that looks more attractive.”

Engineering junior Carl Kershaw worked on the NGV project as part of the Undergraduate Research Opportunity Program.

“We have a few research vehicles, but they have to be hybrids because of how much power the computers draw,” he said. “Those cars are very expensive because they’re covered in sensors, basically. You try to plug this into a normal car, you basically slowly drain the battery.”

Ford Motor Company and State Farm Insurance sponsor the NGV project, in collaboration with the April Robotics Laboratory. Further NGV testing will be conducted at the University’s new Mobility Transformation Center testing facility, set to open this spring.
