A team of researchers at the University of Michigan is looking to animals for new ways to help autonomous vehicles navigate their environment.

Scientists take cues from animals to create the navigation systems of the future

Autonomous vehicles are jauntily steering through the streets of more and more cities, but their navigation systems remain a work in progress. As companies vie for a foothold in urban markets, they typically map and navigate the environment with sensors based on optical properties (like light waves and video) or radio waves. These options may not provide the best coverage, especially in bad weather. A team of researchers at the University of Michigan is turning to nature to develop something better.

“Animals have the amazing ability to find their way using sound,” said Bogdan Popa, an assistant professor of mechanical engineering at the university and principal investigator on the project. “We want to develop a sensor that uses sound like animals.”

Previous efforts with sound have failed because sound waves do not travel as far in air as light and radio waves. In fact, current ultrasound sensors have a range of only 1 meter and produce low-resolution maps.

Popa plans to leverage knowledge from nature to advance this technology.

Dolphins, bats, and whales use echolocation, a technique in which a sound pulse is emitted into the environment. When the pulse encounters an object, it bounces back, and the returning reflections carry information for the animal to decipher. Using this approach, animals can navigate their terrain, find food, and avoid predators, all in a fraction of a second.
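The arithmetic underlying echolocation is simple time-of-flight ranging. The short Python sketch below is illustrative only, not drawn from the study: an echo's round-trip time converts directly into the distance to whatever reflected it.

```python
# Minimal time-of-flight sketch (illustrative, not from the study):
# an echo's round-trip time converts directly into range.
SPEED_OF_SOUND_AIR = 343.0  # meters per second at ~20 degrees C

def range_from_echo(round_trip_seconds: float) -> float:
    """Distance to the reflector; the pulse travels out and back."""
    return SPEED_OF_SOUND_AIR * round_trip_seconds / 2.0

# An echo returning after 5.8 milliseconds implies a reflector ~1 m away.
print(f"{range_from_echo(0.0058):.2f} m")  # -> 0.99 m
```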

Popa believes echolocation offers a tantalizing new opportunity that will allow autonomous vehicles to operate in an uncertain world under inclement weather conditions while retaining their autonomy.

The Sensor of the Future

Sound has a limited range as it travels through the air. To project sound waves farther and more efficiently, Popa and his team constructed an acoustic lens using passive and active metamaterials.

Similar to an optical lens, the acoustic lens consists of two engineered pieces of patterned plastic that can focus ultrasonic sound waves (35–45 kilohertz) in any desired direction with only the slightest deformation. This capability means that the lens can be fixed to the vehicle and does not need to be cleaned or realigned. With only minor adjustments, the lens can project a focused wave in almost any direction. Popa likens the new sensor to a laser beam; traditional sound sources behave more like an incandescent light bulb.
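For a sense of scale, a quick back-of-the-envelope calculation (not from the study) shows that the wavelengths at those frequencies are under a centimeter in air, which is why plastic patterned at roughly the millimeter scale can shape and steer the beam.

```python
# Acoustic wavelength = speed of sound / frequency.
SPEED_OF_SOUND_AIR = 343.0  # meters per second at ~20 degrees C
for f_hz in (35_000, 45_000):
    wavelength_mm = SPEED_OF_SOUND_AIR / f_hz * 1000
    print(f"{f_hz / 1000:.0f} kHz -> {wavelength_mm:.1f} mm")
# 35 kHz -> 9.8 mm; 45 kHz -> 7.6 mm
```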

The team also developed a process to analyze the vast amount of information contained in the returning echoes. To do this, they made the project even more multidisciplinary, turning to computer science to interpret biological sensory signals. The Michigan team developed convolutional neural networks, deep learning algorithms that learn to weigh the importance of different features in an input and to differentiate one object from another.
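The article does not describe the team's actual architecture, so the PyTorch sketch below is purely hypothetical: a small convolutional classifier that takes a time-frequency image of a returning echo (the spectrogram input is an assumption) and outputs the probability that one particular object is present.

```python
# Hypothetical single-object echo classifier; layer sizes are illustrative,
# not the Michigan team's architecture.
import torch
import torch.nn as nn

class EchoNet(nn.Module):
    """Binary detector: is one specific object present in this echo?"""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # 1-channel spectrogram
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(nn.Flatten(), nn.LazyLinear(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x))  # one logit: present / absent

# One 64x64 time-frequency image of a returning echo (batch of one).
logit = EchoNet()(torch.randn(1, 1, 64, 64))
print(torch.sigmoid(logit).item())  # probability the object is present
```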

“Using some experiments with dolphins to understand their behavior, we developed a series of neural networks,” said Popa. “Each neural network is specialized to recognize one object, like a type of fish, a threat, rocks, etc.”

For the first stage of the study, the team plans to develop a series of neural networks. Each network will be trained to interpret the returning echoes for a specific object, determining whether that object is present in the environment and, if so, its likeliest position.

“This is a modular approach that is more decentralized,” said Popa. “It is easier to do as opposed to one algorithm that has to provide all the data.”
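In code, that modular design might look like the following sketch, in which a bank of independent detectors each answers one narrow question about the same echo. The interface and the stub detectors are assumptions for illustration, not the team's implementation.

```python
# Assumed interface, for illustration only: each specialized network
# reports whether its object is present and, if so, where.
from dataclasses import dataclass
from typing import Callable, Dict, List, Optional, Tuple

Position = Tuple[float, float]  # (x, y) in meters, vehicle frame

@dataclass
class Detection:
    label: str
    present: bool
    position: Optional[Position]

Detector = Callable[[List[float]], Tuple[bool, Optional[Position]]]

def run_detectors(echo: List[float],
                  detectors: Dict[str, Detector]) -> List[Detection]:
    """Query every single-object network against the same returning echo."""
    return [Detection(label, *net(echo)) for label, net in detectors.items()]

# Stubs stand in for trained networks.
detectors = {
    "pedestrian": lambda echo: (True, (2.0, 5.5)),
    "curb": lambda echo: (False, None),
}
for d in run_detectors([0.1, 0.4, 0.2], detectors):
    print(d)
```

One appeal of this decentralization is that supporting a new object class means training one new network rather than retraining a single monolithic model.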

Once identified, an object will be placed on the map in front of the vehicle. Popa plans to run multiple neural networks simultaneously, identifying many different objects at once to reconstruct the world in front of the vehicle.
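One simple way to realize such a map, sketched here under assumed conventions (the grid size, resolution, and vehicle-at-bottom-center frame are all illustrative choices), is an occupancy grid covering the area ahead of the vehicle.

```python
# Occupancy-grid sketch; all conventions here are assumptions.
import numpy as np

def build_local_map(detections, size_m: float = 20.0, cell_m: float = 0.5):
    """detections: iterable of (label, (x, y)) with positions in meters."""
    n = int(size_m / cell_m)
    grid = np.zeros((n, n), dtype=np.uint8)
    for _, (x, y) in detections:
        col = int((x + size_m / 2) / cell_m)  # lateral offset from center
        row = int(y / cell_m)                 # distance ahead of vehicle
        if 0 <= row < n and 0 <= col < n:
            grid[row, col] = 1
    return grid

grid = build_local_map([("pedestrian", (2.0, 5.5)), ("curb", (-3.0, 1.0))])
print(grid.sum())  # -> 2 occupied cells
```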

Next Steps

Once a network is trained, Popa believes it will be able to answer questions about an object's location almost instantaneously. The team plans to layer network upon network to interpret the full array of incoming echoes.

Although the team is still acquiring the data needed to train the various neural networks, they plan to test the system first in virtual simulations. If all goes well, they will release the new acoustic sensor-based navigation system into the real world to see how it helps autonomous vehicles navigate the streets.

“Since this technology is still in the beginning stages, it’s hard to say how it will compare with current sensors,” said Teddy Ort, a Ph.D. candidate in the Computer Science and Artificial Intelligence Laboratory at the Massachusetts Institute of Technology. “If it could provide detailed 3D data at range, it could prove very valuable not only to replace, but perhaps to augment the current sensor suites.” Ort did not contribute to this study.

As the demand for autonomous vehicle technology increases, Popa’s contribution could improve the safety of vehicles navigating every community, large and small.

“For me, the most exciting part is understanding how the natural world does what it does in such an efficient way,” said Popa. “We hope to replicate or equal the performance of these biological systems.”

Originally published by EOS
