One of the toughest problems facing search and rescue teams is gathering enough volunteers in those critical first few hours after someone goes missing. A hiker is lost somewhere in a large park with hundreds of miles of trails, and your team has limited resources. Which trails do you send people down, and which do you leave unexplored?
It’s a crucial dilemma, but one that might soon disappear thanks to drones. A team from the Dalle Molle Institute for Artificial Intelligence (IDSIA) and the University of Zurich is trying to build a fleet of affordable electronic eyes capable of navigating trails to aid large search operations and reduce the time it takes to find someone stranded in the woods.
It’s no easy task. Paths through the forest look massively different from one another. Sometimes they’re well defined. Sometimes they’re barely visible. Each has its own topography. Trying to teach an electronic device how to identify and follow a trail is wildly complicated, so the researchers chose instead to build a machine that can teach itself via an artificial neural network.
An artificial neural network is similar to a biological neural network, like the one in your brain. It’s one of the hallmarks of machine learning. The network makes connections between simulated neurons; each neuron is a mathematical function, and by interacting with one another the functions collectively become adept at identifying patterns, which is to say, learning. You don’t have to spend a million hours telling the network how to think. Instead, you give it a bunch of examples sorted into different classifications, and the network figures out how to organize, and eventually recognize, all those examples in a way that makes sense.
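To make “simulated neuron” concrete, here is a minimal sketch, not the researchers’ code, of a single neuron: a weighted sum of inputs passed through a nonlinearity, with the weights nudged toward one labeled example a little at a time. Every number and name below is illustrative.

```python
import numpy as np

def neuron(inputs, weights, bias):
    """One simulated neuron: a weighted sum of its inputs squashed by a sigmoid."""
    return 1.0 / (1.0 + np.exp(-(np.dot(weights, inputs) + bias)))

# Toy "learning": nudge the weights so the neuron's output drifts toward a label.
rng = np.random.default_rng(0)
weights, bias = rng.normal(size=3), 0.0
example, label = np.array([0.2, 0.9, 0.1]), 1.0   # one hand-labeled example

for _ in range(100):
    out = neuron(example, weights, bias)
    delta = (label - out) * out * (1.0 - out)     # delta-rule error signal
    weights += 0.5 * delta * example              # adjust each weight toward the label
    bias += 0.5 * delta

print(round(neuron(example, weights, bias), 3))   # output is now much closer to 1.0
```

Stack thousands of these neurons in layers and repeat the nudging over thousands of examples, and you have the kind of network the researchers trained.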
In this case, the researchers needed the network to reliably recognize three basic things: an image of the right side of a trail, an image of the left side of a trail, and an image of the middle of a trail. For raw material, they fed the network 20,000 photos gathered by a hiker wearing three GoPros, one recording each of those individual views.
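The team’s actual network is a purpose-built deep convolutional net, but the overall recipe is the standard one for image classification. As a rough, hypothetical sketch (the folder names, image size, and tiny architecture below are assumptions, not the researchers’ setup), training a three-class classifier on frames sorted into left/center/right folders might look like this:

```python
# Hypothetical sketch of the training setup (not the researchers' code): a small
# convolutional network that labels a trail photo as left, center, or right.
# Assumes the GoPro frames are sorted into folders named left/, center/, right/.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

tfm = transforms.Compose([transforms.Resize((64, 64)), transforms.ToTensor()])
data = datasets.ImageFolder("trail_frames/", transform=tfm)   # hypothetical path
loader = DataLoader(data, batch_size=64, shuffle=True)

model = nn.Sequential(                        # a deliberately tiny stand-in network
    nn.Conv2d(3, 16, 5), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 5), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 13 * 13, 64), nn.ReLU(),
    nn.Linear(64, 3),                         # one output per class: left, center, right
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(10):
    for images, labels in loader:             # nobody tells it what a trail is;
        optimizer.zero_grad()                 # it only ever sees labeled examples
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```

Once trained, the same network can label frames it has never seen before, which is what the accuracy figure below measures.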
How did it work? Really well. The network correctly identified whether an image showed the left side, the right side, or the center of a trail 85 percent of the time. That’s slightly better than humans managed in a comparison study in which researchers showed people the same images and asked them to make the same call. And remember, that 85 percent came without anybody telling the computer what a trail is or what a trail looks like.
As you can see in the video below, the project is still in its infancy. The drone the researchers chose for the real-life experiment is just an off-the-shelf $300 model, but the navigational software built around the trained neural network ran on a laptop connected to the drone via WiFi, meaning someone had to run behind the drone the entire time.
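In other words, the loop is: the drone streams camera frames over WiFi, the laptop runs the network on each frame, and steering commands go back the other way. A bare-bones sketch of that laptop side, with every interface name below a placeholder rather than the team’s actual API, could look like this:

```python
# Placeholder sketch of the laptop-in-the-loop control step. get_frame, send_yaw,
# the class order, and the yaw angles are assumptions; only the structure is the
# point: grab a frame over WiFi, run the trained network, steer back toward the trail.
import torch

CLASSES = ["left", "center", "right"]                      # assumed order of the three classes
YAW_DEG = {"left": +15.0, "center": 0.0, "right": -15.0}   # sign convention is an assumption

def control_step(model, get_frame, send_yaw):
    frame = get_frame()                       # one camera frame streamed from the drone
    with torch.no_grad():
        class_id = model(frame.unsqueeze(0)).argmax(dim=1).item()
    send_yaw(YAW_DEG[CLASSES[class_id]])      # turn toward the trail, or hold course
```

Shrinking that loop so it runs on the drone itself is exactly the next step the researchers describe.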
Down the road, researchers think the final software will be lightweight enough to run on a drone’s onboard computer. Then you’d just have to teach it one more thing: what a human looks like. From there, SAR teams could deploy a whole fleet of these machines to fly along trail networks and report back if they see anything that looks like a missing person. Issues of battery life, long-range radio signals, and whirring noises disturbing wildlife will have to be addressed, but for a beta version, they’ve come a long way.