The Times Australia

Autonomous drones could speed up search and rescue after flash floods, hurricanes and other disasters

  • Written by Vijayan Asari, Professor of Electrical and Computer Engineering, University of Dayton

During hurricanes, flash flooding and other disasters, it can be extremely dangerous to send in first responders, even though people may badly need help.

Rescuers already use drones in some cases, but most require individual pilots who fly the unmanned aircraft by remote control. That limits how quickly rescuers can view an entire affected area, and it can delay aid from reaching victims.

Autonomous drones could cover more ground faster, especially if they could identify people in need and notify rescue teams.

My team and I[1] at the University of Dayton Vision Lab[2] have been designing these autonomous systems of the future to eventually help spot people who might be trapped by debris. Our multi-sensor technology mimics the behavior of human rescuers to look deeply at wide areas and quickly choose specific regions to focus on, examine more closely, and determine if anyone needs help.

The deep learning technology that we use mimics the structure and behavior of a human brain in processing the images captured by the 2D and 3D sensors embedded in the drones. It can process large amounts of data simultaneously to make decisions in real time.

Looking for an object in a chaotic scene

Disaster areas are often cluttered with downed trees, collapsed buildings, torn-up roads and other disarray that can make spotting victims in need of rescue very difficult. 3D lidar sensor technology, which uses light pulses, can detect objects hidden by overhanging trees.

Spotting people amid busy surroundings. University of Dayton Vision Lab, CC BY-ND[3]

My research team developed an artificial neural network system that could run in a computer onboard a drone. This system emulates some of the ways human vision works. It analyzes images captured by the drone’s sensors and communicates notable findings to human supervisors.

First, the system processes the images to improve their clarity[4]. Just as humans squint to adjust their focus, this technology takes detailed estimates of the darker regions in a scene and computationally lightens the images.
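The article does not describe the Lab's enhancement algorithm, but the general idea of computationally lightening dark regions can be sketched with a standard technique, gamma correction, which lifts dark pixel values far more than bright ones. This is an illustrative stand-in, not the Vision Lab's actual method:

```python
# Sketch: brighten the dark regions of a grayscale image with gamma
# correction. Gamma < 1 lifts dark pixel values much more than bright ones.
# Illustrative only; NOT the Vision Lab's published enhancement algorithm.

def brighten(image, gamma=0.5):
    """Apply gamma correction to a 2D grayscale image (values 0-255)."""
    return [[round(255 * (px / 255) ** gamma) for px in row] for row in image]

row = [10, 60, 200]
print(brighten([row]))  # [[50, 124, 226]] -- dark pixels are lifted the most
```

A pixel at 10 jumps to 50 while a pixel at 200 only rises to 226, which is why detail hidden in shadow becomes visible without washing out the bright areas.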

In a rainy environment, human brains use a brilliant strategy to see clearly: By noticing the parts of a scene that don’t change[5] as the raindrops fall, people can see reasonably well despite the rain. Our technology uses the same strategy, continuously investigating the contents of each location in a sequence of images[6] to get clear information[7] about the objects in that location.
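One common way to exploit "the parts of a scene that don't change" is a per-pixel temporal median over a short sequence of frames: a raindrop occludes any given pixel in only a minority of frames, so the median recovers the static background. The sketch below illustrates that idea with toy single-row frames; it is a hypothetical example, not the cited method:

```python
# Sketch: suppress transient occlusions (e.g. raindrops) by taking the
# per-pixel median over a short frame sequence. Hypothetical illustration
# of the "unchanging parts of the scene" idea, not the published algorithm.
from statistics import median

def temporal_median(frames):
    """frames: list of equal-sized 2D grayscale images (lists of lists)."""
    rows, cols = len(frames[0]), len(frames[0][0])
    return [[median(f[r][c] for f in frames) for c in range(cols)]
            for r in range(rows)]

# Three frames of a static scene; a raindrop (value 255) crosses a
# different pixel in each frame.
f1 = [[100, 255, 100]]
f2 = [[100, 100, 255]]
f3 = [[255, 100, 100]]
print(temporal_median([f1, f2, f3]))  # [[100, 100, 100]] -- raindrops vanish
```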

Confirming objects of interest

When rescuers search for people trapped in a disaster area, their minds imagine 3D views[8] of how a person might appear in the scene. That lets them detect a trapped person even if they have never seen someone in such a position before.

Confusing and dim lighting can make it hard to identify people. University of Dayton Vision Lab, CC BY-ND[9]

We employ this strategy by computing 3D models of people and rotating the shapes in all directions. We train the autonomous system to perform as a human rescuer would. That allows the system to identify people in various positions, such as lying prone or curled in the fetal position, even from different viewing angles and in varying lighting and weather conditions.
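Rotating a 3D shape "in all directions" to generate extra viewpoints comes down to applying rotation matrices to the model's points. The sketch below shows the simplest case, rotation about the vertical axis; the actual training pipeline and model format are not described in the article, so all names here are hypothetical:

```python
# Sketch: generate extra training viewpoints by rotating a toy 3D point
# cloud about the vertical (z) axis. Hypothetical illustration of view
# augmentation; the Lab's actual pipeline is not detailed in the article.
import math

def rotate_z(points, angle_deg):
    """Rotate 3D points (x, y, z) about the z axis by angle_deg degrees."""
    a = math.radians(angle_deg)
    c, s = math.cos(a), math.sin(a)
    return [(x * c - y * s, x * s + y * c, z) for x, y, z in points]

body = [(1.0, 0.0, 0.0), (0.0, 1.0, 1.7)]            # toy "person" point cloud
views = [rotate_z(body, a) for a in range(0, 360, 45)]  # 8 evenly spaced viewpoints
```

Repeating this with rotations about the other axes (and under varied simulated lighting) yields training examples for poses the system has never directly observed.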

The system can also be trained to detect and locate a leg sticking out from under rubble, a hand waving at a distance, or a head popping up above a pile of wooden blocks. It can tell a person or animal apart from a tree, bush or vehicle.

Putting the pieces together

During its initial scan of the landscape, the system mimics the approach of an airborne spotter: it sweeps the ground for possible objects of interest or regions worth a closer look, and then zooms in. For example, an aircraft pilot looking for a truck on the ground would typically pay less attention to lakes, ponds, farm fields and playgrounds, because trucks are unlikely to be in those areas. The autonomous technology employs the same strategy to narrow the search to the most significant regions in the scene.

Then the system investigates each selected region to obtain information about the shape, structure and texture of objects there. When it detects a set of features that matches a human being or part of a human, it flags that location, collects GPS data and senses how far the person is from other objects to provide an exact location.
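The two-stage search described above, first ranking candidate regions, then running a detector only on the most promising ones and flagging hits with their coordinates, can be sketched as follows. The region format, scoring function and detector here are hypothetical stand-ins, not the Lab's implementation:

```python
# Sketch of the two-stage search: rank candidate regions, then examine only
# the top few and report coordinates of detections. All names, scores and
# the toy detector below are hypothetical stand-ins.

def search(regions, score_region, detect_person, top_k=3):
    """regions: list of dicts with 'image' and 'gps' keys."""
    # Stage 1: focus on the regions most likely to contain a person.
    candidates = sorted(regions, key=score_region, reverse=True)[:top_k]
    # Stage 2: run the (expensive) detector only on those candidates.
    return [r["gps"] for r in candidates if detect_person(r["image"])]

regions = [
    {"image": "lake",   "gps": (39.74, -84.18), "clutter": 0.1},
    {"image": "rubble", "gps": (39.75, -84.19), "clutter": 0.9},
    {"image": "field",  "gps": (39.76, -84.20), "clutter": 0.3},
]
hits = search(regions,
              score_region=lambda r: r["clutter"],        # stand-in priority score
              detect_person=lambda img: img == "rubble",  # stand-in detector
              top_k=2)
print(hits)  # [(39.75, -84.19)]
```

Skipping the low-priority regions (the lake, in this toy example) is what lets the whole cycle finish quickly enough, about one-fifth of a second per the article, to be useful in real time.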

The entire process takes about one-fifth of a second.

This is what faster search-and-rescue operations can look like in the future. A next step will be to turn this technology into an integrated system that can be deployed for emergency response.

We previously worked with the U.S. Army Medical Research and Materiel Command[10] on technology to find wounded individuals in a battlefield who need rescue. We also have adapted the technology to help utility companies monitor heavy equipment[11] that could damage pipelines. These are just a few of the ways disaster responders, companies or even farmers could benefit from technology that can see as humans can see, especially in places humans can’t easily reach.

[Over 100,000 readers rely on The Conversation’s newsletter to understand the world. Sign up today[12].]

References

  1. ^ and I (scholar.google.com)
  2. ^ University of Dayton Vision Lab (www.udayton.edu)
  3. ^ CC BY-ND (creativecommons.org)
  4. ^ improve their clarity (www.udayton.edu)
  5. ^ the parts of a scene that don’t change (physics.stackexchange.com)
  6. ^ in a sequence of images (doi.org)
  7. ^ clear information (link.springer.com)
  8. ^ imagine 3D views (psycnet.apa.org)
  9. ^ CC BY-ND (creativecommons.org)
  10. ^ U.S. Army Medical Research and Materiel Command (mrmc.amedd.army.mil)
  11. ^ monitor heavy equipment (www.udayton.edu)
  12. ^ Sign up today (theconversation.com)

Read more https://theconversation.com/autonomous-drones-could-speed-up-search-and-rescue-after-flash-floods-hurricanes-and-other-disasters-167016
