Ten years ago, taking an aerial photo of the Binghamton University campus meant hiring an airplane and hoping for sunshine. The photos were pretty and showed why the main road around campus is called “the brain.”
Two years ago, Chengbin Deng, associate professor of geography, used a drone to take thousands of aerial images of campus over two nights and then used a computer program to turn the images into a mosaic.
Still a pretty picture, but it’s more than that.
It’s an example of how drones are inspiring new ways of looking at old problems and finding new challenges to be solved. To be clear, the drones — or unmanned aerial vehicles (UAVs) — are simply the platforms on which cameras, sensors and microprocessors can be mounted. But Binghamton University faculty who use sensors in fields ranging from anthropology to engineering to geography to systems science all agree: Drones are game changers.
Seeing beyond the surface
The nighttime image of campus is a visual reference point for Deng, who does environmental criminology research, and two collaborators who are looking at what it takes to make a sustainable community. Specifically, they are focusing on safety.
“If we want a community to be sustainable, we have to make it safe,” says Yu Chen, associate professor of electrical and computer engineering.
Making it safe means understanding the landscape of the community and how people behave within it. For instance, the nighttime image shows where lighting is ample and where it’s lacking, so more lights or increased patrols might be warranted.
But when it comes to human behavior, things get complicated.
The most straightforward application, Chen says, is to fly a drone, mounted with a video camera, over a location that a police officer cannot easily or safely approach, and stream images to a dispatcher for review.
Consider a large outdoor concert: Suddenly the crowd shifts and turns away from the stage. Is it a fight — or are people dancing like dervishes? If a drone can fly over the crowd and transmit images in real time, then a human can decide if the people are throwing punches or clapping in rhythm.
“When something unusual happens, you might not want to dispatch police immediately, but you want to at least make someone in charge aware of what’s going on. It’s called situational awareness,” Chen says.
“A drone is one of the most important devices for us to move toward the future of smart cities. Many people think a smart city is one that has the internet everywhere. But a smart city is one in which you can collect data from the environment, analyze it and use it to make a decision instantly.”
Making drones “smart”
Of course, it’s not that easy.
Drones have limited flying time based on battery size, typically about 20 minutes. As sensors have shrunk in size and weight, making them better suited to mounting on a drone, the amount of information they can collect has grown. So a drone doing surveillance has to send a huge amount of video data back to a server on the ground before it reaches a computer screen where a human can assess what’s going on.
To streamline this, Chen and his group are using a process called edge computing to turn the drone into a “smart device,” meaning the drone carries a small, single-board computer that can process the video data right in the air. Algorithms built to recognize aberrations in predictable behavior can immediately issue an alert that tells a human, “Hey, pay attention to this!”
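The idea of onboard processing can be sketched in miniature. The article does not describe Chen’s actual algorithms, so the sketch below uses a deliberately simple stand-in: frame differencing, where the onboard computer compares consecutive video frames and raises an alert only when the scene changes abruptly. All names and the threshold value are illustrative assumptions, not details from the real system.

```python
# Minimal sketch of edge computing on a drone (illustrative only):
# rather than streaming every frame to the ground, the onboard
# computer analyzes frames locally and sends only alerts.
# Frames are modeled as flat lists of grayscale pixel values (0-255).

def frame_difference(prev, curr):
    """Mean absolute pixel change between two consecutive frames."""
    return sum(abs(a - b) for a, b in zip(prev, curr)) / len(curr)

def detect_anomalies(frames, threshold=40.0):
    """Flag frame indices where motion jumps past the threshold.

    Each flagged index is the kind of "Hey, pay attention to this!"
    alert the article describes; only alerts, not raw video, would
    need to leave the drone. The threshold is an assumed parameter.
    """
    alerts = []
    for i in range(1, len(frames)):
        if frame_difference(frames[i - 1], frames[i]) > threshold:
            alerts.append(i)
    return alerts

# A calm scene (tiny frame-to-frame changes) followed by a sudden shift,
# like a crowd abruptly turning away from a stage.
calm = [[100] * 16, [102] * 16, [101] * 16]
sudden = [[180] * 16]  # large intensity change
print(detect_anomalies(calm + sudden))  # -> [3]
```

A real system would use trained models rather than raw pixel differences, but the division of labor is the same: cheap computation in the air, human judgment on the ground.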
Paying attention has been Timothy Faughnan’s job for the 37 years he’s spent in law enforcement on the Binghamton campus, including 10 years as chief of police. He is now associate vice president for emergency services.
“A lot of it has to do with familiarity. Police generally work the same shift in the same patrol zone and, over time, learn what is normal and what is not — like who parks when and where at a building. It becomes easy to recognize a pattern,” he says.
So, when Chen wanted to create artificial intelligence to recognize patterns particular to a campus community, he asked Faughnan for help.
Faughnan says he remembers asking Chen, “What role am I really playing here? And he [Chen] said, ‘It’s like left brain, right brain. We need each other. You provide real-world stuff and I provide technical stuff.’”
They started with basic observation skills right from the police manual. “We trained artificial intelligence how to think like a new officer,” Faughnan says.
But what does a veteran police officer bring to the equation? Chen and a graduate assistant asked Faughnan a lot of questions, like “Why would you investigate this but not that?”
“They were taking what I was telling them and thinking in terms of what arguments AI is going to make to determine, ‘If I see this, do I do that or not do that?’
“And remember, nobody is ‘seeing’ anything,” he says.
The algorithms recognize movements and patterns, not individuals.
As important as knowing what to look at is knowing what not to look at. Privacy is a vital concern for the team.
Chen is working on a way to keep drones from turning into peeping toms.
“If you fly a drone close to windows, there will be some algorithm that says, ‘This is a window, and do not remain to look at it,’” Chen says. “Or when there is someone in their backyard, maybe sunbathing nude, we can denature the data before the image is sent back.”
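The “denaturing” Chen describes can be illustrated with a small sketch: pixels inside a flagged region (say, a detected window) are replaced with their average before the frame is transmitted, so the sensitive detail never leaves the drone. The function name, the grid representation, and the hard-coded region are all assumptions for illustration; in practice the region would come from a detector.

```python
# Illustrative sketch of denaturing imagery onboard before transmission:
# a flagged rectangular region is blanked to its mean intensity,
# destroying detail while keeping the rest of the frame intact.
# Frames are modeled as 2-D grids (lists of rows) of pixel values.

def denature(frame, region):
    """Replace a rectangle (top, left, bottom, right) with its mean value."""
    top, left, bottom, right = region
    pixels = [frame[r][c] for r in range(top, bottom) for c in range(left, right)]
    mean = sum(pixels) // len(pixels)
    out = [row[:] for row in frame]  # copy so the original is untouched
    for r in range(top, bottom):
        for c in range(left, right):
            out[r][c] = mean
    return out

frame = [[10, 20, 30, 40],
         [50, 60, 70, 80],
         [90, 100, 110, 120]]
# Suppose a detector flagged rows 0-1, columns 1-2 as a window.
safe = denature(frame, (0, 1, 2, 3))
print(safe[0])  # -> [10, 45, 45, 40]
```

Because the blanking happens on the drone’s own computer, the ground station only ever receives the redacted frame.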
It will be imperative that policymakers, stakeholders and communities understand the upsides and downsides of drones, Deng says. The technology will eventually change our lives, and that change will require thoughtful policies.
“We don’t want to monitor everyone. We want to make sure we have a safe campus without violating people’s privacy,” Deng says.