For research on campus safety, Chengbin Deng, associate professor of geography, used a drone to take thousands of aerial images of campus at night, then put them together in a mosaic.
Ten years ago, taking an aerial photo of the Binghamton University campus meant hiring an airplane and hoping for sunshine. The photos were pretty and showed why the main road around campus is called "the brain."
Two years ago, Chengbin Deng, associate professor of geography, used a drone to take thousands of aerial images of campus over two nights and then used a computer program to turn the images into a mosaic.
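The stitching step itself is well-trodden territory. A minimal sketch using OpenCV's built-in stitcher gives the flavor of it; the specific software Deng's team used isn't named here, and the file paths below are placeholders.

# Minimal sketch: stitching overlapping aerial frames into one mosaic with OpenCV.
# The tool actually used is not named in the article; paths are hypothetical.
import glob
import cv2

frames = [cv2.imread(p) for p in sorted(glob.glob("night_flight/*.jpg"))]

# SCANS mode suits straight-down aerial imagery better than panorama mode.
stitcher = cv2.Stitcher.create(cv2.Stitcher_SCANS)
status, mosaic = stitcher.stitch(frames)

if status == cv2.Stitcher_OK:
    cv2.imwrite("campus_mosaic.png", mosaic)
else:
    print(f"Stitching failed with status code {status}")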
Still a pretty picture, but it's more than that.
It's an example of how drones are inspiring new ways of looking at old problems and finding new challenges to be solved. To be clear, the drones, or unmanned aerial vehicles (UAVs), are simply the platforms on which cameras, sensors and microprocessors can be mounted. But Binghamton University faculty who use sensors in fields ranging from anthropology to engineering to geography to systems science all agree: Drones are game changers.
Seeing beyond the surface
The nighttime image of campus is a visual reference point for Deng, who does environmental criminology research, and two collaborators who are looking at what it takes to make a sustainable community. Specifically, they are focusing on safety.
"If we want a community to be sustainable, we have to make it safe," says Yu Chen, associate professor of electrical and computer engineering.
Making it safe means understanding the landscape of the community and how people behave within it. For instance, the nighttime image shows where lighting is ample and where it鈥檚 lacking, so more lights or increased patrols might be warranted.
But when it comes to human behavior, things get complicated.
The most straightforward application, Chen says, is to fly a drone, mounted with a video camera, over a location that a police officer cannot easily or safely approach, and stream images to a dispatcher for review.
Consider a large outdoor concert: Suddenly the crowd shifts and turns away from the stage. Is it a fight, or are people dancing like dervishes? If a drone can fly over the crowd and transmit images in real time, then a human can decide if the people are throwing punches or clapping in rhythm.
"When something unusual happens, you might not want to dispatch police immediately, but you want to at least make someone in charge aware of what's going on. It's called situational awareness," Chen says.
"A drone is one of the most important devices for us to move toward the future of smart cities. Many people think a smart city is one that has the internet everywhere. But a smart city is one in which you can collect data from the environment, analyze it and use it to make a decision instantly."
Making drones "smart"
Of course, it鈥檚 not that easy.
Drones have limited flying time based on battery size, about 20 minutes on average. As the size and weight of sensors have dropped, making them better adapted to mounting on a drone, the amount of information they can collect has grown. So, a drone doing surveillance will have to send a huge amount of video data back to a server on the ground before it gets to a computer screen where a human can assess what's going on.
To streamline this, Chen and his group are using a process called edge computing to turn the drone into a "smart device," meaning the drone carries a small, single-board computer that can process the video data right in the air. Algorithms built to recognize aberrations in predictable behavior can immediately issue an alert that tells a human, "Hey, pay attention to this!"
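In rough terms, that onboard step could look like the sketch below: a loop running on the drone's single-board computer that watches for motion it doesn't expect and only then raises a flag. The threshold and the alert hook are illustrative stand-ins, not Chen's actual algorithms.

# Sketch of the "smart device" idea: process frames on board and alert only when
# motion departs from what is typical, instead of streaming everything down.
import cv2

MOTION_THRESHOLD = 0.05  # fraction of changed pixels that counts as "unusual" (assumed)

def alert_dispatcher(frame, score):
    # Placeholder: a real system might send a snapshot or message to the ground.
    print(f"ALERT: unusual activity, motion score {score:.3f}")

cap = cv2.VideoCapture(0)  # onboard camera
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=False)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)               # foreground pixels = recent change
    score = cv2.countNonZero(mask) / mask.size   # fraction of the frame in motion
    if score > MOTION_THRESHOLD:
        alert_dispatcher(frame, score)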
Paying attention has been Timothy Faughnan's job for the 37 years he's spent in law enforcement on the Binghamton University campus, including 10 years as chief of police. He is now associate vice president for emergency services.
"A lot of it has to do with familiarity. Police generally work the same shift in the same patrol zone and, over time, learn what is normal and what is not, like who parks when and where at a building. It becomes easy to recognize a pattern," he says.
So, when Chen wanted to create artificial intelligence to recognize patterns particular to a campus community, he asked Faughnan for help.
Faughnan says he remembers asking Chen, "What role am I really playing here? And he [Chen] said, 'It's like left brain, right brain. We need each other. You provide real-world stuff and I provide technical stuff.'"
They started with basic observation skills right from the police manual. "We trained artificial intelligence how to think like a new officer," Faughnan says.
But what does a veteran police officer bring to the equation? Chen and a graduate assistant asked Faughnan a lot of questions, like "Why would you investigate this but not that?"
"They were taking what I was telling them and thinking in terms of what arguments AI is going to make to determine, 'If I see this, do I do that or not do that?'
"And remember, nobody is 'seeing' anything," he says.
The algorithms recognize movements and patterns, not individuals.
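As a loose illustration, and not the team's actual system, the kind of "new officer" rule Faughnan describes, such as a car parked at a building outside its usual hours, could be encoded in a few lines; the field names, hours and pattern store here are hypothetical.

# Illustrative only: one rule of the "does this fit the learned pattern?" sort.
from dataclasses import dataclass

@dataclass
class Observation:
    location: str   # e.g. "Library lot" (hypothetical)
    hour: int       # 0-23, local time
    event: str      # e.g. "vehicle_parked"

# "Normal" hours learned from past observations, keyed by (location, event).
normal_hours = {
    ("Library lot", "vehicle_parked"): range(7, 23),
}

def worth_a_look(obs: Observation) -> bool:
    """Flag an observation that falls outside the learned pattern."""
    usual = normal_hours.get((obs.location, obs.event))
    return usual is not None and obs.hour not in usual

print(worth_a_look(Observation("Library lot", 3, "vehicle_parked")))  # True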
Protecting privacy
As important as knowing what to look at is knowing what not to look at. Privacy is a vital concern for the team.
Chen is working on a way to keep drones from turning into peeping toms.
"If you fly a drone close to windows, there will be some algorithm that says, 'This is a window, and do not remain to look at it,'" Chen says. "Or when there is someone in their backyard, maybe sunbathing nude, we can denature the data before the image is sent back."
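A rough sketch of that denaturing idea, with a stand-in detector rather than any model the team has published, would blur sensitive regions before a frame ever leaves the drone.

# Sketch: blur any region a detector marks as private so raw pixels never leave
# the drone. detect_private_regions() is a hypothetical stand-in detector.
import cv2
import numpy as np

def detect_private_regions(frame):
    # Stand-in returning (x, y, w, h) boxes for sensitive areas (e.g., windows).
    return [(100, 50, 80, 120)]

def denature(frame):
    for (x, y, w, h) in detect_private_regions(frame):
        roi = frame[y:y + h, x:x + w]
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)
    return frame

frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for a captured frame
safe_frame = denature(frame)
# Only the denatured frame would be sent to the ground station.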
It will be imperative that policymakers, stakeholders and communities understand the upside and downside of drones, says Deng. The technology will eventually change our lives and will require policies.
"We don't want to monitor everyone. We want to make sure we have a safe campus without violating people's privacy," Deng says.