
By Kathleen Wong, UC Natural Reserve System
Trail cameras have become essential tools for environmental monitoring. Relatively inexpensive and easy to operate, they enable scientists to spot local wildlife, observe animal behaviors, and occasionally even uncover the presence of rare species.
At the 41 reserves of the University of California Natural Reserve System, a network of protected lands enabling study of the natural environment, dozens of trail cameras are trained on watering holes and stands of wildflowers, squirrel feeders and birds’ nests; even, at one point, a typically desiccated clump of moss.
However, using a trail camera to look for wildlife is like dragging a trawl across a coral reef: you’re guaranteed to capture plenty of material you don’t want. Triggered by movement, trail cameras record far more images of skittering leaves, buzzing insects, and waving grasses than species of interest. It all adds up to a lot of useless files, and sorting through them can be a daunting chore.
A daunting pile of pictures
At Sedgwick Reserve in the Santa Ynez Valley, just one camera can accumulate several thousand high-resolution shots every few weeks. But the potential is tantalizing. Covering more than nine square miles, the reserve teems with large animals such as bears, golden eagles, coyotes, and deer.
To snatch glimpses of these animals, Grant Canova Parker, the reserve’s trail camera intern, “had to open every single picture. He’s spent hundreds of hours going through them,” says Sedgwick director Kate McCurdy. The shots are glorious: of deer stepping delicately through dappled forest, a bear bathing in midsummer with childlike abandon, and even a badger buddying up with a coyote. But after nearly a decade of camera trapping, “he had five hard drives full of photos” adding up to millions of images.
From farming to trail cam filtering
That was before Rich Wolski and Chandra Krintz, computer science professors at UC Santa Barbara, entered Sedgwick’s gates more than four years ago. The two wanted to use the reserve’s agricultural fields for their SmartFarm project, which develops ways for computers to help farmers extract insights from their sensor data.
While showing the scientists around Sedgwick, McCurdy mentioned the reserve’s outstanding trail camera program and the problem of winnowing down the shots. Krintz and Wolski realized immediately that this was a problem that computers could solve. With that, the “Where’s the bear?” project was born.
“The reserve needed something that’s constantly filtering out the stuff we don’t want, and forwarding only the good stuff,” Krintz says.
Bringing the cloud to the country
Their first hurdle was coping with Sedgwick’s remote location. If the cameras were located in the middle of a city, it might be feasible to transfer the images elsewhere for processing. But at Sedgwick, that kind of off-site analysis is next to impossible. Like many NRS reserves, it’s located in a sparsely populated community with a slow internet connection. Stuffing the reserve’s accumulated collection of trail cam pics through that pipe would take months at best.
Instead, the researchers got creative. “The camera traps are out there. The pictures are out there,” Wolski says. “So we thought, what if we sent the cloud to the images instead?”
A cloud is basically a collection of computers providing services. To bring that processing capacity to Sedgwick, Krintz and Wolski would have to install a miniature Amazon—what the scientists have dubbed an edge cloud—on site.
“We took that same interface as on Amazon Web Services, and we mirrored it on the edge cloud, so that browsers can interact,” Krintz says.
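To give a flavor of what that mirroring means in practice, here is a minimal sketch assuming a hypothetical on-site S3-compatible endpoint: the same client code that would talk to Amazon Web Services is simply pointed at the edge cloud instead. The endpoint address, bucket name, and credentials below are placeholders, not details published by the project.

```python
# Sketch: push trail-cam images to an on-site, S3-compatible "edge cloud".
# The endpoint URL, bucket name, and credentials are hypothetical.
import os
import boto3

# Point the standard AWS client at the local edge cloud instead of Amazon.
edge = boto3.client(
    "s3",
    endpoint_url="http://edge-cloud.local:9000",  # hypothetical on-site address
    aws_access_key_id="EDGE_ACCESS_KEY",          # placeholder credentials
    aws_secret_access_key="EDGE_SECRET_KEY",
)

BUCKET = "trail-cam-images"  # hypothetical bucket name

def upload_card(card_dir: str) -> None:
    """Upload every JPEG on a memory card to the edge cloud for analysis."""
    for name in os.listdir(card_dir):
        if name.lower().endswith((".jpg", ".jpeg")):
            edge.upload_file(os.path.join(card_dir, name), BUCKET, name)

if __name__ == "__main__":
    upload_card("/media/trailcam_card")  # hypothetical mount point
```

Because the interface matches Amazon’s, the same script works unchanged whether it targets the real cloud or the closet at Sedgwick.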
Moving the cloud to Sedgwick wasn’t as simple as it might sound. “Getting it to run out there by itself in a closet was a challenge. It gets hot. It gets cold. We had to make these systems more rugged,” Krintz says.
Animal ID with Google
The next task was to teach the computer to recognize animals. Machines learn by being fed example images, which they use to build a model of what a bear, deer, or coyote looks like. For training photos, Krintz and Wolski turned—where else?—to Google’s vast repository of animal photos.
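In broad strokes, that training step resembles transfer learning: start from a network pretrained on a huge public image collection, then retrain only a small classification head on the animal categories of interest. The sketch below, using TensorFlow’s stock MobileNetV2, illustrates the general recipe rather than the project’s actual code; the directory layout and class labels are assumptions.

```python
# Sketch: transfer learning for a handful of animal classes.
import tensorflow as tf

IMG_SIZE = (224, 224)

# Labeled training photos arranged as training_photos/bear, .../coyote, .../deer
train_ds = tf.keras.utils.image_dataset_from_directory(
    "training_photos", image_size=IMG_SIZE, batch_size=32
)

# Pretrained feature extractor; its weights stay frozen.
base = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet"
)
base.trainable = False

# Small classification head on top, the only part that learns here.
model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # MobileNetV2 expects [-1, 1]
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(len(train_ds.class_names), activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=5)
model.save("wheres_the_bear.keras")  # hypothetical model file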
Solving this problem, says Wolski, gets to the heart of their research interests. “We’re interested in the AI, in algorithms that have this property of universality.”
The code required for the model is relatively small, leaving plenty of space on the edge cloud to analyze a vast amount of camera data.
“It’s not only AI, and it’s not just architecture, and it’s not cloud. It’s all of the above,” Wolski says.
Instant notifications
Today, at least five trail cameras surveil the wildlife around Sedgwick, including at a watering hole and a pond near reserve headquarters. Intern Canova Parker manually collects the memory cards from some and plugs them into the edge cloud for analysis. But their latest, a camera trained on a raptor perch, automatically sends images to the edge cloud via a radio relay. Naturally, Krintz and Wolski call this experiment “The eagle has landed.”
Such real-time responses are ideal for those seeking rare species. The computer can notify a human immediately if it recognizes the target animal in an image. This way, “you didn’t have to wait until your intern hiked out weeks later, got the camera card, and looked at every pic to know your long-tailed weasel was here three months ago,” says McCurdy. “The end game is when the weasel shows up, the processing happens immediately, and sends you that picture” in time for you to track it in the wild.
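A watch-and-alert loop of that kind can be sketched in a few lines. Everything concrete below, including the drop folder fed by the radio relay, the saved model file, the label order, and the alert itself, is a hypothetical stand-in for however the Sedgwick system actually delivers its notifications.

```python
# Sketch: watch a folder for relayed images, classify each, alert on a target species.
import time
from pathlib import Path
import numpy as np
import tensorflow as tf

CLASS_NAMES = ["bear", "coyote", "deer"]   # assumed label order
TARGET = "bear"                            # species to alert on
WATCH_DIR = Path("/edge-cloud/incoming")   # hypothetical relay drop folder

model = tf.keras.models.load_model("wheres_the_bear.keras")  # hypothetical model file

def classify(path: Path) -> str:
    """Return the predicted species label for one image."""
    img = tf.keras.utils.load_img(path, target_size=(224, 224))
    batch = np.expand_dims(tf.keras.utils.img_to_array(img), axis=0)
    probs = model.predict(batch, verbose=0)[0]
    return CLASS_NAMES[int(np.argmax(probs))]

seen = set()
while True:
    for path in WATCH_DIR.glob("*.jpg"):
        if path not in seen:
            seen.add(path)
            if classify(path) == TARGET:
                # Stand-in for an email or text message to reserve staff.
                print(f"ALERT: {TARGET} spotted in {path.name}")
    time.sleep(10)  # poll every ten seconds
```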
The edge cloud has since learned to recognize more wildlife species, including deer and coyotes. It’s also getting the hang of counting the number of animals in a given image—an achievement that can be applied to census studies.
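One plausible way to count animals in a frame, shown here purely as an illustration, is to run an off-the-shelf object detector and count its confident detections. The Sedgwick counting model itself isn’t published; a real deployment would also filter detections by species label rather than counting every object.

```python
# Sketch: count animals in a frame with a generic pretrained detector.
import torch
import torchvision
from torchvision.io import read_image
from torchvision.transforms.functional import convert_image_dtype

# Pretrained detector that returns one bounding box per object it finds.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def count_detections(image_path: str, min_score: float = 0.8) -> int:
    """Count confident detections; each box above the threshold is one animal."""
    img = convert_image_dtype(read_image(image_path), torch.float)
    with torch.no_grad():
        output = model([img])[0]
    return int((output["scores"] > min_score).sum())

print(count_detections("quail_covey.jpg"))  # hypothetical image
```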

One quail, two quail, three quail…
Now Wolski and Krintz are looking into what other kinds of data AI can glean from the images. Their journey has produced surprises. Some problems easy for a human to solve, it turns out, can be next to impossible for a computer. An example: distinguishing fawns from adult deer.
In this case, perspective—the ability to gauge distance—is needed to estimate the deer’s size. “Maybe it’s a baby up close, but maybe it’s an adult if it’s far away,” Krintz says. But a computer has trouble gleaning 3D information from a flat, 2D image.
Recognizing individual animals is also on the wish list. That ability could ensure the same animal seen in different images isn’t counted twice. Recognizing facial differences, rather than comparing scars or body markings, seems most doable for species like deer. But once again, solving this problem requires image perspective, so that the software can pick out characteristics such as the relative depth of a snout, width of a jaw, or distance between the eyes.
“We have conversations with computer vision people periodically where they go, no, that’s too hard. But there’s a lot of ecological research that could be done immediately if someone can figure out how to do that,” Wolski says.
Recognizing a case of mange on a bear, for example, or recognizing the size of deer antlers, can all provide insights into the health of local wildlife.
Shifting to video, or deploying multiple, coordinated cameras in a single location, could help address these problems. With the cost of such technologies falling, solutions are coming within reach.
Describing behaviors to a computer
The holy grail of wildlife camera AI might be recognizing animal behaviors. Such a computer could automatically narrow thousands of images down to the vanishingly few frames that show fighting, mating, feeding, grooming, or other activities of interest.
Through their agricultural work, Krintz and Wolski have become aware of how a watchful computer could help ranchers spot a very particular behavior: cows having trouble giving birth.
“Apparently they wander off by themselves in a particular way. They’re not with the herd at a certain time, and they find someplace that’s sheltered as opposed to standing in the open,” Wolski says.
The hard part is telling the AI what to look for. “Agriculturalists have very elaborate models in their heads for what animals or plants do, but they’re not used to describing them,” Wolski says. “Exactly how does it walk? What time of day? There’s a lot of interviewing and intellectual cross-fertilization that has to take place before we can tackle these problems.”
“These questions are spurring new machine learning, new artificial intelligence, and could advance conservation science,” Krintz says.

Going beyond Sedgwick Reserve
Right now, the Where’s the bear? system is a one-off that exists only at Sedgwick. Building it required a lot of work and technical know-how, from setting up solar panels to supply electricity to adjusting the camera radios to communicate over hilly terrain.
Someday, though, Krintz and Wolski want their innovation to become a plug-and-play device. “We want to be able to package it like an appliance such as your refrigerator. You won’t have to have a PhD in cloud computing or understand what it’s doing. It just works,” Krintz says.
There’s certainly a need for better wildlife-spotting technologies around the NRS. For example, such a system could be used to monitor western snowy plovers as they hatch their chicks at Coal Oil Point Natural Reserve. Or to spot invasive species that have made landfall on Santa Cruz Island Reserve. Or even to protect reserve wildlife.
“If we had more cameras up on our remote areas, and could train the computer to look for rifles, that would cut down on poaching,” says McCurdy.
A launchpad for research
Sedgwick Reserve’s ethos of nurturing research, the computer scientists say, was instrumental in getting both SmartFarm and Where’s the bear? to work.
“None of this would have happened without the NRS,” Krintz says.
“Kate McCurdy, who is a saint, was willing to let us fail a lot, every day, for months,” Wolski adds. “We didn’t have to explain this wasn’t a commercial project. They knew we as researchers were going to be out there for some time when nothing was going to work. The expectations were completely appropriate for investigating. We can then expand from that to other collaborators who have different expectations.”
Their ability to apply AI at Sedgwick has also given the researchers street cred among farmers. Since then, through their SmartFarm project, Krintz and Wolski have worked with growers across California raising mandarins, almonds, pistachios, and even wine grapes. The edge clouds on those farms interpret climate sensor data to solve problems such as deciding when to automatically turn on frost-protection measures. That means a system that began as a way to find bears at Sedgwick Reserve has already helped feed California and the world.