
AI for Good

Minimizing wildlife conflicts is essential to both wildlife conservation and agricultural production.


Private working lands (65% of Montana’s 93 million acres) are more essential than ever in stabilizing land use, fighting climate change, and providing invaluable ecosystem services. Wildlife need the connected, open landscapes that private working lands provide. However, across much of the western US, expanding wildlife populations, development, and land conversion place increasing pressure on private working lands by concentrating livestock and wildlife on a shrinking landscape, making conflict (depredation, disease transmission, resource competition, and the like) inevitable. These interactions disrupt operations and cause major agricultural losses and substantial economic damage, placing owners of working lands in opposition to wildlife and wildland protection.


Protecting livestock from predators and disease is a complex endeavor, and successful reduction of conflicts requires an analysis of the efficacy and economic efficiency of various techniques.


As part of a grant from the Montana State University Extension (in collaboration with the USDA and NRCS), the Upper Yellowstone Watershed Group is creating Vision AI models to help farmers, ranchers, biologists, and private landowners with co-existence strategies between wildlife, livestock, and humans living in the Greater Yellowstone area. In the summer of 2021, we will be deploying multiple Low-Power AI Trigger Cameras from Grizzly Systems, Azure Percept AI Video Devices from Microsoft, and Konexios' IoT Management Platform running on Microsoft Azure.


The main goal of this project is to harness developing artificial intelligence (AI) technology in modern, remotely deployed monitoring hardware, including trail cameras and unmanned aerial systems (drones), to provide a proof of concept for its use in preventing wildlife/livestock conflict. We propose to evaluate whether low-maintenance, battery-operated game cameras, powered video cameras equipped with image recognition (smart cameras), and drones that use visual and thermal sensors can accurately classify images and thermal video and provide real-time alerts when animals that pose a risk are present on working lands. Real-time information on when and where wildlife is interacting with livestock would shift the focus of wildlife-livestock conflicts from compensation to prevention, which has major implications for production agriculture and wildlife conservation.
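Evaluating that "accurately classify" claim is itself a concrete task: alerts only earn trust if the classifications behind them are routinely checked against what was really in the frame. Below is a minimal Python sketch of that kind of accuracy check; the label names and example predictions are illustrative assumptions, not project data.

```python
# Sketch of comparing camera classifications against human-verified labels.
# Labels and example pairs are illustrative, not project data.
from collections import Counter

def evaluate(pairs, target="grizzly_bear"):
    """pairs: (predicted_label, true_label) tuples from reviewed images."""
    tally = Counter()
    for predicted, true in pairs:
        if predicted == target and true == target:
            tally["tp"] += 1                 # correct alert
        elif predicted == target:
            tally["fp"] += 1                 # false alarm: erodes trust in alerts
        elif true == target:
            tally["fn"] += 1                 # missed bear: the costly failure mode
    precision = tally["tp"] / max(1, tally["tp"] + tally["fp"])
    recall = tally["tp"] / max(1, tally["tp"] + tally["fn"])
    return precision, recall

reviewed = [("grizzly_bear", "grizzly_bear"),   # hit
            ("elk", "grizzly_bear"),            # miss
            ("grizzly_bear", "black_angus")]    # false alarm
print(evaluate(reviewed))  # (0.5, 0.5)
```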

Consider the kinds of events we want to catch. Has a grizzly bear moved into an area with foraging cattle? Is a cow that is calving (birthing) exhibiting signs of stress, such as standing up and lying down repeatedly? Has an elk herd moved into an area of cattle during calving season, when disease transmission could be an issue? Or, as is often a mortality issue with cattle, has a cow rolled onto its back, which will result in suffocation in less than an hour? AI Vision technology can detect these events and then send alerts so that proactive mitigation can take place. If ranchers, private landowners, and agency staff can get real-time notification of these events, then non-lethal mitigation efforts can take place and the business can continue to operate.
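To make the alerting idea concrete, here is a minimal Python sketch of how labeled detections from a smart camera could be turned into notifications. The label names, confidence thresholds, and the send_alert helper are hypothetical placeholders for illustration, not the system we are deploying.

```python
# Sketch of detection-to-alert logic, assuming an upstream vision model has
# already labeled a frame. Labels, thresholds, and helpers are hypothetical.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # e.g. "grizzly_bear", "elk", "cow_on_back"
    confidence: float   # model score between 0 and 1
    camera_id: str      # which remote camera produced the frame

# Events treated as actionable, each with its own confidence threshold.
ALERT_RULES = {
    "grizzly_bear": 0.80,   # predator near foraging cattle
    "elk": 0.70,            # disease-transmission risk during calving
    "cow_on_back": 0.60,    # suffocation risk, so alert even at lower confidence
}

def send_alert(d: Detection) -> None:
    # Placeholder: in practice this might be an SMS, email, or push
    # notification routed through the IoT management platform.
    print(f"ALERT: {d.label} seen by {d.camera_id} ({d.confidence:.0%} confidence)")

def handle(d: Detection) -> None:
    threshold = ALERT_RULES.get(d.label)
    if threshold is not None and d.confidence >= threshold:
        send_alert(d)

# A high-confidence grizzly sighting triggers a notification;
# a low-confidence elk sighting does not.
handle(Detection("grizzly_bear", 0.91, "pasture-cam-03"))
handle(Detection("elk", 0.40, "pasture-cam-03"))
```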


While AI Vision technology is proven in many industries and is being deployed at a rapid pace, it has struggled to make an impact in remote and rugged settings like the ones we are discussing here, due to:

  1. power constraints

  2. the expensive data science and software development expertise (often PhD-level) required

  3. manageability at scale (from 1 device to billions)

  4. affordability (both hardware and software)

If we can solve for these (as we believe we can), then remote monitoring could truly be a killer app that improves agricultural efficiency while also promoting sustainable co-existence between humans and wildlife. For a complete overview of the project, visit our Tech for Ranching page or contact jeff@reedfly.com for more information.


The Grizzly Systems Trail Cam eliminates false positives, such as grass blowing in the wind, and lasts 6+ months on AA batteries.

And the value of this technology doesn't stop there. For example, in the summer of 2021 we will be deploying cameras to monitor recreational use of the Yellowstone River; in the same way cameras are used to count cars on a highway, we are automating the counting of boats.


Let's bring it down to earth with an example that uses Microsoft's Azure Percept to count bird species visiting a local bird feeder, part of a local project to create a vital-signs index for our area, similar to what Yellowstone National Park publishes regularly. Instead of looking for birds, a little imagination shows the same approach counting boats on the river, elk walking across a pasture, grizzly bears moving past a game trail camera, or a cow in distress.
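As a sketch of what "counting visits" means in code, the Python snippet below groups consecutive detections of the same label into a single visit, so one bird sitting at the feeder is not counted dozens of times. The gap length, timestamps, and species names are illustrative assumptions.

```python
# Sketch of turning a stream of labeled detections into visit counts.
# A "visit" is a run of detections of one label with no gap longer than
# GAP_SECONDS. Timestamps and species names below are illustrative.
from collections import defaultdict

GAP_SECONDS = 120  # assumed quiet period that separates two distinct visits

def count_visits(detections):
    """detections: iterable of (timestamp_seconds, label), sorted by time."""
    visits = defaultdict(int)
    last_seen = {}  # label -> timestamp of the most recent detection
    for ts, label in detections:
        if label not in last_seen or ts - last_seen[label] > GAP_SECONDS:
            visits[label] += 1  # a new visit starts
        last_seen[label] = ts
    return dict(visits)

# Three chickadee detections within two minutes count as one visit.
stream = [(0, "black-capped chickadee"), (30, "black-capped chickadee"),
          (90, "black-capped chickadee"), (600, "mountain bluebird")]
print(count_visits(stream))
# {'black-capped chickadee': 1, 'mountain bluebird': 1}
```

Swap the bird labels for "drift boat" or "elk" and the same counting logic covers the river and pasture use cases mentioned above.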


Doing this was literally as straightforward as steps one, two, and three below. For a live demo, click here. And for a deep dive into research that uses this technology to track individual birds, see Ferreira et al. (2020), "Deep learning-based methods for individual recognition in small birds," Methods in Ecology and Evolution.


Step 1: Build a Bird Feeder to Hold an Azure Percept Dev Kit



Step 2: Train an AI Model (without Code) using Microsoft Custom Vision
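The Custom Vision portal makes this step a point-and-click exercise with no code at all, but the same workflow can also be scripted. Below is a minimal sketch using the Custom Vision Python SDK; the endpoint, training key, project name, tag names, and image paths are placeholders, not the values used in this project.

```python
# Sketch of training a Custom Vision classifier from Python instead of the
# portal. Endpoint, key, project name, tags, and image paths are placeholders.
from azure.cognitiveservices.vision.customvision.training import CustomVisionTrainingClient
from msrest.authentication import ApiKeyCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"   # placeholder
TRAINING_KEY = "<your-training-key>"                                # placeholder

credentials = ApiKeyCredentials(in_headers={"Training-key": TRAINING_KEY})
trainer = CustomVisionTrainingClient(ENDPOINT, credentials)

# Create a project and one tag per species we want to recognize.
project = trainer.create_project("bird-feeder-demo")
chickadee = trainer.create_tag(project.id, "black-capped chickadee")

# Upload labeled example photos of that species. In practice Custom Vision
# needs at least two tags and a handful of images per tag before training.
with open("chickadee_01.jpg", "rb") as image:
    trainer.create_images_from_data(project.id, image.read(),
                                    tag_ids=[chickadee.id])

# Kick off training; the portal shows progress and per-tag precision/recall.
iteration = trainer.train_project(project.id)
print(iteration.status)
```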



Step 3: Watch the Computer Do the Work for You from the Comfort of your Desk
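"Watching from the comfort of your desk" mostly means watching the messages the device sends to the cloud. One way to do that, sketched below, is to read the device's telemetry from the IoT hub's Event Hub-compatible endpoint with the azure-eventhub Python package; the connection string, hub name, and message layout are assumptions for illustration.

```python
# Sketch of monitoring detection messages by reading the IoT hub's Event
# Hub-compatible endpoint. Connection string, hub name, and the exact
# message layout are placeholders/assumptions.
from azure.eventhub import EventHubConsumerClient

CONNECTION_STR = "<event-hub-compatible-connection-string>"   # placeholder
EVENTHUB_NAME = "<event-hub-compatible-name>"                 # placeholder

def on_event(partition_context, event):
    payload = event.body_as_json()          # assumed JSON telemetry
    print(f"partition {partition_context.partition_id}: {payload}")
    partition_context.update_checkpoint(event)

client = EventHubConsumerClient.from_connection_string(
    CONNECTION_STR, consumer_group="$Default", eventhub_name=EVENTHUB_NAME
)

with client:
    # starting_position="-1" replays the stream from the beginning.
    client.receive(on_event=on_event, starting_position="-1")
```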




