Group

Data management and processing tools / Feed

Conservation tech work doesn't stop after data is collected in the field. Equally important to success is navigating data management and processing tools. For the many community members who deal with enormous datasets, this group is an invaluable resource for trading advice, discussing workflows and tools, and sharing what works for you.

event

AniMove Summer School 2025

AniMove is a collective of international researchers with extensive experience in animal movement analysis, remote sensing, and conservation. The AniMove Workshop is a two-week intensive training course for...

0
See full post
discussion

Prototype for exploring camera trap data

Hi, I would like to start a discussion around a prototype that aims to improve how camera trap data is consumed. How is it different (in theory) from existing tools? I...

36 4
  • Regarding species richness, isn't it covered by the activity tab, where we can see the entire list of detected species? What else do you think would be helpful? I can imagine a better UI focused on species, with more information for each species and easier search, but the raw info would be roughly the same.

I think the app sort of covers this with the pie-chart overlay on the Leaflet map in the activity tab. However, it would be nice to have a more direct way of visualizing species richness (e.g., scaling the radii of the circle markers by the number of species detected). In addition, you may want to think about visualizing simple diversity indices: alpha diversity captures the species richness at each trap, gamma diversity summarizes the total richness in the area, and beta diversity assesses how different species compositions are between traps. Note: I do not use diversity indices often enough to provide more specific guidance. @ollie_wearn is this what you were referring to?
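For concreteness, a minimal sketch of those three indices computed from a detection table, assuming pandas and Whittaker's multiplicative definition of beta diversity (the trap and species names are made up):

```python
import pandas as pd

# Hypothetical detection table: one row per (trap, species) detection event
detections = pd.DataFrame({
    "trap":    ["T1", "T1", "T2", "T2", "T2", "T3"],
    "species": ["tapir", "ocelot", "tapir", "paca", "ocelot", "paca"],
})

# Alpha diversity: species richness at each trap
alpha = detections.groupby("trap")["species"].nunique()

# Gamma diversity: total species richness across the study area
gamma = detections["species"].nunique()

# Whittaker's beta diversity: how much composition turns over between traps
beta = gamma / alpha.mean()

print(alpha.to_dict(), gamma, round(beta, 2))
```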

 

  • Regarding occupancy, that makes sense to me. The only challenge is that I'm not using Python or R at the moment, because we use JavaScript for easier user interfaces with web technologies. It's still possible to add Python support, but I'd delay that as long as possible to keep things simple. Anyway, it's a technical challenge that I'm pretty sure I can solve.

I can help you with setting up the models in R, Python, or Stan/C++. But I have no idea how to overcome the technical challenge you are referring to. I agree with @ollie_wearn that allowing variable selection and model building would take this too far. One thing I would suggest is to allow users to fit a spatially explicit version of the basic occupancy model (mind that these can be slow). This type of model leverages the correlations in species detections between trap locations to estimate differences in occupancy across the study area, rather than just estimating a single occupancy probability for the entire area.
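For reference, the non-spatial starting point is small enough to sketch: a single-season occupancy model fit by maximum likelihood, assuming a sites-by-visits detection matrix (the data and values here are invented):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit  # inverse logit

# Detection histories: rows are traps, columns are repeat visits (1 = detected)
Y = np.array([[1, 0, 1],
              [0, 0, 0],
              [1, 1, 0],
              [0, 0, 0]])

def neg_log_lik(params, Y):
    psi, p = expit(params)         # occupancy and detection probabilities
    k = Y.shape[1]                 # visits per site
    det = Y.sum(axis=1)            # detections per site
    lik = psi * p**det * (1 - p)**(k - det)  # occupied-site likelihood
    lik = lik + (det == 0) * (1 - psi)       # never-detected sites may be empty
    return -np.sum(np.log(lik))

res = minimize(neg_log_lik, x0=[0.0, 0.0], args=(Y,))
psi_hat, p_hat = expit(res.x)
print(f"occupancy ≈ {psi_hat:.2f}, detection ≈ {p_hat:.2f}")
```

The spatially explicit version adds a spatial random effect on the occupancy term, which is where the extra computational cost comes from.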

 

  • What about relative abundance? My readings always mention occupancy, species richness, abundance, and density. I've read and asked enough about density to know that it's hard for unmarked animals and might not be necessary. If I understood correctly, relative abundance can give similar insights, and as a conservationist you probably want to know the trend in relative abundance over time.

Yes, I would give users a UI switch that lets them pick one of the following to visualize on the Leaflet map (and in the bar chart alongside it in the activity tab):

  1. the number of observations
  2. the number of individuals detected
  3. the relative abundance index, or RAI (based on 1.)
  4. the RAI (based on 2.)
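A minimal sketch of how options 1 and 3 might be computed per camera, assuming a pandas summary table and the common detections-per-100-trap-nights convention (column names and values are made up):

```python
import pandas as pd

# Hypothetical per-trap summary
df = pd.DataFrame({
    "trap":        ["T1", "T2", "T3"],
    "detections":  [14, 3, 27],     # option 1: number of observations
    "trap_nights": [120, 95, 130],  # sampling effort at each camera
})

# RAI (option 3): detections per 100 trap-nights, a common effort correction
df["rai"] = df["detections"] / df["trap_nights"] * 100
print(df)
```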

 

Regarding density: you could add a tab that calculates density using the Random Encounter Model (REM), which is often used to estimate the density of unmarked animals without information on recaptures.
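For what that tab would compute, here is a sketch of the REM point estimate from Rowcliffe et al. (2008); all parameter values below are placeholders, and a real tab would also need uncertainty estimates (e.g., by bootstrapping):

```python
import math

# Random Encounter Model: D = (y / t) * pi / (v * r * (2 + theta))
y = 40        # independent detections
t = 1000.0    # survey effort (camera-days)
v = 2.5       # average animal day range / speed (km per day)
r = 0.01      # detection zone radius (km)
theta = 0.7   # detection zone angle (radians)

density = (y / t) * math.pi / (v * r * (2 + theta))
print(f"estimated density ≈ {density:.1f} animals per km²")
```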

 

Regarding activity patterns: I would also add a tab where users can visualize diel or annual activity cycles (often called activity patterns), computed as in the activity R package (or integrate this in an existing tab). And maybe even allow computing overlap in daily activity cycles among selected species.
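The underlying computation is compact; here is a hedged Python sketch (not the activity package itself) of a circular kernel density per species plus a Ridout & Linkie style overlap coefficient, with invented detection times and a fixed kernel concentration:

```python
import numpy as np
from scipy.stats import vonmises

def diel_density(times_h, grid):
    """Circular kernel density of detection times (hours 0-24) on a radian grid."""
    centers = np.asarray(times_h) / 24 * 2 * np.pi
    return np.mean([vonmises.pdf(grid, kappa=8, loc=c) for c in centers], axis=0)

grid = np.linspace(0, 2 * np.pi, 512)
sp1 = diel_density([5.5, 6.0, 6.3, 18.9, 19.4], grid)  # crepuscular-ish species
sp2 = diel_density([11.2, 12.0, 13.5, 14.1], grid)     # diurnal species

# Overlap coefficient: area under the pointwise minimum of the two densities
dx = grid[1] - grid[0]
delta = np.sum(np.minimum(sp1, sp2)) * dx
print(f"activity overlap Δ ≈ {delta:.2f}")
```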

 

If you manage to include all of these, then I think your app covers 90% of the use cases.

Some other features worth discussing:

  1. merging/comparing multiple projects
  2. in line with the activity-cycle overlap above, allowing users to compute spatial overlap between two species, or even spatio-temporal overlap

 

@Jeremy_ For the Python implementation of basic occupancy models (as suggested by @ollie_wearn), please refer to these two projects:

I second @martijnB's suggestion to use spatially explicit occupancy models (as implemented in R, e.g., https://doserlab.com/files/spoccupancy-web/). However, this would need to be added to both of the aforementioned Python projects.

This is a lively and informative discussion; I would very much like to contribute if there is active development work on this.
I have recent experience using the Model Context Protocol (MCP) to integrate various tools and data repositories with LLMs like Claude. I believe this could be a good path, whereby we can do the following:
1. use the images and labels, along with any metadata, and chunk/index/store them in a vector DB
2. integrate with existing data sources by exposing the data through an MCP server
3. use MCP-friendly LLM clients (like Claude) to query, visualize, and do other open-ended things, leveraging the power of LLMs and camera trap data from various sources
 

Regards,

Ajay
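As a concrete illustration of step 2 in the list above, here is a minimal, hypothetical camera trap MCP server using the official Python SDK's FastMCP helper; the server name, tool, and in-memory data are illustrative assumptions, not an existing service:

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("camera-trap-data")

# Stand-in for a real database or vector-store query (step 1 of the list above)
DETECTIONS = [
    {"site": "A1", "species": "leopard", "timestamp": "2024-06-01T02:13:00"},
    {"site": "B2", "species": "porcupine", "timestamp": "2024-06-01T22:40:00"},
]

@mcp.tool()
def list_detections(species: str) -> list[dict]:
    """Return all detections of the given species across sites."""
    return [d for d in DETECTIONS if d["species"] == species]

if __name__ == "__main__":
    # An MCP-aware client (e.g. Claude Desktop) can discover and call
    # list_detections (step 3 of the list above).
    mcp.run()
```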

See full post
discussion

The Boring Fund 2024 - MoveApps

We are honored to be among the winners of The Boring Fund 2024! Thank you WILDLABS and Arm for selecting our project. MoveApps is a free no-code analysis platform for...

7 8

We are pleased to inform you that we have now finalized points 2 and 3. Here are some details of the update:

  • App browser improvements:
    • Improved overview and search: we have added a description of each category
      and improved the search and filtering options.
    • Searching for Apps within a Workflow: we have added the option to include Apps
      that are not compatible with the IO type, making it easier to decide whether a
      translator App is needed to include one of the incompatible Apps.

  • Public Workflows improvements:
    • Improved overview: the public Workflows are now organized by categories,
      which can also be used for filtering.
    • More information: the details overview now contains the list of Apps included
      in each Workflow.
    • Sharing Workflows: when creating a public Workflow you will have to select one
      or more existing categories, but you can always request a new category.

Go and check it out in MoveApps!

We are pleased to inform you that we have implemented points 1 and 4, and with this have finalized the project. The latest improvements:

  • Improved findability of help documentation: we have started to populate the platform with links (question mark icon) to the relevant
    sections of the user manual.
  • The log files of each App can now be downloaded and, when an error occurs, sent directly to MoveApps support. Find more details here.

Again, a great thank you for giving us the opportunity to implement these changes. We think they have greatly improved the user-friendliness of MoveApps.

 

 

See full post
discussion

Prospective NSF INTERN 

Hello all, My name is Frank Short and I am a PhD Candidate at Boston University in Biological Anthropology. I am currently doing fieldwork in Indonesia using machine-learning...

1 2

My name is Frank Short and I am a PhD Candidate at Boston University in Biological Anthropology. I am currently doing fieldwork in Indonesia using machine-learning powered passive acoustic monitoring, focusing on wild Bornean orangutans (and other primates). I am reaching out because, as a student with a National Science Foundation Graduate Research Fellowship, I am eligible for the NSF INTERN program, which supports students in non-academic internships by covering a stipend and other expenses; the only caveat is that the internship must be in-person, not remote. I was wondering if any organizations in conservation technology would be interested in a full-time intern who would come with their own funding?

In addition to experience with machine learning and acoustics through training a convolutional neural network for my research, I have also worked with GIS, remote sensing, and animal movement data through other projects. Further, I have experience in community outreach both inside and outside of academic settings: I previously worked for the Essex County Department of Parks and Recreation in New Jersey for 3 years, where I created interpretive signs, exhibits, newsletters, brochures, and social media posts. Now, while doing my fieldwork in Indonesia, I have led hands-on trainings in passive acoustic monitoring placement and analysis, as well as given talks and presentations at local high schools and universities.

I would love to use this opportunity (while the funding still exists, which is uncertain moving forward due to the current political climate in the US) to exercise and develop my skills at a non-academic institution in the conservation technology sphere! If anyone has any suggestions or is part of an organization that would be interested in having me as an intern, please contact me here or via my email: fshort@bu.edu. Thank you!

Hi Frank, your work sounds incredibly valuable and well-aligned with current needs in conservation tech. With your strong background in machine learning, acoustics, GIS, and outreach, you’d be an asset to many organizations. I’d recommend looking into groups like Rainforest Connection, Wildlife Acoustics, or the Conservation Tech Directory (by WILDLABS)—they often work on acoustic monitoring and might be open to in-person internships, especially with funding already in place. Best of luck finding the right match—your initiative is impressive!

See full post
discussion

No-code custom AI for camera trap images!

Thanks to our WILDLABS award, we're excited to announce that Zamba Cloud has expanded beyond video to now support camera trap images! This new functionality...

3 3

When you process videos, do you not first break them down into a sequence of images and then process the images? I'm confused about the distinction between processing videos versus images here.

We do, but the way the models handle the images differs depending on whether they're coming from videos or static images. A quick example: videos provide movement information, which can be a way of distinguishing between species. We use an implementation of SlowFast for one of our video models that attempts to extract temporal information at different frequencies. If the model has some concept of "these images are time-sequenced", it can extract that movement information, whereas if it's a straight image model, that concept doesn't have a place to live. But a straight image model can use more of its capacity for learning, e.g., fur patterns, so it can perform better on single images. We did some experimentation along these lines and found that models trained specifically for images outperformed video models run on single images.
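As a toy illustration of that point (illustrative shapes only, not Zamba's actual code): a SlowFast-style video model takes inputs with an extra time axis, which is where motion information can live:

```python
import torch

# An image classifier sees (batch, channels, height, width); a video model
# sees (batch, channels, time, height, width). The time axis gives temporal
# layers something to learn motion from; a single still has no such axis.
image_batch = torch.randn(8, 3, 224, 224)      # 8 still images
video_batch = torch.randn(8, 3, 16, 224, 224)  # 8 clips of 16 frames each

print(image_batch.shape)  # torch.Size([8, 3, 224, 224])
print(video_batch.shape)  # torch.Size([8, 3, 16, 224, 224])
```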

Hope that helps clear up the confusion. Happy to go on and on (and on)...

See full post
discussion

The boring fund: Standardizing Passive Acoustic Monitoring (PAM) data - Safe & sound

Thanks to the Boring Fund, we are developing a common standard for Passive Acoustic Monitoring (PAM) data. Why it's important: PAM is rapidly growing, but a core bottleneck is the...

7 13

This is such an important project! I can't wait to hear about the results. 

Hey Sanne, awesome - we definitely need a consistent metadata standard for PAM.

If you haven't already, I would suggest sharing this on the Conservation Bioacoustics Slack channel and the AI for Conservation Slack channel. You would reach a lot of active users of PAM, including some folks who have worked on similar metadata efforts. 

If you're not a member of either one of those, DM me your preferred email address and I'll send you an invite!

Hello everyone,

Thank you all for your contribution!

You can read some updates about this project in this post.

Julia

See full post
article

Nature Tech for Biodiversity Sector Map launched!

Carly Batist and 1 more
Conservation International is proud to announce the launch of the Nature Tech for Biodiversity Sector Map, developed in partnership with the Nature Tech Collective! 

1 0
Thanks for sharing @carlybatist and @aliburchard! About the first point, lack of data integration and interpretation will be a bottleneck, if not a death blow, to the whole...
See full post
funding

Multiple grants

I have been a bit distracted the past months by my move from Costa Rica to Spain (all went well, thank you; I just miss the rain forest and the Ticos) and have to catch up on funding calls. Because I still have little...

2
See full post
discussion

Nature Tech Unconference - Anyone attending?

Hi all, anyone planning to attend the Nature Tech Unconference on 28th March at the London School of Economics Campus in London, UK? (the event is free to attend but...

8 1

The Futures Wild team will be there :)

See full post
discussion

Generative AI for simulating landscapes before and after restoration activities

Hi all. Has anyone come across any generative AI tools that could be trained and used to generate photorealistic landscapes (in a web application) from habitat maps and then re-...

1 0

Yep, we are working on it:

1. segment
2. remove unwanted ecosystem
3. get local potential habitat
4. generate
5. add to picture
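One possible shape for steps 2-5 is a segmentation mask plus diffusion inpainting; below is a hedged sketch assuming the Hugging Face diffusers library, with an illustrative model ID, file names, and prompt rather than a tested pipeline:

```python
import torch
from PIL import Image
from diffusers import StableDiffusionInpaintPipeline

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting", torch_dtype=torch.float16
).to("cuda")

init_image = Image.open("landscape.png").convert("RGB")  # current-state photo
# White pixels mark the unwanted ecosystem to replace (from step 1's segmenter)
mask = Image.open("segmentation_mask.png").convert("RGB")

result = pipe(
    prompt="restored native wetland with reed beds, photorealistic",
    image=init_image,
    mask_image=mask,
).images[0]
result.save("restored_landscape.png")  # step 5: composite back into the picture
```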

 

See full post
discussion

United Nations Open Source Principles

FYI, I just came across the United Nations Open Source Principles, which were recently adopted by the UN Chief Executive Board's Digital Technology Network (DTN): It has been...

1 5

All sound, though it would be nice if there were only 5!

See full post
discussion

Standard Threshold for "Habitat Use" in Avians?

Hi all! Background: I did bioacoustic sampling at 12 sites, roughly 18 days of recording at each site, over the course of a season (both birds and bats). In reviewing our species...

7 1

Is that not a statistical question, depending on the classification performance of the software (BirdNET), which itself may depend on the species and the variability of the call?

I myself have no practical experience with BirdNET, but I use a commercial product derived from BirdNET and found, in about 1,000 days, the following tail statistics: (22 × 1), (12 × 2), (6 × 3), (5 × 4) bird detections, i.e., 22 species with only a single detection, and so on. In total, 223 different species were classified, with blackbirds on top at 149,000 detections. Overall, there seems to be a nearly exponential decay in detections across species, which to me shows a problem with NN-based classification: there is no clean separation between 'signal' and 'noise' of the kind one can find in traditional signal processing.

From these statistics (the histogram), I would conclude that in the end there is no meaningful threshold, as there is no distinct 'misclassification floor' that could be used to define one.
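To illustrate the kind of check I mean, here is a hypothetical sketch (invented counts echoing the statistics above) that plots detections-per-species on log-log axes; a smooth, gapless decay would support the "no misclassification floor" conclusion:

```python
import numpy as np
import matplotlib.pyplot as plt

# Invented species-level detection counts: a few very common species plus
# a long tail of species with only 1-4 detections.
counts = np.array([149_000, 40_000, 9_000, 1_500, 400, 120]
                  + [4] * 5 + [3] * 6 + [2] * 12 + [1] * 22)

# Histogram of "detections per species"; a gap at the low end would suggest
# a natural threshold separating misclassifications from real detections.
plt.hist(counts, bins=np.logspace(0, 6, 25))
plt.xscale("log")
plt.yscale("log")
plt.xlabel("detections per species")
plt.ylabel("number of species")
plt.show()
```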

Consequently, you may have to consider context, inspect every suspect snippet, and let a human decide.

Hey Cortney, this is such a good question! 

First off - I'd be very careful about interpreting the number of BirdNET detections of a species as a vocalization index for that species. For example, let's say a recorder captures a lot of Species A, which sounds like Species B, but Species B isn't present at that site. You will often find that BirdNET (or any other sound ID model) reports many detections for both Species A and Species B, making it seem like both species are present. So the number of detections of a species at a site isn't necessarily a reliable indicator of species presence at that site.

One alternative to consider would be to verify species presence at each site on each day, then use the number of days the species was present as a threshold for habitat use. What threshold to use depends on the species; we've used three consecutive days of presence for some songbirds, for instance.

You could also remove dates when the species is expected to be only migrating through. However, this can get tricky... For example, I've observed that Cerulean Warblers that breed at a particular site will arrive there a week or two before migrants pass through at nearby non-breeding migration hotspots.

Doing a day-by-day analysis of presence makes checking machine learning detections easier than tallying the raw number of detections for each species. You can verify the presence of the species at each site and day by listening to the highest-scoring clips for each day for each species.
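A minimal sketch of that selection step, assuming BirdNET-style output loaded into a pandas DataFrame (the column names and values are hypothetical):

```python
import pandas as pd

# One row per detection, with the model's confidence score
det = pd.DataFrame({
    "site":    ["S01", "S01", "S01", "S01", "S02"],
    "date":    ["2024-06-01", "2024-06-01", "2024-06-02", "2024-06-02", "2024-06-01"],
    "species": ["CERW", "CERW", "CERW", "CERW", "CERW"],
    "score":   [0.91, 0.85, 0.40, 0.77, 0.62],
    "clip":    ["a.wav", "b.wav", "c.wav", "d.wav", "e.wav"],
})

# Keep the 5 highest-scoring clips per species, site, and day for manual review
to_review = (det.sort_values("score", ascending=False)
                .groupby(["species", "site", "date"])
                .head(5))
print(to_review)
```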

If you have 18 days of recording at 12 sites and you listen to the top 5 highest-scoring clips per species per day, that's at most 1,080 five-second clips per species. In my experience, reviewing that many clips takes only a few hours using a Jupyter notebook like this one:

See full post
discussion

ICCB 2025 – Let’s Connect!

Hi Everyone, I'm excited to be attending my first ICCB 2025 as a student presenter and early-career researcher! My work sits at the intersection of computational epidemiology and...

1 1

Hi everyone, I'm excited to become a member of WILDLABS! I'm currently working on my master's thesis, focusing on dormouse conservation. My research explores the behavioral responses of dormice to temperature and habitat patterns using camera trap data.

Additionally, I'd like to incorporate agent-based modeling to simulate species behavior. However, I'm a bit unsure about how to effectively apply such models for prediction. If anyone here has experience with this kind of modeling, I'd love to connect and discuss!
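In case it helps frame the question, here is a toy agent-based sketch (all rules and parameters invented) in which simulated dormice are more likely to move on warmer nights; a real model would calibrate such rules against the camera trap data:

```python
import random

class Dormouse:
    """Toy agent: moves along a 1-D hedgerow when active."""
    def __init__(self):
        self.position = 0.0

    def step(self, temperature_c):
        # Invented rule: warmer nights raise the probability of being active
        p_active = min(1.0, max(0.0, (temperature_c - 5.0) / 15.0))
        if random.random() < p_active:
            self.position += random.uniform(-1.0, 1.0)

# Simulate 100 agents over four nights with different temperatures
agents = [Dormouse() for _ in range(100)]
for temperature in [8.0, 12.0, 16.0, 10.0]:
    for agent in agents:
        agent.step(temperature)

mean_disp = sum(abs(a.position) for a in agents) / len(agents)
print(f"mean absolute displacement after 4 nights: {mean_disp:.2f}")
```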

Looking forward to learning from you all.

Best regards,

See full post
event

Do more with Data: Online Workshop

Are you looking to make better use of data in your work? Nature FIRST is organising a free online workshop exploring how technology, data, and collaboration can enhance conservation efforts. No prior data experience is...

0
See full post
discussion

Sea turtle Bioacoustics Project

Hi everyone, I have an opportunity to work with Green sea turtles and Olive ridleys off the coast of Sri Lanka; this will be my first bioacoustics conservation project. I would...

5 0

Hi Sam, I did my master's on hatchling turtle vocalisations and their role in nest emergence behaviours (currently under review for publication). I recorded nest emergence behaviour in situ using microphones and camera traps. I worked with snapping turtles, but the methods could be quite useful. I would be happy to share my thesis if that would be helpful.

There are a few sea turtle papers that describe hatchling vocalisations, but not many experiments testing hypotheses for these vocalisations.

Here are some papers that could help get you started:

Shoot me a message if interested in chatting more :)

 

Hello Sam, great work. I would like to see the paper when it comes online, and to know more about the device. Best, Zahir

See full post