Global Feed

There's always something new happening on WILDLABS. Keep up with the latest from across the community through the Global view, or toggle to My Feed to see curated content from groups you've joined. 

Header image: Laura Kloepper, Ph.D.

discussion

We are releasing SpeciesNet

We're extremely excited to announce (and open source) SpeciesNet today, our AI model for species recognition in camera trap images, capable of recognising more than 2000 animals...

21 20

This is great news!



I am using rather high-resolution images and have just ordered some 4K (8MP) camera traps.

The standard MegaDetector run via Addax AI is struggling a bit with detecting animals that are relatively small within the frame, even though they cover quite a number of pixels. This follows naturally from the image resizing MegaDetector performs.

I have noticed:

but this seems not readily available in Addax AI. Is it somehow supported in SpeciesNet?

Cheers,

Lars

Hi Ștefan,

In my current case, I am trying to detect and count Arctic fox pups. Unfortunately, the Arctic fox does not seem to be included in SpeciesNet's training data, but even if it were, pups look quite different from adults.

After a quick correspondence with Dan Morris and Peter van Lunteren on the Addax AI GitHub, I was made aware of MegaDetector's image size option. It seems to help somewhat to run the detection at full resolution (in my case up to 1920×1080). I have the impression that I get more good detections, and also fewer false detections (even without repeat_detection_elimination), by using higher resolution.

Dan offered to have a look at my specific challenge so I sent him 10K+ images with fox pups.
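For readers following along, here is a rough back-of-the-envelope sketch of why resizing hurts small animals in the frame. This is not MegaDetector's or Addax AI's actual code, and the 1280 px resize target is an assumption; the point is simply how quickly the pixel area of a small animal shrinks when frames are downscaled.

```python
# Back-of-the-envelope: how many pixels a small animal keeps after detector resizing.
# Assumes the detector resizes the long side to 1280 px (an assumed value; check your
# MegaDetector / Addax AI settings) and that the source frames are 1920x1080.

def scaled_box_pixels(box_w, box_h, src_long_side=1920, det_long_side=1280):
    """Return the pixel area of a bounding box after proportional resizing."""
    scale = det_long_side / src_long_side
    return (box_w * scale) * (box_h * scale)

# A fox pup occupying roughly 60x40 px in the original 1920x1080 frame:
full_res = 60 * 40                      # 2400 px available at native resolution
resized = scaled_box_pixels(60, 40)     # ~1067 px after downscaling to 1280

print(f"native: {full_res} px, after resize: {resized:.0f} px "
      f"({resized / full_res:.0%} of the original area)")
```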

See full post
discussion

Remote Bat detector

Hi all, what devices are there on the market capable of recording bats that can also be connected to remotely? I am used to working with AudioMoths or SM4s. But I wonder if there are any...

7 0

You will likely need an edge-ML setup where a model runs in real time and only detections are sent. BirdWeather PUC and Haikubox do this, running BirdNET on the edge and sending over Bluetooth/Wi-Fi/SMS, but you'd have to have these networks already set up in the places where you want to deploy a device, which limits deployments in many areas with no connectivity.
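To make that pattern concrete, here is a minimal sketch of the record-classify-transmit loop. The three helper functions are placeholders, not any particular product's API; swap in your own recorder, model (e.g. a BattyBirdNET-style classifier), and uplink.

```python
# Minimal sketch of the edge-ML pattern: record short clips, classify on-device,
# and transmit only a small detection summary rather than the raw ultrasonic audio.
import random
import time

DETECTION_THRESHOLD = 0.8   # assumed confidence cutoff

def record_clip(seconds=5):
    """Placeholder: capture an audio buffer from the microphone."""
    return b"\x00" * seconds          # stand-in for real samples

def classify(clip):
    """Placeholder: run the on-device model and return (label, confidence)."""
    return "bat_call", random.random()

def send_detection(event):
    """Placeholder: transmit a detection summary over Wi-Fi/4G/LoRa, etc."""
    print("uplink:", event)

while True:
    label, confidence = classify(record_clip())
    if confidence >= DETECTION_THRESHOLD:
        send_detection({"label": label, "confidence": confidence,
                        "timestamp": time.time()})
    time.sleep(1)                      # duty-cycle between clips to save power
```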

So we are building some Bugg devices under licence from Imperial College, and I am making a mod to swap the mic out for a version of our Demeter microphone. This should mean we can use it with something like BattyBirdNET, as it's powered by a CM4. Happy to have a chat if you are interested. Otherwise it will likely need a custom solution, which can be quite expensive!

There are a lot of parameters in principle here: the size of the battery, how much time in the field is acceptable before a visit (once a week? once a month?), how many devices you need, how small the device and battery have to be, etc. I use VPNs over 4G USB sticks.

I ask because, in principle, I've built devices that can retrieve files remotely and record ultrasonic audio, though the microphones I tested with (Pettersson) are around 300 euros each. I've been able to record ultrasonic frequencies, and I believe it will also be possible to do TDoA sound localization with the output files if the same sound can be heard on four recorders.

But we make commercial devices, and such a thing would be a custom build. It would be nice to know what the demand for such a device is, to know at what point it becomes interesting.

See full post
article

Fires in the Serengeti: Burn Severity & Remote Sensing with Earth Engine

Fires in the Serengeti and Masai Mara National Parks have burned massive areas this year. With Google Earth Engine, it's possible to quantify burn severity using the normalized burn ratio function, then calculate the total...

1 0
This was originally presented on 24 April, 2025 as part of a Geospatial Group Cafe. We will post the recording and highlights from other speakers of that session soon!
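For anyone who wants to try this themselves before the recording is posted, a minimal Earth Engine Python sketch of the dNBR workflow might look like the following. The dates, region, and 0.27 severity threshold are illustrative placeholders, not the values used in the talk.

```python
# Minimal sketch: compute the Normalized Burn Ratio (NBR) before and after the
# fires, take the difference (dNBR), and sum the area above a severity threshold.
import ee

ee.Initialize()

region = ee.Geometry.Rectangle([34.5, -3.2, 35.5, -2.0])  # placeholder Serengeti box

def median_composite(start, end):
    return (ee.ImageCollection("COPERNICUS/S2_SR_HARMONIZED")
            .filterBounds(region)
            .filterDate(start, end)
            .median())

# NBR = (NIR - SWIR2) / (NIR + SWIR2); for Sentinel-2 these are bands B8 and B12.
pre_nbr = median_composite("2025-01-01", "2025-02-01").normalizedDifference(["B8", "B12"])
post_nbr = median_composite("2025-03-01", "2025-04-01").normalizedDifference(["B8", "B12"])
dnbr = pre_nbr.subtract(post_nbr)

# Area (in hectares) where dNBR exceeds a moderate-severity threshold.
burned = dnbr.gt(0.27)
area_ha = (burned.multiply(ee.Image.pixelArea()).divide(10_000)
           .reduceRegion(reducer=ee.Reducer.sum(), geometry=region,
                         scale=20, maxPixels=1e10))
print(area_ha.getInfo())
```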
See full post
Link

Sticky Pi: A smart insect trap to study daily activity in the field

Sticky Pis are scalable, smart sticky traps that use a high-frequency Raspberry Pi camera to automatically score when, which, and where insects were captured. Author: Quentin Geissmann, 2023

0
Link

Insect Detect: Build your own insect-detecting camera trap!

Sharing this website that provides instructions on DIY hardware assembly, software setup, model training and deployment of the Insect Detect camera trap that can be used for automated insect monitoring. Authors: Maximilian Sittinger, Johannes Uhler, Maximilian Pink, Annette...

1
discussion

Reducing wind noise in AudioMoth recordings

Hi everyone. I'm wondering if anyone has tips for reducing wind noise in AudioMoth recordings. Our study sites are open paddocks and can be subject to high wind. Many...

6 0

Just following up on this, we are suffering from excessive wind noise in our recordings. We have bought some dead cats, but our AudioMoths are in the latest Dev case (https://www.openacousticdevices.info/audiomoth).

In your collective experience, would anyone recommend trying to position the dead cat over the microphone on the AudioMoth itself, or covering the entry port into the device, from either the inside or the outside?

 

Cheers,

Tom

Hi Tom! I think the furry windjammer must be outside the casing to have the desired effect. It can be a bit tricky having this nice furry material that birds and other critters might be attracted to. It may be possible to make an outer wire cage to protect the windjammer; we once had to do this to protect a DIY AudioMoth case against foxes wanting to bite the case (no windjammer then). You may, however, create new wind noise from the cage itself... No-one said it had to be simple!
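Alongside a physical windjammer, a post-processing high-pass filter can knock down residual low-frequency rumble in recordings that are already affected, since most wind energy sits below a few hundred hertz. A minimal sketch using scipy and soundfile is below; the 250 Hz cutoff and the file names are assumptions, and the cutoff should sit below your target species' calls.

```python
# Complementary software step (not a substitute for a windjammer): high-pass
# filter a WAV file to reduce low-frequency wind rumble.
import soundfile as sf
from scipy.signal import butter, sosfiltfilt

def highpass_wav(in_path, out_path, cutoff_hz=250, order=4):
    audio, sr = sf.read(in_path)                                   # (frames, channels)
    sos = butter(order, cutoff_hz, btype="highpass", fs=sr, output="sos")
    filtered = sosfiltfilt(sos, audio, axis=0)                     # zero-phase filtering
    sf.write(out_path, filtered, sr)

# Placeholder file names for illustration only.
highpass_wav("paddock_recording.WAV", "paddock_recording_hp.WAV")
```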

See full post
discussion

Experience with AudioMoth Dev for Acoustic Monitoring & Localization?

Hi everyone, I am Dana, a PhD student. I’m planning to use AudioMoth Dev recorders for a passive acoustic monitoring project that involves localizing sound...

14 0

Hi Walter,

Thanks for your reply! It looks like the experiments found very minor time offsets, which is encouraging. Could you clarify what you mean by a "similar field setup"?

In my project, I plan to monitor free-ranging animals — meaning moving subjects — over an area of several square kilometers, so the conditions won't be exactly similar to the experimental setup described.

Given that, would you recommend using any additional tools or strategies to improve synchronization or localization accuracy?

Hi Ryan,

Thanks for your reply! I'm glad to hear that the AudioMoth Dev units are considered powerful.

Have you ever tried applying multilateration to recordings made with them? I would love to know how well they perform in that context.
 

On a more technical note, do you know if lithium batteries (such as 3.7V LiPo) can provide a reliable power supply for Dev units in high temperature environments (around 30–50°C)?

Thanks, 
Dana

Hi Dana,

"similar field setup" means that the vocalizing animal should be surrounded by the recorders and you should have at least 4 audiomoths recording the same sound, then the localization maths is easy (in the end it is a single line of code). With 3 recorders that receive the sound localization is still possible but a little bit more complicated. With 2 recorders you get only some directions (with lift-right ambiguity).

Given the range of movements, and assuming that you do not have a huge quantity of recorders to 'fence' the animals, I would approach the tracking slightly differently. I would place the AudioMoths in pairs, using a single GPS receiver powered by one recorder but with the PPS wire also connected to the other recorder. Both recorders, separated by up to 1 m, face the area of interest. For the analysis, I would then use each pair of recorders to estimate the angle to the animal. If you have the same sound at two locations, you will have 2 directions, which will give you the desired location. The timings at the different GPS locations may still contain errors, but each direction is based on the same clock, so the GPS timing errors are not relevant anymore. If you add a second microphone to the AudioMoths you can improve the direction estimate further. If you need more specific info or want to chat about details (that are not of general interest) you can PM me.
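To illustrate the pair-based approach described above: each pair converts a time difference of arrival into a bearing, and two bearings are intersected to estimate the animal's position. The pair positions, spacing, time differences, and angle conventions below are made-up example values, a sketch rather than a field-ready implementation.

```python
# Sketch: bearing from a recorder pair's time difference, then intersect two bearings.
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, approximate

def bearing_from_tdoa(dt, spacing_m=1.0):
    """Angle (radians) from the pair's broadside direction, from a time difference."""
    s = np.clip(SPEED_OF_SOUND * dt / spacing_m, -1.0, 1.0)
    return np.arcsin(s)

def intersect_bearings(p1, theta1, p2, theta2):
    """Intersect two bearing lines given origins p1, p2 and map angles theta1, theta2."""
    d1 = np.array([np.cos(theta1), np.sin(theta1)])
    d2 = np.array([np.cos(theta2), np.sin(theta2)])
    A = np.column_stack([d1, -d2])
    t = np.linalg.solve(A, np.asarray(p2) - np.asarray(p1))
    return np.asarray(p1) + t[0] * d1

# Example: two pairs 200 m apart, each facing the area of interest.
pair_a, pair_b = np.array([0.0, 0.0]), np.array([200.0, 0.0])
# Converting pair-relative bearings to map angles; the sign and 90-degree offset
# depend on each pair's orientation and are simplified here.
theta_a = np.pi / 2 - bearing_from_tdoa(+0.0016)
theta_b = np.pi / 2 + bearing_from_tdoa(+0.0017)
print(intersect_bearings(pair_a, theta_a, pair_b, theta_b))  # approx. (95, 145)
```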

See full post
discussion

No-code custom AI for camera trap images!

Thanks to our WILDLABS award, we're excited to announce that Zamba Cloud has expanded beyond video to now support camera trap images! This new functionality...

3 3

When you process videos, do you not first break them down into a sequence of images and then process the images? I'm confused about the distinction between processing videos versus images here.

We do, but the way the models handle the images differs depending on whether they're coming from videos or static images. A quick example: videos provide movement information, which can be a way of distinguishing between species. We use an implementation of SlowFast for one of our video models that attempts to extract temporal information at different frequencies. If the model has some concept of "these images are time-sequenced", it can extract that movement information, whereas in a straight image model that concept doesn't have a place to live. But a straight image model can use more of its capacity for learning e.g. fur patterns, so it can perform better on single images. We did some experimentation along these lines and did find that models trained specifically for images outperformed video models run on single images.

Hope that helps clear up the confusion. Happy to go on and on (and on)...
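For anyone curious what "temporal information at different frequencies" looks like in practice, here is an illustrative sketch of SlowFast-style frame sampling versus independent per-frame classification. The strides are arbitrary and this is not Zamba Cloud's actual code.

```python
# Sketch of the distinction above: a SlowFast-style video model consumes
# time-ordered frame stacks sampled at two rates, so motion is visible to it;
# a single-image model scores each frame on its own.
import numpy as np

def sample_pathways(num_frames, fast_stride=2, slow_stride=16):
    """Frame indices for the 'fast' (dense) and 'slow' (sparse) pathways."""
    fast_idx = np.arange(0, num_frames, fast_stride)
    slow_idx = np.arange(0, num_frames, slow_stride)
    return fast_idx, slow_idx

fast_idx, slow_idx = sample_pathways(num_frames=64)
print("fast pathway sees", len(fast_idx), "frames:", fast_idx[:5], "...")
print("slow pathway sees", len(slow_idx), "frames:", slow_idx)

# By contrast, a single-image classifier is applied to each frame independently,
# e.g. predictions = [classify(frame) for frame in frames], with no notion
# that the frames are time-sequenced.
```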

See full post
discussion

First battery-only test of the SBTS thermal smart camera system

Last night I connected my SBTS local AI, thermal-enabled smart camera to a battery and placed it outside overlooking a field. I hoped to get some video of deer, which I've seen...

5 0

Nice thermal camera study with the product you mention. I think it’s the first serious work with thermal AI models. Kudos to the developers.

You should be aware of the resolution difference between that unit and the one I present in this article. The one above is a FLIR Lepton with 160x120 pixel resolution, whereas the one I'm presenting is 640x512, roughly 17x more pixels. That's a lot of extra resolution.

The DOC AI unit above is cheaper than the unit I present, which also includes AI object detection.

I can offer a range of resolutions, currently 640x512, 384x288 and 1280x1024, as well as a large range of lenses, including zoomable lenses if you like, such as a 30-180mm thermal lens paired with a 1280x1024 thermal sensor.

Thanks for the insight Kim! It's awesome what you are doing! I am excited for any updates. 

My colleagues who are looking into getting a customised 2040 DOC AI Thermal camera need something that has the battery life to be left in the field for weeks, due to the remoteness of the survey sites.

Weeks of continuous inference would require a pretty big battery. I expect you would need some kind of customisation, and maybe quite a bit of compromise, to last weeks on a single battery. Good luck with that; power management is challenging.
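As a rough illustration of why, a back-of-the-envelope calculation follows. The 5 W average draw is an assumption for a thermal camera plus edge-AI board; substitute measured figures for the actual unit.

```python
# Back-of-the-envelope battery sizing for weeks of continuous inference.
AVG_DRAW_W = 5.0          # assumed average power draw (camera + inference)
DAYS_IN_FIELD = 21        # "a few weeks"
USABLE_FRACTION = 0.8     # don't fully discharge the pack
BATTERY_VOLTAGE = 12.0

energy_wh = AVG_DRAW_W * 24 * DAYS_IN_FIELD          # 2520 Wh consumed
pack_wh = energy_wh / USABLE_FRACTION                # 3150 Wh of capacity needed
pack_ah = pack_wh / BATTERY_VOLTAGE                  # ~262 Ah at 12 V

print(f"~{pack_wh:.0f} Wh (~{pack_ah:.0f} Ah at {BATTERY_VOLTAGE:.0f} V) "
      "before any duty-cycling or solar top-up")
```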

See full post
discussion

From Field to Funder: How to communicate impact?

Conservation involves a mosaic of actors — field practitioners, local communities, funders, government agencies, scientists, and more. Each one needs different levels of...

2 2

Great questions @LeaOpenForests !

I don't have concrete answers since I am not a stakeholder in any project in particular. Based on experience with research on the potential for a similar one-stop-shop for science metrics, I would suggest that there is no simple solution: different actors do need and have different views on presenting and viewing impact. This means possible gaps between what one group of actors needs and what another is willing or able to produce. One can hope, search, and aim for sufficient overlap, but I don't see how they would necessarily or naturally overlap.

Still, I would guess that if there are dimensions of overlap, they are time, space, and actor-networks.

I have posted about this in a different group, but I love boosting the impact of my communication through use of visuals. 

Free graphics relating to conservation technology and the environment are available at:

  1. National Environmental Science Program Graphics Library

    Graphics below of a feral cat with a tracking collar and a cat grooming trap are examples of symbols available courtesy of the NESP Resilient Landscapes Hub, nesplandscapes.edu.au.

  2. UMCES Integration and Application Network Media Library

See full post
discussion

What is the best light for attracting moths?

We want to upgrade the UV lights on our moth traps. We currently use a UV fluorescent tube, but we are thinking about moving to an LED setup, like the LepiLED or EntoLED. We think...

10 2

I found this thread on iNaturalist really helpful when considering options! Lots of cost-effective setups to consider. I only really do mothing, so this is moth-specific, but perhaps helpful for other insects.

I love that folks also mentioned using an additional flashlight or outward-facing light to draw in moths from farther away. I've tried that as well and it always seemed to boost the number of moths on my sheet.

For continuity, if the light goes away for more than a few seconds I feel like the spell is broken and they fly away. But this could be tested further. Curious if blinking makes a difference.

See full post
discussion

Free graphics for conservation tech communications

Visual communication means we’re all speaking the same language. Do you know of any conservation tech or general science graphics libraries? I find them helpful for presentations,...

2 1

Not directly conservation tech imagery, but we've used the open-to-contributions PhyloPic library and API on a few projects to get some cute and usable silhouettes based on taxonomies.

Thanks for this! That's great! Also, on a slightly different note: Unsplash is one of the better high-quality stock image websites in terms of licenses, and most images are free to download. Always be cautious of any species IDs, though; I have found it's better to just take my own photos, even if just on my phone.

Shutterstock vector graphics are not free, but I have found them to be great value for money, especially if you have Adobe Illustrator or similar and so can customise the graphics. They have a great range of graphics as well. You can do a month-to-month subscription for AUD $53 for 10 images/graphics per month.

See full post
Link

SpeciesNet: first impressions from 37k images in Namibia

I put together some initial experiences deploying the new SpeciesNet classifier on 37,000 images from a Namibian camera trap dataset and hope that sharing initial impressions might be helpful to others.

3
discussion

What software to use?

I am looking for reliable camera trap software to process images efficiently. Key considerations: ✅ Handles large datasets smoothly ✅ Allows for multiple people...

7 0
See full post