Autonomous camera traps for insects provide a tool for long-term remote monitoring. These systems bring together cameras, computer vision, and autonomous infrastructure such as solar panels, mini computers, and data telemetry to collect images of insects.
With increasing recognition of the importance of insects as the dominant component of almost all ecosystems, there are growing concerns that insect biodiversity has declined globally, with serious consequences for the ecosystem services on which we all depend.
Automated camera traps for insects offer one of the best practical and cost-effective solutions for more standardised monitoring of insects across the globe. However, to realise this we need interdisciplinary teams who can work together to develop the hardware systems, AI components, metadata standards, data analysis, and much more.
This WILDLABS group has been set up by people from around the world who have individually been tackling parts of this challenge and who believe we can do more by working together.
We hope you will become part of this group where we share our knowledge and expertise to advance this technology.
Check out Tom's Variety Hour talk for an introduction to this group.
Learn about Autonomous Camera Traps for Insects by checking out recordings of our webinar series:
- Hardware design of camera traps for moth monitoring
- Assessing the effectiveness of these autonomous systems in real-world settings, and comparing results with traditional monitoring methods
- Designing machine learning tools to process camera trap data automatically
- Developing automated camera systems for monitoring pollinators
- India-focused projects on insect monitoring
Meet the rest of the group and introduce yourself on our welcome thread - https://www.wildlabs.net/discussion/welcome-autonomous-camera-traps-insects-group
Group curators
- @tom_august (he/him): Computational ecologist with interests in computer vision, citizen science, open science, drones, acoustics, data viz, software engineering, public engagement
- Conservation biologist eager to develop AI skills
- Tech for Conservation
- @Cris (he/him), Octophin Digital: Senior Developer at Octophin Digital and constant maker.
- @VijayKarthick (He/Him), Nature Conservation Foundation: I'm a PhD student from India, interested in utilising bioacoustics and technology to answer ecological questions. I'm a frog nerd :)
- @Valentin_Stefan (He/Him): Interested in emerging technologies related to camera traps for pollinators
- @steph_pics (She/her): I tell visual stories about California's wild places, plants, and animals and the people connected to them.
- Interested in bioacoustics
- @KPMcFarland (he/him/his): Conservation Biologist
- I am a biology undergraduate student who is interested in the field of wildlife conservation and has skills in field observation and identification
- I am actively interested in pollinator ecology, monitoring, and conservation and the use of cutting-edge artificial intelligence tools for my research. https://vargaszilay.hu/
Sticky Pis are scalable, smart sticky traps that use a high-frequency Raspberry Pi camera to automatically score when, which, and where insects were captured. Author: Quentin Geissmann, 2023
29 April 2025
Sharing this website that provides instructions on DIY hardware assembly, software setup, model training and deployment of the Insect Detect camera trap that can be used for automated insect monitoring. Authors:...
29 April 2025
Applications are open until April 15th
14 March 2025
An internship position of up to 6 months to work on the DiMON project, which aims to develop a non-lethal, compact device that, when coupled with traditional entomological traps, captures high-resolution, multi-view insect images...
25 February 2025
Article
NewtCAM is an underwater camera trap. Devices are being deployed worldwide as part of the CAMPHIBIAN project, thanks to the support of our kind early users. Here is an outcome from the UK.
24 February 2025
Osa Conservation is launching our inaugural cohort of the ‘Susan Wojcicki Research Fellowship’ for 2025, worth up to $15,000 per awardee (award value dependent on project length and number of awards given each year)....
10 February 2025
Catch up on the highlights from our two-day Mothbox v4.5 workshop at Georgia Tech, where participants gained hands-on experience building and testing the Mothbox.
13 November 2024
Paper by Natalie Klug et al
6 October 2024
The AMI team in Montreal is looking for a paid intern to do machine vision work!
3 July 2024
Come and do the first research into responsible AI for biodiversity monitoring, developing ways to ensure these AIs are safe, unbiased and accountable.
11 June 2024
€4,000 travel grants are available for researchers interested in insect monitoring using automated cameras and computer vision
6 June 2024
Description | Replies | Groups | Updated
---|---|---|---
We have spent the past 9 months taking the lessons learnt from the AMI system to build an automated moth monitoring system... | | Autonomous Camera Traps for Insects, Camera Traps | 1 day 8 hours ago
The mothbox can currently be attached to a solar panel super easy (just plug in a barrel jack up to 20v 80 watts) and it charges the talentcell battery. We also monitor the power... | | Autonomous Camera Traps for Insects | 2 weeks 6 days ago
Otherwise Australian Entomology Supplies has a few options available such as UV lights and black lights, such as this portable UV lamp. They can ship internationally. | +5 | Autonomous Camera Traps for Insects | 3 weeks 4 days ago
I'm excited to announce that Outreach Robotics has finalized the development of our new insect camera traps, now officially named ... | | Autonomous Camera Traps for Insects | 4 weeks 1 day ago
Hey Amber, that makes a lot of sense! And this effort by Insect Detect is amazing, thanks for sharing! | | AI for Conservation, Autonomous Camera Traps for Insects | 1 month ago
Hi all, I'm Vainqueur BULAMBO. I'm looking for a fully funded PhD opportunity in ecological data science or conservation... | | Acoustics, AI for Conservation, Autonomous Camera Traps for Insects | 1 month 2 weeks ago
@hikinghack that's a complete update! A lot of compassion regarding the budget situation, especially knowing how great @briannaljohns is. It's great to see a clear plan for the... | | Autonomous Camera Traps for Insects | 2 months 1 week ago
Thanks! Yes, we added electronics to power an external UV light only during periods when the camera is set to take pictures. | +23 | Autonomous Camera Traps for Insects, Camera Traps | 2 months 3 weeks ago
Me and @briannajohns field tested amphibious deployments for the Mothbox in cool jungle rivers! We designed the Mothbox to be super... | | Autonomous Camera Traps for Insects | 3 months ago
featuring @briannajohns @hikinghack @Hubertszcz | | Autonomous Camera Traps for Insects | 3 months 2 weeks ago
Sounds like so much fun. Would love to replicate this one out here in Cambodia | | Autonomous Camera Traps for Insects | 3 months 4 weeks ago
Wow! What a start of the year! Congratulations! | | Autonomous Camera Traps for Insects | 4 months 1 week ago
Update 2: Cheap Automated Mothbox
23 October 2023 8:32pm
25 October 2023 6:21pm
Thanks for these tips!
We are using an RPi 4 because the people I am building it for want images from Arducam's 64MP camera, so we have to use a board that supports it.
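For anyone replicating this setup, a minimal still-capture sketch using the picamera2 library (a common way to drive the Arducam 64MP module on a Raspberry Pi 4) is shown below; the resolution and output filename are placeholder assumptions, and the Arducam driver/tuning must already be installed.

```python
# Minimal still-capture sketch for a Raspberry Pi 4 with a 64MP Arducam module.
# Assumes the Arducam libcamera driver is installed and the camera is detected;
# the resolution and output filename are illustrative placeholders.
from datetime import datetime

from picamera2 import Picamera2

picam2 = Picamera2()
# The sensor's full resolution is about 9152x6944; a binned mode keeps
# capture times and file sizes manageable for all-night timelapses.
config = picam2.create_still_configuration(main={"size": (4624, 3472)})
picam2.configure(config)
picam2.start()

filename = datetime.now().strftime("mothbox_%Y%m%d_%H%M%S.jpg")
picam2.capture_file(filename)
print(f"Saved {filename}")

picam2.stop()
```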
27 October 2023 6:44am
Thanks a lot for this detailed update on your project! It looks great!
Entomological Research Specialist for Automated Insect Monitoring
25 October 2023 7:21pm
Cheap Automated Mothbox
31 August 2023 10:19pm
25 October 2023 5:04pm
I'm looking into writing a sketch for the esp32-cam that can detect pixel changes and take a photo, wish me luck.
25 October 2023 5:11pm
One question, does it even need motion detection? What about taking a photo every 5 seconds and sorting the photos afterwards?
25 October 2023 6:23pm
It depends on which scientists you talk to. I am in favor of just doing a timelapse and doing a post-processing sort afterwards. There's not much reason I can see for such motion fidelity. For the box I am making we are doing exactly that, though maybe a photo every minute or so.
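A rough illustration of that timelapse-then-sort workflow is sketched below: it compares each image with the previous one and copies frames with enough pixel change into an "activity" folder. OpenCV is assumed, and the folder names and change threshold are placeholders that would need tuning for a real trap.

```python
# Rough sketch of a post-processing sort for timelapse images:
# keep only frames that differ enough from the previous frame.
# Paths and thresholds are placeholder assumptions.
import shutil
from pathlib import Path

import cv2
import numpy as np

SRC = Path("timelapse")   # folder of captured frames
DST = Path("activity")    # frames with apparent insect activity
DST.mkdir(exist_ok=True)
PERCENT_CHANGED = 0.5     # percentage of pixels that must change

prev_gray = None
for img_path in sorted(SRC.glob("*.jpg")):
    frame = cv2.imread(str(img_path))
    gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (5, 5), 0)

    if prev_gray is not None:
        diff = cv2.absdiff(gray, prev_gray)
        changed = np.count_nonzero(diff > 25) / diff.size * 100
        if changed > PERCENT_CHANGED:
            shutil.copy(img_path, DST / img_path.name)

    prev_gray = gray
```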
Metadata standards for Automated Insect Camera Traps
24 November 2022 9:49am
2 December 2022 3:58pm
Yes. I think this is really the way to go!
6 July 2023 4:48am
Here is another metadata initiative to be aware of. OGC has been developing a standard for describing training datasets for AI/ML image recognition and labeling. The review phase is over and it will become a new standard in the next few weeks. We should consider its adoption when we develop our own training image collections.
24 October 2023 9:12am
For anyone interested: the GBIF guide Best Practices for Managing and Publishing Camera Trap Data is still open for review and feedback until next week. More info can be found in their news post.
Best,
Max
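To make the metadata discussion concrete, here is a rough, generic example of the kind of per-detection record such a standard might describe; the field names are illustrative only and are not taken from the OGC or GBIF documents mentioned above.

```python
# Illustrative per-detection metadata record for an automated insect camera trap.
# Field names are generic examples, NOT taken from the OGC or GBIF standards
# discussed in this thread; map them onto whichever standard is adopted.
import json

record = {
    "deployment_id": "mothtrap-uk-042",
    "device": {"model": "DIY RPi moth trap", "firmware": "0.3.1"},
    "location": {"latitude": 51.602, "longitude": -1.110, "datum": "WGS84"},
    "capture": {
        "timestamp": "2023-07-14T22:35:12Z",
        "image_file": "2023-07-14/img_223512.jpg",
        "illumination": "UV LED 365 nm",
    },
    "detection": {
        "taxon": "Noctua pronuba",
        "confidence": 0.91,
        "bbox_xywh": [412, 908, 156, 143],
        "model": "yolov5s-insects-v2",
        "verified_by_human": False,
    },
}

print(json.dumps(record, indent=2))
```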
Automated moth monitoring & you!
24 October 2023 8:52am
Catch up with The Variety Hour: October 2023
19 October 2023 11:59am
Q&A: UK NERC £3.6m AI (image) for Biodiversity Funding Call - ask your questions here
13 September 2023 4:10pm
21 September 2023 4:27pm
This is super cool! Me and @Hubertszcz and @briannajohns and several others are all working towards some big biodiversity monitoring projects for a large conservation project here in Panama. The conservation project is happening already, but Hubert starts on the ground work in January and I'm working on a V3 of our open source automated insect monitoring box to have ready for him by then.
I guess my main question would be whether this funding call is appropriate for / interested in this type of project, and what types of assistance are possible through this type of funding (researchers? design time? materials? laboratory/field construction?)
Wisconsin - Insect Light Trap
18 September 2023 5:34am
Best Material for Moth Lighting?
9 September 2023 1:46am
9 September 2023 5:22am
Plasticky substances like polyester can be slippery, so I imagine that's why cotton is most often used. White is good for color correction, while still reflecting light pretty well. When I've had the option I've chosen high thread count cotton sheets, so the background is smoothest and even the tiniest arthropods are on a flat background, not within contours of threads. The main problem with cotton is mildew and discoloration.
That being said, I haven't actually done proper tests with different materials. Maybe a little side project once standardized light traps are a thing?
Improving the generalization capability of YOLOv5 on remote sensed insect trap images with data augmentation
31 August 2023 8:29am
360 Camera for Marine Monitoring
25 July 2023 8:54am
28 July 2023 7:43pm
Hi Sol,
For my research on fish, I had to put together a low-cost camera that could record video for several weeks. Here is the design I came up with
At the time of the paper, I was able to record video for ~12 hours a day at 10 fps and for up to 14 days. With new SD cards now, it is pushed to 21 days. It costs about 600 USD if you build it yourself. If you don't want to make it yourself, there is a company selling it now, but it is much more expensive. The FOV is 110 degrees, so not the 360 that you need, but I think there are ways to make it work (e.g. with the servo motor).
Happy to chat if you decide to go this route and/or want to brainstorm ideas.
Cheers,
Xavier
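As a sanity check on numbers like these (~12 hours of video per day at 10 fps for two to three weeks), a quick back-of-envelope storage estimate helps when choosing SD cards; the bitrate below is an assumed figure, not taken from Xavier's design.

```python
# Back-of-envelope storage estimate for a multi-week video deployment.
# The bitrate is an assumed placeholder; real values depend on resolution,
# codec, and scene complexity.
HOURS_PER_DAY = 12
DAYS = 21
BITRATE_MBPS = 2.0        # assumed H.264 bitrate in megabits per second

seconds = HOURS_PER_DAY * 3600 * DAYS
total_gb = BITRATE_MBPS * seconds / 8 / 1000   # megabits -> gigabytes

print(f"~{total_gb:.0f} GB for {DAYS} days at {HOURS_PER_DAY} h/day")
# ~227 GB, so a 256-512 GB card is in the right range at this bitrate.
```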
3 August 2023 2:32am
Hi Xavier, this is fantastic! Thanks for sharing, the time frame is really impressive and really in line with what we're looking for. I'll send you a message.
Cheers,
Sol
3 August 2023 3:19am
I agree, this would be great for canopy work!
European Forum Alpbach
1 August 2023 5:23pm
Insect camera traps for phototactic insects and diurnal pollinating insects
20 March 2023 9:39am
25 May 2023 7:09am
OK, thanks!
10 July 2023 1:57pm
Hi @abra_ash , @MaximilianPink, @Sarita , @Lars_Holst_Hansen.
I'm looking to train a very compact (TinyML) model for flying pollinator detection on a static background. I hope a network small enough for microcontroller hardware will prove useful for measuring plant-pollinator interactions in the field.
Presently, I'm gathering a dataset for training using a basic motion-triggered video-capture program on a raspberry pi. This forms a very crude insect camera trap.
I'm wondering if anyone has any insights on how I might attract pollinators into my camera field of view? I've done some very elementary reading on bee optical vision and am currently trying the following:
Purple and yellow artificial flowers are placed on a green background, and the centres of the flowers are lightly painted with a UV (365 nm) coat.
A sugar paste is added to each flower.
The system is deployed in an inner-city garden (outside my flat), and I regularly see bees attending the flowers nearby.
Here's a picture of the field of view:

Does anyone have ideas for how I might maximise insect attraction? I'm particularly interested in what @abra_ash and @tom_august might have to say - are optical methods enough or do we need to add pheromone lures?
Thanks in advance!
Best,
Ross
20 July 2023 4:40pm
Hi Ross,
Where exactly did you put the UV paint? Was it on the petals or the actual middle of the flowers?
I would recommend switching from sugar paste to sugar water and maybe put a little hole in the centre for a nectary. Adding scent would make the flowers more attractive but trying to attract bees is difficult since they very obviously prefer real flowers to artificial ones. I would recommend getting the essential oil Linalool since it is a component of scented nectar and adding a small amount of it to the sugar water. Please let us know if the changes make any difference!
Kind Regards,
Abra
Welcome to the Autonomous camera traps for insects group!
1 August 2022 10:52am
9 May 2023 12:19pm
Hi Peter,
EcoAssist looks really cool! It's great that you combined every step for custom model training and deployment into one application. I will take a deeper look at it asap.
Regarding YOLOv5 insect detection models:
- Bjerge et al. (2023) released a dataset with annotated insects on complex background together with three YOLOv5 models at Zenodo.
- For a DIY camera trap for automated insect monitoring, we published a dataset with annotated insects on homogeneous background (flower platform) at Roboflow Universe and at Zenodo. The available models that are trained on this dataset are converted to .blob format for deployment on the Luxonis OAK cameras. If you are interested, I could train a YOLOv5 model with your preferred parameters and input size and send it to you in PyTorch format (and/or ONNX for CPU inference) to include in your application. Of course you can also use the dataset to train the model on your own.
Best,
Max
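As an illustration of how a YOLOv5 model trained on one of these datasets could be used downstream, the snippet below loads custom weights through torch.hub and runs detection on a single image; the weights filename, image path, and confidence threshold are placeholders.

```python
# Illustrative sketch: run a custom-trained YOLOv5 insect detector on one image.
# The weights file and image path are placeholder assumptions.
import torch

# Pulls the YOLOv5 code from the ultralytics/yolov5 repo and loads custom weights.
model = torch.hub.load("ultralytics/yolov5", "custom", path="insect_yolov5s.pt")
model.conf = 0.4                        # confidence threshold (tune as needed)

results = model("moth_trap_image.jpg")  # inference on a single image
results.print()                         # summary of detections
detections = results.pandas().xyxy[0]   # bounding boxes as a pandas DataFrame
print(detections[["name", "confidence", "xmin", "ymin", "xmax", "ymax"]])
```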
11 May 2023 4:59pm
Hi Max, thanks for your reply! I'll have a look and come back to you.
19 July 2023 11:08am
Greetings, everyone! I'm thrilled to join this wonderful community. I work as a postdoctoral researcher at MeBioS KU Leuven having recently completed my PhD on "Optical insect identification using Artificial Intelligence". While our lab primarily focuses on agricultural applications, we're also eager to explore biodiversity projects for insect population estimation, which provides crucial insights into our environment's overall health.
Our team has been developing imaging systems that leverage Raspberry Pi's, various camera models, and sticky traps to efficiently identify insects. My expertise lies in computer science and machine learning, and I specialize in building AI classification models for images and wingbeat signals. I've worked as a PhD researcher at a Neurophysiology lab in the past, as well as a Data Scientist at an applied AI company. You can find more about me by checking my website or my linkedin.
Recently, I've created a user-friendly web-app (Streamlit) which is hosted on AWS (FastAPI) that helps entomology experts annotate insect detections to improve our model's predictions. You can find some examples of this work here: [link1] and [link2]. And lastly, for anyone interested in tiling large images for object detection or segmentation purposes in a fast and efficient way, please check my open-source library "plakakia".
I'm truly excited to learn from and collaborate with fellow members of this forum, and I wish you all the best with your work!
Yannis Kalfas
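Tiling large trap images before detection, as mentioned above, can be illustrated in a few lines of NumPy; the sketch below is a generic example and does not use (or represent the API of) the plakakia library. Tile size and overlap are arbitrary choices.

```python
# Generic sketch of splitting a large image into overlapping tiles for detection.
# This is NOT the plakakia API, just an illustration of the idea.
import numpy as np


def tile_image(image: np.ndarray, tile: int = 640, overlap: int = 64):
    """Yield (x, y, patch) for overlapping square tiles covering the image."""
    step = tile - overlap
    h, w = image.shape[:2]
    for y in range(0, max(h - overlap, 1), step):
        for x in range(0, max(w - overlap, 1), step):
            patch = image[y:y + tile, x:x + tile]
            yield x, y, patch  # offsets let detections be mapped back later


# Example on a dummy 4000x6000 "trap image"
img = np.zeros((4000, 6000, 3), dtype=np.uint8)
tiles = list(tile_image(img))
print(f"{len(tiles)} tiles of up to 640x640 px")
```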
Tools for automating image analysis for biodiversity monitoring
12 July 2023 4:50pm
New collaboration network - Computer vision for insects
26 June 2023 2:36pm
The Wildlife Society Conference
19 June 2023 5:59am
Wildlife Monitoring Engineer
8 June 2023 4:54pm
Capture And Identify Flying Insects
30 December 2022 7:34pm
3 January 2023 1:55pm
This sounds like an interesting challenge. I think depth of focus and shutter speeds are going to be challenging. You'll need a fast shutter speed to be able to get sharp images of insects in flight. Are you interested in species ID or are you more interested in abundance? Having a backboard on the other side of the hotel would be a good idea to remove background clutter from your images.
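To make the shutter-speed point concrete, here is a rough worked motion-blur calculation; the flight speed, field of view, and sensor width are assumed example values, not measurements.

```python
# Rough motion-blur estimate for photographing a flying insect.
# All inputs are assumed example values for illustration only.
insect_speed_m_s = 3.0    # roughly a bee's flight speed
field_of_view_m = 0.30    # width of the scene covered by the frame
sensor_width_px = 4000    # image width in pixels

pixels_per_metre = sensor_width_px / field_of_view_m

for shutter_s in (1 / 1000, 1 / 8000):
    blur_px = insect_speed_m_s * shutter_s * pixels_per_metre
    print(f"1/{round(1 / shutter_s)} s -> ~{blur_px:.0f} px of blur")

# At 1/1000 s the insect smears across ~40 px; at 1/8000 s it drops to ~5 px,
# which is why very short exposures (and strong lighting or flash) are needed.
```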
19 May 2023 8:18am
Hi there,
I am also trying to get some visuals from wildlife cameras of insects visiting insect hotels. Was wondering if you had gained any further information on which cameras might be used for testing this?
Postdoc for image-based insect monitoring with computer vision and deep learning
9 May 2023 12:27pm
Hack a momentary on-off button
15 April 2023 9:21pm
21 April 2023 2:30pm
Hi @hikinghack ,
If I am understanding correctly, you want to be able to have the UV lights come on and go off at a certain time (?) and emulate the button push which actually switches them on and off? Is the momentary switch the little button at the top of the image you attached? Is it going to be controlled by a timer or a microcontroller at all? Sorry for all the questions, but I am not 100% clear on exactly what you are after. In the meantime, I've linked to a pretty decent tutorial on the process of hacking a momentary switch with a view to automating it with an Arduino microcontroller board, although it sort of assumes a bit of knowledge of electronics (e.g. MOSFETs/transistors) in certain places.
Alternatively, this tutorial is also good, with good explanations throughout:
If neither of these help, let me know and there might be some easier instructions I can put together.
All the best,
Rob
22 April 2023 3:04am
Hi Andrew,
If I understand you correctly, you want to turn on the LEDs when USB power is applied. The easiest way I can see to do this is to reroute the red wire to USBC VBUS, via an appropriate current limiting resistor. This bypasses all the electronics in your photo.
You could insert the current limiting resistor in the USB cable for better heat dissipation, or use a DC-DC constant current source instead of a resistor if power consumption is a concern.
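A quick worked example of that current-limiting resistor suggestion (the LED forward voltage and target current are assumed placeholders; use the actual UV LED datasheet values):

```python
# Worked example for sizing the current-limiting resistor on USB VBUS.
# Forward voltage and target current are assumptions; check the LED datasheet.
v_usb = 5.0     # USB VBUS (V)
v_led = 3.4     # assumed UV LED forward voltage (V)
i_led = 0.10    # assumed target LED current (A)

r_ohms = (v_usb - v_led) / i_led
p_resistor = (v_usb - v_led) * i_led    # power dissipated in the resistor

print(f"R = {r_ohms:.0f} ohm, dissipating {p_resistor * 1000:.0f} mW")
# -> 16 ohm at ~160 mW: pick the nearest standard value rated >= 0.25 W,
# or use a DC-DC constant-current driver as suggested if efficiency matters.
```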
22 April 2023 7:45am
Further to @htarold 's excellent suggestion, you can replace that entire PCB with a simple USB breakout board (e.g. USB micro attached below) by removing the red wire and attaching it to VCC on the breakout board, and removing and attaching the black wire to GND.
SparkFun microB USB Breakout
This simple board breaks out a micro-B USB connector's VCC, GND, ID, D- and D+ pins to a 0.1" pitch header. If you want to add the popular micro-B USB to your project, but don't want to put up with soldering the tiny connector, this is the board for you.
Camera traps, AI, and Ecology
14 April 2023 10:08am
7 June 2023 9:42am
1 August 2023 10:46am
1 September 2023 8:06am
Scaling up insect monitoring using computer vision
6 April 2023 7:06pm
Who's going to ESA in Portland this year?
31 March 2023 9:27am
4 April 2023 9:58am
That sounds great. I think you should encourage people to bring a bit of tech with them, can be a good conversation starter/ice-breaker
4 April 2023 4:04pm
Good idea! I've got a random assortment of different acoustic recorders I can bring along
5 April 2023 11:58pm
Indeed, I'll be there too! I like to meet new conservation friends with morning runs, so I will likely organize a couple of runs, maybe one right near the conference, and one somewhere in a nearby park where we can look for wildlife. The latter would probably be at an obscenely early hour, so we can drive somewhere, ideally see elk (there are elk within 25 minutes of Portland!), and still get back in time for the morning sessions.
Camera to follow wasps/attach on wasps
9 March 2023 5:16am
11 March 2023 2:44am
Hi @Lars_Holst_Hansen @tom_august
The link to the video is amazing. Thank you for it.
The wasps that I am working on are solitary. So, basically it is just this one female that builds the entire nest. Like what you (@tom_august) mentioned, the best option would be to keep a running camera at the nest to record the whole process of nest building. Having one placed inside will be difficult because even if we do work out a way to have lighting inside the nest, the light might be detrimental to the developing larva inside. Hence, it is likely not to be of any benefit.
I am totally smitten by the idea of having a sensor on the wasp body to track where it goes! We could get to know how far it travels to bring the prey and also to collect soil.
14 March 2023 1:30pm
@ShwetaMukundan I just saw this thesis published on tracking bees. Maybe you could use the same method?
Optimising and applying RFID technology to monitor individual honey bee behaviour in agricultural field settings
The western honey bee, Apis mellifera L. is critical for the pollination of numerous native plant species and agricultural crops globally. Although providing a crucial service; honey bee health is currently in decline. One of the most effective methodologies for understanding this decline, is monitoring variation in individual bee behaviour. This thesis presents the development of an optimised radio frequency identification (RFID) system, which is designed for monitoring full-strength colonies in field locations. The potential of this system was subsequently demonstrated through the investigation of bee longevity and behaviour in two important horticultural crops: sweet cherry and hybrid carrot seed. Furthermore, the versatility of this system was demonstrated by using RFID data to explore the relationship between various weather parameters and foraging behaviour.RFID technology has been widely utilised for researching honey bees; however previous studies have predominantly been restricted to small nucleus colonies in semi-field conditions. Application has also been limited by poor detection accuracy and lack of analytical methodologies. To address these issues, a solar powered RFID system was developed and extensively tested for use on full-strength commercial bee colonies (Chapter 2). This included the design of a unique 'maze' entrance that can be retrofitted to any Langstroth hive to enable the accurate detection of bees equipped with RFID tags. The system consists of readily available componentry, which is cost effective and can be purchased off the shelf. Recommendations also include the comparisons of different commercially available RFID tags utilised in previous studies. All facets of RFID design were considered to develop one of the most accurate systems presented to date, with tag detection rates as high as 90.5% for passing bees. Extensive analytical methodology is provided to confirm system accuracy and to facilitate biological understanding. This enables the quantification of key parameters linked to honey bee ontogeny and foraging performance such as: number of flights, flight duration, age at onset of foraging and survival.Modern agricultural production is becoming increasingly reliant on protected cropping systems such as greenhouses, bird netting and polythene tunnels. Although these systems represent distinct advantages for plant growth, little is known about the potential impacts to insect pollinators. To address this, the newly developed RFID system was utilised to investigate the impact of bird netting and two polythene rain cover systems on the behaviour of honey bees in sweet cherry (Prunus avium L.) crops (Chapter 3). Across two seasons, six commercial bee colonies were equipped with RFID systems and over 1300 tagged bees. These colonies were positioned in a neighbouring open field (control), under permanent bird netting or within polythene systems (semi-permanent VOEN system in 2019 and an automated retractable Cravo system in 2020). Individual bee behaviour was subsequently monitored throughout the pollination period to determine the impact of covers on orientation, foraging behaviour and survival. Polythene rain covers (VOEN and Cravo) were found to increase the time required for bees to orientate by 35-45%. However, once orientated, the bees placed under both netted and polythene coverings conducted up to 155% more foraging trips and spent longer outside of the colony. 
Crop coverings were observed to have limited impact on age of onset of foraging or survival, with all groups transitioning to foraging at an optimal age (mean 15.7-24.1 days) and meeting or exceeding life expectancy (median 19.6-27.5 days). This represents the first study of its kind utilising RFID systems on full size colonies during commercial pollination.Modern agricultural production is also reliant on hybrid vegetable varieties due to key advantages for quality, yield and disease resistance. Despite the agronomic advantages, production of hybrid vegetable seed represents a challenge for pollination due to unattractive floral resources, isolated growing locations and reliance on insecticide applications during crop pollination. The newly developed RFID system was again utilised to investigate the impact of hybrid carrot (Daucus carota L.) seed production on the behaviour of honey bees (Chapter 4). Across two seasons, instrumented colonies were placed in either on-crop or off-crop groups with over 900 bees tagged per season. Differences in bee foraging age and behaviour, over the commercial pollination period, were again compared between groups using RFID data and the collection of pollen pellets and nectar foragers. Hybrid carrot crops were found to have a significant impact on honey bee foraging behaviour, with bees on the crop conducting less frequent but longer foraging trips. Substantial variation in behaviour was associated with the poor attractiveness of hybrid carrot floral resources. Although impacting foraging behaviour, there was little evidence to suggest that the hybrid seed crops were detrimental to bee health, with carrot bees orientating successfully, transitioning to foraging 4-5 days later, surviving longer (median 24.3-30.6 days) and collecting 3x more pollen in general, as compared to off-crop groups. However, only 2% of this pollen originated from carrot. This represented the first direct investigation into individual bee behaviour within hybrid cropping systems.A significant factor impacting bee behaviour and pollination in all environments is weather. The conditions preventing bees from foraging are well understood; however, it is not known how bees adapt their behaviour in response to changes in weather during a foraging trip. In Chapter 5, RFID trip data (1728 trips from 105 bees) in combination with high resolution weather recordings, were used to successfully model the influence of a variety of weather parameters on foraging flight duration. A series of nested linear mixed models revealed that temperature, humidity, solar radiation, barometric pressure and windspeed impacted trip duration. The additional factors of bee age, colony and rainfall were also considered in all models. Barometric pressure was found to be the most effective individual weather variable for modelling trip duration, accounting for 61.2% of variation. However, the most effective model overall, incorporated the minimum, maximum and mean of all-weather variables, accounting for 72.4% of variation in trip duration. This represents a highly complex relationship, with bees perceiving changes in weather and modifying behaviour accordingly.In the final chapter, a synopsis of the experimental findings is provided including recommendations for future research. This thesis presents critical advancements in the methodologies for studying variation in individual bee behaviour. 
The potential of the newly developed RFID system has been demonstrated in key agricultural environments, generating data with both practical and biological relevance. It is hoped that this can contribute to optimising both bee health and pollination in the future.
30 March 2023 1:14pm
Hi @ShwetaMukundan,
this could be interesting for you:
https://www.science.org/doi/10.1126/scirobotics.abb0839
https://www.youtube.com/watch?v=VwiHf2T9bLU
https://edition.cnn.com/2020/10/13/us/murder-hornet-track-washington-trnd/index.html
All from this working group:
https://homes.cs.washington.edu/~gshyam/
Exploring storage options for mass data collection
22 March 2023 3:20am
22 March 2023 7:36pm
Hi Adam!
I mostly live within the ecoacoustics space so I'll just speak on the hydrophone part of your request; Arbimon is a free web/cloud-based platform with unlimited storage for audio files. We've got an uploader app as well for mass-uploading lots of files. There's also a bunch of spectrogram visualization/annotation tools and analysis workflows available. It's AWS running under the hood.
I have some experience working directly with AWS & Microsoft Azure, and I've found personally that AWS was more user-friendly and intuitive for the (fairly simplistic) kinds of tasks I've done.
27 March 2023 5:23am
RECORDING Workshop on India-focused projects on insect monitoring
23 March 2023 4:24pm
Catch up with The Variety Hour: March 2023
23 March 2023 11:09am
Monitoring airborne biomass
14 March 2023 10:30am
14 March 2023 1:34pm
Looks like you want to have a read of this thread:
20 March 2023 2:44pm
Our project, in short, is setting up a sensor network for monitoring airborne biomass (mainly insects, birds, and bats) in near real time, and developing a forecast model to be used for mitigation with respect to various types of human-wildlife conflict (e.g. wind power, pesticide application, aviation). Our expertise is mainly in radar monitoring, but we aim to add insect camera information to be merged with the quantitative biomass measurements from radar.
24 October 2023 9:05am
Hi Andrew,
thanks for sharing your development process so openly, that's really cool and boosts creative thinking also for the readers! :)
Regarding a solution for Raspberry Pi power management: we are using the PiJuice Zero pHAT in combination with a Raspberry Pi Zero 2 W in our insect camera trap. There are also other versions available, e.g. for RPi 4 (more info: PiJuice GitHub). From my experience the PiJuice works mostly great and is super easy to install and set up. Downsides are the price and the lack of further software updates/development. It would be really interesting if you could compare one of their HATs to the products from Waveshare. Another possible solution would be a product from UUGear. I have the Witty Pi 4 L3V7 lying around, but couldn't really test and compare it to the PiJuice HAT yet.
Is there a reason why you are using the Raspberry Pi 4? From what I understand about your use case, the RPi Zero 2 W or even RPi Zero should give enough computing power and require a lot less power. Also they are smaller and would be easier to integrate in your box (and generate less heat).
I'm excited for the next updates to see in which direction you will be moving forward with your Mothbox!
Best,
Max
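As a concrete illustration of the PiJuice-based scheduling described above, here is a rough sketch using the pijuice Python package to set a nightly RTC wakeup and a delayed power-off; the alarm time and delays are placeholder choices, and the exact calls should be checked against the PiJuice documentation for your HAT.

```python
# Rough sketch: wake a Raspberry Pi with a PiJuice HAT each evening so the
# moth trap only runs while photographing. Verify the calls against the
# PiJuice docs; the wakeup time and shutdown delay are placeholder choices.
from pijuice import PiJuice

pj = PiJuice(1, 0x14)  # I2C bus 1, default PiJuice address

# Wake the Pi every day at 19:30 (the PiJuice RTC is typically kept in UTC).
pj.rtcAlarm.SetAlarm({"second": 0, "minute": 30, "hour": 19, "day": "EVERY_DAY"})
pj.rtcAlarm.SetWakeupEnabled(True)

# ...run the capture script here, then request power removal 60 seconds
# after this call so the operating system has time to halt cleanly.
pj.power.SetPowerOff(60)
```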