Camera traps have been a key part of the conservation toolkit for decades. Remotely triggered video or still cameras allow researchers and managers to monitor cryptic species, survey populations, and support enforcement responses by documenting illegal activities. Increasingly, machine learning is being used to automate the processing of the data that camera traps generate.
A recently published study found that, despite being well-established and widely used tools in conservation, camera traps have seen their development plateau since the modern model emerged in the mid-2000s, leaving users struggling with many of the same issues they faced a decade ago. That manufacturer ratings have not improved over time, despite technological advances, demonstrates the need for a new generation of innovative conservation camera traps. Join this group to explore existing efforts, established needs, and what next-generation camera traps might look like, including the integration of AI for data processing through initiatives like Wildlife Insights and Wild Me.
Group Highlights:
Our past Tech Tutors seasons featured multiple episodes for experienced and new camera trappers alike. How Do I Repair My Camera Traps? brought together WILDLABS members Laure Joanny, Alistair Stewart, and Rob Appleby and shared many troubleshooting and DIY resources for common issues.
For camera trap users looking to incorporate machine learning into their data analysis process, Sara Beery's How do I get started using machine learning for my camera traps? is an excellent resource on the user-friendly MegaDetector tool (see the short sketch below).
And for those who are new to camera trapping, Marcella Kelly's How do I choose the right camera trap(s) based on interests, goals, and species? will help you make important decisions based on factors like species, environment, power, durability, and more.
Finally, for an in-depth conversation on camera trap hardware and software, check out the Camera Traps Virtual Meetup featuring Sara Beery, Roland Kays, and Sam Seccombe.
And while you're here, be sure to stop by the camera trap community's collaborative troubleshooting data bank, where we're compiling common problems with the goal of creating a consistent place to exchange tips and tricks!
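For readers curious what MegaDetector hands back once Sara's tutorial has you running it, here is a minimal, hedged sketch of post-processing its standard batch-output JSON; the file name, confidence threshold, and category handling below are illustrative assumptions, not a WILDLABS or MegaDetector recommendation.

```python
import json

# Sketch only: split a MegaDetector batch-output file into images with likely
# animal detections and images that are probably empty. Assumes the standard
# output format ("images" list, "detection_categories" map, per-detection
# "conf" scores); "megadetector_output.json" and the threshold are placeholders.
THRESHOLD = 0.2  # MegaDetector confidences are deliberately conservative; tune per project

with open("megadetector_output.json") as f:
    results = json.load(f)

# Category IDs are strings; "animal" is category "1" in the standard map.
animal_id = next(k for k, v in results["detection_categories"].items() if v == "animal")

animal_images, empty_images = [], []
for image in results["images"]:
    detections = image.get("detections") or []
    if any(d["category"] == animal_id and d["conf"] >= THRESHOLD for d in detections):
        animal_images.append(image["file"])
    else:
        empty_images.append(image["file"])

print(f"{len(animal_images)} images with likely animals; {len(empty_images)} probably empty")
```

A filtered list like this is typically just the first pass; flagged images still need human review before species-level identification.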
Header photo: Stephanie O'Donnell
- @lucytallents
- | She/Her
VerdantLearn
Sociable e-learning for conservation capacity building

- 4 Resources
- 6 Discussions
- 4 Groups
- @CitlalliMJ6
- | she/her
Conservation biologist, executive director at the Tesoro Escondido Reserve Foundation in the Ecuadorian Chocó, working on participatory action research, environmental education, primate and amphibian conservation
- 0 Resources
- 0 Discussions
- 7 Groups
St. Lawrence University
Professor of Biology at St. Lawrence University
- 0 Resources
- 2 Discussions
- 13 Groups
- @MorenaRodriguez
- | She/Her
- 0 Resources
- 1 Discussions
- 5 Groups
- @diego_lizcano
- | He/Him
Wildlife biologist interested in biodiversity monitoring and the conservation of mammals. Passionate photographer.
- 0 Resources
- 0 Discussions
- 7 Groups
wildlife crime WWF-NL
- 0 Resources
- 2 Discussions
- 6 Groups


- @jenlaw
- | She/Her
Biodiversity scientist specialising in the study of tropical ecosystems and their biodiversity using multiple forms of technology, including acoustics, images and robotics.
- 0 Resources
- 0 Discussions
- 14 Groups
World Wide Fund for Nature/ World Wildlife Fund (WWF)
- 0 Resources
- 7 Discussions
- 12 Groups
- @ldbraunholtz
- | she/her
(Tropical) forest ecologist with interests in biodiversity, camera traps, community-led conservation and more. Passionate about inclusive nature for all.
- 0 Resources
- 0 Discussions
- 10 Groups
- @Riley
- | she/they
I'm a Data Scientist at Western EcoSystems Technology. I am interested in AI processing and statistical modeling of acoustic data and camera trap and drone imagery.
- 2 Resources
- 2 Discussions
- 5 Groups
WILDLABS is partnering with FLIR to give away a FLIR ONE Edge Pro to 5 community members.
26 June 2024
Article
I’m Eliminatha Ambross, an enthusiastic conservationist who recently participated in the WICT 2023 program in Tanzania. My last update detailed my experiences at the Wildlife Conservation and Information Technology (...
14 June 2024
Come and do the first research into responsible AI for biodiversity monitoring, developing ways to ensure these AIs are safe, unbiased and accountable.
11 June 2024
€4,000 travel grants are available for researchers interested in insect monitoring using automated cameras and computer vision
6 June 2024
WildLabs will soon launch a 'Funding and Finance' group. What would be your wish list for such a group? Would you be interested in co-managing or otherwise helping out?
5 June 2024
Do you have photos and videos of your conservation tech work? We want to include them in a conservation technology showcase video
17 May 2024
€2,000 travel grants are available for researchers interested in insect monitoring using automated cameras and computer vision
3 May 2024
Article
Read in detail about how to use The Inventory, our new living directory of conservation technology tools, organisations, and R&D projects.
1 May 2024
Article
The Inventory is your one-stop shop for conservation technology tools, organisations, and R&D projects. Start contributing to it now!
1 May 2024
Technology to End the Sixth Mass Extinction. Salary: $132k - $160k; Location: Seattle WA; 7+ years of experience in hardware product development and manufacturing; View post for full job description
1 May 2024
The incumbent will develop models and metrics that can be used to shape conservation policy using multiple data sources including camera traps, movement data and citizen science concerning the diversity and...
23 April 2024
Careers
The Smithsonian National Zoo & Conservation Biology Institute is seeking a Program Manager to help coordinate multiple organizations in an effort to integrate movement data & camera trap data with global...
22 April 2024
| Description | Replies | Groups | Updated |
|---|---|---|---|
| Hi everyone, We all know camera traps are great at telling us what species showed up, where and when—and sometimes even ... | | Camera Traps, Data management and processing tools | 12 hours 2 minutes ago |
| We have spent the past 9 months taking the lessons learnt from the AMI system to build an automated moth monitoring system... | | Autonomous Camera Traps for Insects, Camera Traps | 1 day 8 hours ago |
| Lively and informative discussion, I would very much like to contribute if there is some active development work with regards to this. I have recent experience with using... | +31 | Data management and processing tools, Camera Traps, Software Development | 5 days 23 hours ago |
| | | Latin America Community, Acoustics, AI for Conservation, Camera Traps, Drones, Early Career | 2 weeks 6 days ago |
| @LukeD, I am looping in @Kamalama997 from the TRAPPER team who is working on porting MegaDetector and other models to RPi with the AI HAT+. Kamil will have more specific questions. | | AI for Conservation, Camera Traps | 3 weeks ago |
| Hi Ștefan! In my current case, I am trying to detect and count Arctic fox pups. Unfortunately, Arctic fox does not seem to be included in the training data of SpeciesNet but... | +16 | AI for Conservation, Camera Traps | 3 weeks 1 day ago |
| Interesting. Thanks for the explanation. Nice to hear your passion showing through. | | AI for Conservation, Camera Traps, Data management and processing tools, Open Source Solutions, Software Development | 3 weeks 3 days ago |
| Weeks with continuous inference would require a pretty big battery. I expect you would need some kind of customisation and maybe quite a bit of compromise to last weeks and on a... | | Camera Traps | 3 weeks 3 days ago |
| 📸 Do you use camera traps in your work? Take part in our survey! Hi everyone! I’m currently a final-year engineering... | | Camera Traps, AI for Conservation, Data management and processing tools, Open Source Solutions, Software Development | 3 weeks 4 days ago |
| That's great! | | Camera Traps | 1 month ago |
| True, the US ecosystem is a challenging space right now, for basically all sectors. We should not let the US chaos prevent us from engaging with opportunities in other... | | AI for Conservation, Camera Traps, Connectivity, Drones, Emerging Tech, Ethics of Conservation Tech, Marine Conservation, Sensors | 1 month 1 week ago |
| Yes, I know about this big limitation. As far as I know they are working to increase the coverage available for this solution. For trusted developers, there are more regions... | | Connectivity, Camera Traps | 1 month 1 week ago |
Camera trapping workshop, London
16 April 2018 12:00am
[ARCHIVED]: Camera-trapping data consultancy
13 April 2018 5:00pm
The Plant-Powered Camera Trap Challenge
3 April 2018 12:00am
#Tech4Wildlife Photo Challenge 2018: Our Top 10
3 March 2018 12:00am
Research: A rigorous, realistic and reproducible test of camera trap performance
7 November 2016 1:49pm
15 November 2016 6:23am
Thank you Julia
Camera testing is certainly not Toffee's favorite activity - he would much rather be sniffing for scent marks!
The adverse effects of high temperatures on PIR sensors are well established, and they are a major problem anywhere that air temperatures get above about 30°C. There is also a problem with cameras staying hotter than their surroundings for a few hours after sunset. I have also noticed that the infrared illuminator on the Reconyx actually heats up the camera.
Birds might be trickier to train than dogs, but you only need a reliable way to lure them to particular points within the field of view.
Certainly there are more factors to consider than detection capability alone (though arguably that is the most important - better a fuzzy picture than none at all) and reliability is one of them. Bushnell Trophycams, for example, are notorious for losing their date settings (and this morning the one I am testing had done just that). All sorts of equipment gets put through accelerated durability tests, and there is no reason why camera traps should not be similarly tested.
Given the huge projects that are built around camera trapping, and the scale of the conservation management decisions that are based on camera trap data, it is a real problem that their performance is not tested and validated as fit for purpose.
Peter
7 February 2018 8:23pm
The sequence of images in the attached brief report shows why camera traps must be tested with real animal targets, and not with humans. The camera easily detects a human, but misses multiple images of the target dog.
Technology Empowered Conservation Lecture Series
18 January 2018 12:00am
Instant Detect 2.0: A Connected Future for Conservation
17 January 2018 12:00am
Resource: Camelot - new camera trap software
1 November 2016 11:16am
18 December 2017 12:16pm
Hi Egil, I think Camelot has a reasonable story around most of these things. Here's how I see it:
I set up a survey and I add the information about all the camera trapping stations (including camera IDs).
I import the photos. The database reads the date, time and camera ID from the photos, so photos are linked to a camera trap location.
Camelot has two modes for importing data: a bulk import, and a per-session import. It sounds like you have many images up-front, and so bulk import may be the way to go. In this case Camelot can create the camera trap stations based on the location of the images, and their metadata.
http://camelot-project.readthedocs.io/en/latest/bulkimport.html
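To make that bulk-import path concrete, here is a rough sketch (my own illustration, not part of Camelot) of scanning a folder of downloaded images, pulling the capture date/time from EXIF, and recording the station folder each image came from; the folder layout, EXIF tag, and output columns are assumptions to adapt to your own setup.

```python
import csv
from pathlib import Path
from PIL import Image

# Illustrative pre-processing sketch: one subfolder per camera trap station,
# e.g. downloads/station_01/IMG_0001.JPG. Tag 306 is the EXIF "DateTime" field
# ("YYYY:MM:DD HH:MM:SS"); DateTimeOriginal may be preferable where present.
ROOT = Path("downloads")
rows = []
for path in sorted(ROOT.rglob("*.JPG")):
    exif = Image.open(path).getexif()
    rows.append({
        "station": path.parent.name,   # folder name stands in for the station ID
        "file": str(path),
        "datetime": exif.get(306, ""),
    })

with open("image_scan.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["station", "file", "datetime"])
    writer.writeheader()
    writer.writerows(rows)
```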
Then I do a quick first pass where I assign a species (tiger, leopard, etc) to each photo.
Yes, I expect you'll find the library UI suitable for this.
In a second round I want to identify the tigers. So I get all photos labelled with 'tiger'. Obviously the first one is a new animal, thus I want a 'button' which allows me to add a new animal. Then in following pictures there is a drop-down menu with identified tigers. The order of the pictures presented is by camera trap location, date, time and then the next nearest camera trap location.
The sighting added in the first pass can be edited to add in the individual, and new individuals can be added straight from the dropdown menu. Camelot does not have the ability to define a custom ordering of images: the ordering is always by camera trap station, then by capture time (broadly; it's a bit more nuanced than this in reality). However, Camelot does have the notion of "reference images": images flagged in this way can be displayed in another window (which can be handy where multiple monitors are available), based on the currently selected sighting field data.
If things go as planned I'll likely be running into 100,000's of pictures, so I need to be able to do the first round pretty quickly. Then there will be 4-5 species I need to identify at the level of the individual. There will probably be up to about 200 individuals of a species, but never more than about 20 at each camera trapping station, and the vast majority of individuals won't show up at more than 4-5 camera trapping stations either.
Camelot does support this sort of raw scale (x00,000 images) and has a reasonably efficient UI for identification. The information here may be relevant, depending on how many multiples of 100,000 it turns out to be.
However, Camelot does not currently offer the ability to limit individuals in a field depending on the selected species (the same individuals are available in the dropdown regardless of the selected species). Potentially this limitation could be worked around by having a dropdown available for each species, which will at least highlight in an export where a data-entry error may have arisen (e.g., an individual chosen in the "individuals for species X" field while the selected species is Y).
At 200 individuals the workflow for reference images could start to break down too, depending on the level of familiarity with the individuals (i.e., repeatedly searching through all reference-quality images of a species, or images for a couple of dozen individuals, to make an identification could be onerous).
I've looked around and I think Camelot is closest to what I want, but I wonder if I could get it really close to what I want. I have quite a bit of experience with SQL and building access databases, but I thought, with the prevalence of camera traps these days, that there would be more packages out there.
I agree Camelot really only goes part-way to meeting the requirements. It should be workable to use Camelot for the purpose, though support for individuals is relatively new and really hasn't been optimised for this sort of scale. It seems like the two places where Camelot is most lacking for this workflow are (and correct me if I'm wrong):
- lack of some more flexible ordering system for reference images against the selected image(s)
- lack of ability to filter options for a single "individuals" field based on the selected species
It'll likely be some time before these features are available in Camelot, but I'll add them to the development backlog.
-Chris
18 December 2017 4:21pm
Thanks for the reply Chris!
When you state:
"Camelot has two modes for importing data: a bulk import, and a per-session import. It sounds like you have many images up-front, and so bulk import may be the way to go. In this case Camelot can create the camera trap stations based on the location of the images, and their metadata."
When you mention 'location', do you mean the location the images are imported from, and not the GPS location from the EXIF data if it's available? In my case no GPS data would be available. But if I've downloaded the images to a hard drive in the field, and then back in the office import them into Camelot, can I tell Camelot that the pictures from folder X belong to camera station 1?
You're right about the two points for optimizing. If photos can be ordered by station, date, time, and reference images ordered by 'distance to station', it greatly reduces the number of reference images to look through as the vast majority of individuals will be recorded at only a few stations, effectively trimming down the number of individuals to look through from 100's to a few dozen.
Cascading the drop-down lists species > individuals (or even area > species > individuals, species > area > individuals, or in some cases species > group > individuals) would be beneficial too in the process of identifying individuals. In a SQL database this isn't hard to code into the fields on a form, but it might not be that easy in Camelot, as you would have to choose whether the field you add depends on another field on the form.
19 December 2017 8:10am
Hi Egil,
When you mention 'location', do you mean the location the images are imported from, and not the GPS location from the EXIF data if it's available?
Yes, that's right -- I should have said the 'directory of those images'.
But if I've downloaded the images to a hard drive in the field, and then back in the office import them into Camelot, can I tell Camelot that the pictures from folder X belong to camera station 1?
Yes, that's right. Images can be dragged & dropped into existing camera trap stations, which have GPS coordinates associated. For the bulk import case, the data scanned from the image would need to be joined with GPS data to produce the final CSV for upload.
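A hedged sketch of that join, continuing the per-image scan from the earlier example: merge the scanned rows with a separately maintained table of station coordinates to build one CSV for bulk import. Column and file names are illustrative only; check Camelot's bulk-import documentation for the exact fields it expects.

```python
import pandas as pd

# Sketch only: combine per-image rows (station, file, datetime) with station
# GPS coordinates. Both CSVs and their column names are assumptions.
images = pd.read_csv("image_scan.csv")
stations = pd.read_csv("station_coordinates.csv")   # station, latitude, longitude

merged = images.merge(stations, on="station", how="left")
unmatched = merged["latitude"].isna().sum()
if unmatched:
    print(f"Warning: {unmatched} images have no matching station coordinates")

merged.to_csv("camelot_bulk_import.csv", index=False)
```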
And thanks for the confirmation on how you'd expect that functionality should behave.
-Chris
FIT Cheetahs
4 December 2017 12:00am
HWC Tech Challenge Update: Meet the Judges
20 October 2017 12:00am
Research: Trail Camera Comparison Testing (results)
3 October 2017 12:53pm
19 October 2017 11:14am
Best Practices: Camera trap survey guide released
12 October 2017 12:17pm
Recommendations Needed: Best camera traps for Central African rainforests?
10 October 2017 4:13pm
12 October 2017 11:34am
Hi John,
They will want a camera with good tolerance to humidity + precipitation (so a camera with a proper O-ring seal, and pack it with regularly-dried silica gel). Elephants will likely have a go at the cameras - so they will want to think about protecting their cameras with heavy-duty steel security boxes (perhaps with welded on spikes, which has worked in Thailand) and minimising smell left on and around their cameras during setup (e.g. use gloves, don't smoke or leave food).
Apart from that, the usual considerations apply: good detection circuitry (less of a problem if targeting elephants, as they are massive), and good battery life always helps.
Laila Bahaa-el-din et al. used Panthera and Scoutguard cameras in Gabon; the Goualougo Triangle Ape Project in Rep. Congo uses Reconyx, I think; Julia Gessner et al. used Reconyx in Rep. Congo and Cameroon. The latter reported that, of 47 cameras, 4 were taken out by elephants and one by a leopard!
Ollie
Download New Conservation Tech Guidelines: Camera Traps, Acoustics and LiDAR
11 October 2017 12:00am
How to lose a BRUV in 10 days
26 September 2017 12:00am
Survey: Camera Trap Survey with WWF-UK
20 September 2017 9:48pm
Recommendations Needed: GSM Camera Traps
24 May 2017 1:42pm
10 July 2017 1:24pm
Hi Chloe,
I've had promising results with the Scoutguard MG983G-30M, and I believe a 4G version has just been released. The 3G version has some useful features, like two-way communication to change settings/get images and an audio call feature (which I haven't used). I'd be very pleased to hear how your tests go.
Cheers,
Rob
10 July 2017 4:04pm
Thanks everyone,
@TopBloke @Kai - I've not come across the Ltl Acorn ones - will take a look. So does that mean if it was purchased in the UK it would be locked to a UK SIM? That might be problematic, as we tend to purchase here to test and set up before sending out when staff are heading back to the country of deployment.
@Rob+Appleby - That's good to know about the Scoutguard MG983G-30M too, thanks.
A supplier is going to send me a new model to test by the same people who make Scoutguard, but he hasn't said what it is. Maybe it's the 4G one. Whatever it is, if it is any good I will feed back on here.
Chloe
10 July 2017 4:42pm
@Chloe+Aust - please google "Ltl Acorn UK" and you will find a lot of information. No, they are not locked; that is just the company's sales arrangement. You can use them anywhere, but, for example, if you want to use one in Kenya you need a local SIM card, such as Safaricom. When you check camera trap photos from China, you will find most of them were taken by Ltl Acorn cameras: panda, tiger, snow leopard, clouded leopard, leopard, golden cat...
Scoutguard is also a good Chinese camera trap brand - the oldest company, and the only one with a Spanish menu - but they focus on the US hunting market rather than wildlife research use.
LOREDA is also good; it is the only brand in China focused on camera traps for wildlife research, though it might not have dealers or stock in the UK.
From the Field: Developing a new camera trap data management tool
7 July 2017 12:00am
Article: Google's cloud vision for automated identification of camera trap photos
12 April 2016 1:04pm
7 August 2016 11:25pm
An update to the automated species identification debate:
A paper has recently come out which used deep learning ("very deep convolutional networks") and managed 89% accuracy for the Snapshot Serengeti Zooniverse dataset, IF the image was first manually cropped around the animal. Seriously, who has time to do that? If the image remained uncropped they managed a woeful 35% accuracy.
Perhaps we have a long wait ahead of us for this to become a practical reality?
22 June 2017 7:22am
Note that the paper isn't actually a journal article and hasn't gone through peer review; it's a preprint.
24 June 2017 12:50am
Fair point, it isn't a peer-reviewed article as yet. I had a poke around out of curiosity and wasn't able to track down a fully published paper yet (though it's been a while since this preprint was released...). On reading, were there any issues that stood out to you that others should be aware of? And, as Ollie pointed out, the research wasn't achieving good accuracy on uncropped images - I wonder if, given the time since the original publication, the results may have actually improved by now? It seems (from my interested but unqualified observer perspective) that the field is moving forward in leaps and bounds, such that a paper pre-published in March 2016 may very well be out of date by this stage...
Resources: Panthera Camera Trap information
5 April 2017 8:47pm
26 May 2017 12:39am
My understanding is that the Panthera cameras are only available to people or groups in Panthera's network. I tried to get some a few years back and had no luck, even for a project that had received Panthera funding in the past.
Regarding the poacher cam, I know @ColbyLoucks at WWF has developed a system using infrared thermal cameras that worked in the field trials.
Putting on my black thinking hat, the design as shown of the poacher cam will not be effective long-term once poachers know to look for it. Eric Dinerstein had worked on a project where such cameras would be concealed in a vine or some other organic-looking encasement.
26 May 2017 12:04pm
Black thinking hat! Interesting and honest.
Machine learning, meet the ocean
10 May 2017 12:00am
From the Field: Dr Raman Sukumar and Technology Developments Needed to Conserve Elephants
5 April 2017 12:00am
Camera Trap Pictures Wanted
6 March 2017 11:38am
31 March 2017 10:18am
Paul,
I can get some when I do my next deployment (sometime next week at this stage).
Colin
31 March 2017 10:21am
Nice, thanks Colin! Are you deploying cameras as part of the phascogale project?
Paul - are you quite sure you don't need some camera trap photos of some gorgeous Australian mammals for your guidelines as well?
31 March 2017 10:28am
Steph,
Not quite. I'm coordinating a fauna monitoring project for a Landcare network and, of course, we hope to get phascogales. I've just retrieved a batch of cameras from nest box monitoring duty for the phascogale project and their next role will be to investigate a possible sighting of a Squirrel Glider in the area.
Colin
How to stop the thieves when all we want to capture is wildlife in action
23 March 2017 12:00am
Recommendations Needed: Real-time enabled camera traps
17 August 2016 2:56pm
9 March 2017 7:32am
Hi Kai,
thanks for the links. Interesting. I requested them for pricing details.
The Panthera Anti-Poaching Cam makes use of existing networks and costs about 350 USD. That is not very expensive, is it?
Cheers,
Jan Kees
9 March 2017 9:51am
Hi Jankees
Thanks for your reply,
If there is an existing network - for example Royal KPN N.V. providing mobile phone service - there will be no problem; 3G/4G camera traps will work well. Each camera is 250 USD at the moment.
Nowadays hundreds of Chinese tech teams are working on LoRa and NB-IoT solutions as well. All of them want to be successful like HUAWEI, ZTE and DJI. Nice people, good teams, working 12 hours each day like machines - it is like an arms race.
I am sure they are willing to support anti-poaching. I am not a tech person, but we are Chinese; we have a duty to solve these problems and be responsible for them. If you come to China one day, let me know.
Thanks, and your sensingclues is great.
Regards
Kai
14 March 2017 6:13pm
A number of good points have been made. In terms of remote-enabled camera traps, you will mainly find ones that use cellular data signals to transmit images. Typically these are thumbnails rather than full resolution, so you will likely still need to retrieve the memory cards for analysis. Also, traps that transmit images tend to have a longer time lag between shots, which can be a problem. Hunting websites tend to have the most complete reviews and discussions of the various models.
If you want to use citizen science in the data analysis process, I would suggest looking at Zooniverse (www.zooniverse.org). It has a pretty well-thought-out platform.
Survey: Camera trap effects on people
2 March 2017 4:25pm
5 March 2017 3:34pm
Thanks, I sent it to my Chinese friends; I am sure some of them have completed the survey.
cheers
Kai
#Tech4Wildlife Photo Challenge: Our favourites from 2016
1 March 2017 12:00am
Remote Camera Data Processing - Counting Individual Animals?
10 November 2016 12:30am
20 January 2017 12:04am
Hi Kate, it all depends on the species and the data set accuracy you are trying to collect. If you are purely documenting frequency of visits, this is straightforward. However, if you are recording numbers then further analysis of photos may be needed. Some animals like foxes have distinctive marks or shapes, whilst smaller mammals can be a lot harder. If this is impossible, then a time limit should be defined across all camera traps. From all the papers I have read and projects we have worked with, there doesn't seem to be a clear standard. Hope this helps. Mike - handykam
3 February 2017 2:00am
Hi Kate, this may be a bit late for your analysis but I will put it here for future reference. The case you mention would be to determine independent events, not necessarily different individuals. Unless there is conspicuous colour, pattern, size, sex or other scars/markings, you shouldn't label them as different individuals. I haven't found much analysis or discussion on this point, so it seems prior experience or a "best guess" is used to determine the time between independent events. The R package "camtrapR" vignette (https://cran.r-project.org/web/packages/camtrapR/vignettes/DataExploration.html) mentions that "The criterion for temporal independence between records ... will affect the results of the activity plots", and you may have other data to correlate activity with abundance or density.
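As a concrete illustration of that temporal-independence criterion (my own sketch, not camtrapR's implementation), the snippet below keeps a record only if it falls at least a chosen interval after the previously kept record for the same station and species; the 60-minute gap and the column names are placeholder assumptions.

```python
import pandas as pd

# Illustrative independence filter: detections.csv with assumed columns
# station, species, datetime. The 60-minute gap is arbitrary for the example.
GAP = pd.Timedelta(minutes=60)

records = pd.read_csv("detections.csv", parse_dates=["datetime"])
records = records.sort_values(["station", "species", "datetime"])

def keep_independent(group):
    kept, last = [], None
    for t in group["datetime"]:
        if last is None or t - last >= GAP:
            kept.append(True)
            last = t          # compare to the last *kept* record
        else:
            kept.append(False)
    return pd.Series(kept, index=group.index)

mask = records.groupby(["station", "species"], group_keys=False).apply(keep_independent)
independent = records[mask]
print(f"{len(independent)} independent events from {len(records)} records")
```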
If you are using an occupancy analysis this may be trivial, as you will likely be lumping multiple nights of survey into a single detection period for each camera anyway (we have used six 10-day detection periods for cats in central Australia).
SECR methods may also be more useful for calculating density. Timing of events BETWEEN cameras can be used to show that photos are of different individuals, as they cannot exist in two places simultaneously or travel at a fast enough rate to be captured in that time frame (clocks must be synchronised).
I hope you have had as good a season in Bon Bon as we have in Alice Springs. Cheers, Al
5 February 2017 8:58am
The facts are simple. If you violate the assumption of independence of sampling events you will bias the result - in the case of multiple observations, by overestimating occupancy due to counting the same animal twice. There is therefore no clean answer to your problem. People try to overcome it by choosing a particular cut-off to try and standardise the method across sampling sessions. Let us say we decide to use one hour between new samples in all our sampling sessions. If abundance changes between sessions, our ability to detect an animal may change and our result may therefore be biased up or down depending on the trend. So we allow detectability to vary between samples to try and reduce this bias. This is done through assessing the results of the replicate samples, i.e. n days.
However, the session time determines how many counts are reduced to presence only, and as session time increases the variance of the occupancy estimate may grow. But we don't know whether this increasing variance/loss of accuracy reflects a real loss of information, because we don't know whether we are correctly throwing away information portraying the same individual or discarding information on the presence of multiple individuals. Therefore the selection of session time may affect our ability to detect a change in occupancy, even when accounting for varying detection; i.e. detection may be equal if our session time is too short. In other words, if in both one hour AND two hours we detect multiple animals on the same number of plots, and convert this to the binomial (presence/absence), then detectability will be equal for both sessions. But if in one hour we detect half the number of animals as in two hours, what does this say about occupancy?
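A toy worked example of that point (numbers invented purely for illustration): with detections collapsed to presence/absence per session, a one-hour and a two-hour independence window can give identical detection histories even though the underlying event counts differ.

```python
# Toy illustration: event times (in hours) at one camera during a single session.
event_times = [0.0, 0.5, 1.5, 3.0, 3.4, 9.0]

def independent_count(times, window_hours):
    # Count events separated from the previously kept event by at least the window.
    kept, last = 0, None
    for t in times:
        if last is None or t - last >= window_hours:
            kept += 1
            last = t
    return kept

for window in (1.0, 2.0):
    n = independent_count(event_times, window)
    print(f"window={window}h: {n} independent events, detected={n > 0}")
# Both windows yield detected=True, so the binomial detection history is the
# same even though the event counts differ - which is the concern raised above.
```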
Spatially explicit (SE) models will also not help because, as far as I know, they are based on allowing multiple detections across plots, not within plots - or cameras in this case.
Having said this, there is apparently some hope in marking a portion of individuals to increase the accuracy of SECR, or of SE models with predominantly unmarked individuals in this case. So my only advice would be to run an experiment marking some foxes and use a SE model with predominantly unmarked individuals.
On a more jovial note, you could simply mine the data by assessing the power to detect a change between pre/post baiting using a variety of different session times. The reality of such a result is highly dubious, but it might inspire more thought on the behaviour of this particular situation. Failing all this, you could run away and join the circus.
Discussion: Wildlife Institute of India to conduct first tiger estimation in nine countries
28 November 2016 1:55pm
28 November 2016 2:07pm
@wildtiger @Shashank+Srinivasan @NJayasinghe Do you have thoughts on this?
31 January 2017 7:05pm
What species monitoring protocols do you know of that explicitly focus on one species?
-John
Discussion: Opinion of TEAM network and Wildlife Insights
7 June 2016 5:36am
24 June 2016 6:08am
Thank you. I had not seen TRAPPER.
I had seen Snoopy, CameraBase, the Sanderson & Harris executables, and TEAM. And none of them had seemed suitable for the work that I was doing.
I will investigate TRAPPER further.
30 June 2016 4:38am
I like TRAPPER's video capability.
I am yet to test it out though.
13 December 2016 12:47am
Hi Heidi,
You might also consider becoming involved in the eMammal project (emammal.si.edu) that the Smithsonian has recently developed. It has tools for both uploading and coding camera trap images, with the bonus that they are ultimately archived and made available by the Smithsonian for future researchers. My opinion is that moving toward open-access data will make all of our efforts more valuable into the future. I'm using eMammal, but I am not one of the developers nor associated with the Smithsonian.
Cheers,
Robert
15 November 2016 6:08am
Also, just a general comment: some less epxensive cameras peform very well but may be more prone to 'glitches' over the duration of an extended study. I think the expectation of long term reliability is part of the reason some people choose the expenisve brand. Systematic tests of long duration reliability in field conditions would be really interesting, albeit probably too difficult/expensive to achieve.