Ultra close-range photogrammetry is widely used for metrological analyses and is gaining popularity in biodiversity research because it provides multiple perspectives of the same specimen. Handling small objects, such as insects, often requires fixing them with pins to position them in front of the camera. However, this method can damage the specimen, and it is not feasible for very small specimens. Acoustic levitation offers a solution to these limitations by enabling lightweight specimens to be suspended in the air without physical contact. This is an attractive solution because many insects (especially many undescribed species) are small and lightweight. We demonstrate how small insects can be automatically imaged from different perspectives after using acoustic levitation to position them in mid-air. To rotate the insect by a pre-defined angle, we use ultrasonic (US) transmitters that are controlled via a Field Programmable Gate Array (FPGA), allowing for a stationary camera setup. In this way, automated digitization, extended depth of field (EDOF) imaging, and multi-view imaging of insects are possible. A feasibility assessment indicates that this approach can be used to create detailed 3D representations using neural radiance fields (NeRF).
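To make the described pipeline concrete, below is a minimal Python sketch of the acquisition loop implied by the abstract: rotate the levitated specimen in fixed angular steps by re-phasing the ultrasonic array, capture a focus stack at each pose with the stationary camera, and fuse each stack into an EDOF image that can later feed a NeRF reconstruction. This is not the authors' implementation; the hardware interfaces (`set_trap_rotation`, `capture_focus_stack`), the step counts (`N_VIEWS`, `FOCUS_STEPS`), and the naive gradient-based EDOF fusion are all illustrative assumptions.

```python
"""Hypothetical sketch of the levitation-based multi-view acquisition loop.
All hardware interfaces (FPGA-driven transducer array, camera, focus stage)
are stand-ins; only the EDOF fusion is real, and it is deliberately naive."""
import numpy as np

N_VIEWS = 36       # assumed: one view every 10 degrees around the specimen
FOCUS_STEPS = 15   # assumed: frames per focus stack at each pose


def set_trap_rotation(angle_deg: float) -> None:
    """Placeholder for commanding the FPGA to re-phase the US transmitters
    so the acoustic trap (and thus the specimen) rotates to angle_deg."""
    print(f"[FPGA] rotate trap to {angle_deg:.1f} deg")


def capture_focus_stack(n_frames: int, shape=(480, 640)) -> np.ndarray:
    """Placeholder camera: returns a stack of grayscale frames taken while
    the focal plane is stepped through the specimen."""
    rng = np.random.default_rng(0)
    return rng.random((n_frames, *shape), dtype=np.float32)


def fuse_edof(stack: np.ndarray) -> np.ndarray:
    """Naive EDOF fusion: for each pixel, keep the value from the frame
    with the highest local sharpness (squared gradient magnitude)."""
    gy, gx = np.gradient(stack, axis=(1, 2))
    sharpness = gx**2 + gy**2
    best = np.argmax(sharpness, axis=0)          # sharpest frame index per pixel
    rows, cols = np.indices(best.shape)
    return stack[best, rows, cols]


def acquire_dataset():
    views = []
    for i in range(N_VIEWS):
        angle = i * 360.0 / N_VIEWS
        set_trap_rotation(angle)                 # camera stays fixed; specimen rotates
        stack = capture_focus_stack(FOCUS_STEPS)
        views.append((angle, fuse_edof(stack)))  # one EDOF image per known pose
    return views                                 # (pose, image) pairs for the NeRF step


if __name__ == "__main__":
    dataset = acquire_dataset()
    print(f"captured {len(dataset)} EDOF views")
```

Because each view angle is commanded rather than estimated, the resulting (pose, image) pairs can in principle be passed to a NeRF pipeline with known extrinsics, which is what makes the stationary-camera setup attractive.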