About This Project
Camera trap photos tell you which animals are in an area, but figuring out how many is tricky. For example, two foxes in one photo doesn't mean there are only two foxes in the area, and a camera might photograph the same fox twice. Yet knowing how many is key to answering questions like "Is the population of this species declining?" We're making a low-cost 3D vision add-on for camera traps: 3D photos contain spatial information, which existing mathematical models can use to make much-needed population estimates.
What is the context of this research?
Camera traps are one of the most important tools for ecologists and wildlife researchers. They are motion-triggered cameras – usually with night vision – which capture photos of wildlife in an unobtrusive way. Scientists set up a grid of dozens of camera traps in a study area to catalogue its biodiversity. However, camera traps only tell you which species of animals are present, not how many.
Measuring population sizes is a key research aim in ecological science. Mathematical methods like the random encounter model and distance sampling can estimate populations, but they all require a key piece of data – the distance of an animal from the observer – that camera traps do not provide. Right now, obtaining this critical spatial information requires a great deal of manual labor.
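To make this concrete, here is a sketch of the random encounter model (Rowcliffe et al. 2008), which estimates animal density from the photo capture rate. All of the numbers below are made up for illustration; note that the detection radius and angle are exactly the distance-related quantities that plain camera traps struggle to provide.

```python
import math

def rem_density(photos, camera_days, speed, radius, angle):
    """Random encounter model: density = (y/t) * pi / (v * r * (2 + theta)).

    photos / camera_days: photo capture rate y/t
    speed:  average animal movement speed v (km per day)
    radius: camera detection radius r (km)
    angle:  camera detection arc theta (radians)
    Returns estimated animals per square km.
    """
    return (photos / camera_days) * math.pi / (speed * radius * (2 + angle))

# Illustrative (invented) numbers: 40 photos over 100 camera-days,
# animals moving 2 km/day, 10 m detection radius, 0.7 rad detection arc.
print(round(rem_density(40, 100, 2.0, 0.01, 0.7), 2))  # ≈ 23.27 animals per km²
```

The point of the sketch is simply that without reliable values for `radius` (and the per-animal distances used to calibrate it), the estimate cannot be computed – which is the gap this project aims to fill.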
What is the significance of this project?
If you search online, you'll find clip-on stereoscopic lenses for smartphones, like the Kúla Bebe or Poppy. They let a camera capture two side-by-side images, which – just like how our two eyes give us binocular vision – can be processed into a 3D image. Using the principle of distance measurement by parallax, we can calculate the distance of any object in a 3D image.
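The parallax principle boils down to one formula: an object's depth Z equals the focal length f times the baseline B between the two lens centres, divided by the disparity d (how far the object shifts between the left and right half-images). A minimal sketch, with invented numbers for the camera parameters:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Stereo triangulation: Z = f * B / d.

    focal_px:     focal length in pixels
    baseline_m:   separation of the two lens centres, in metres
    disparity_px: horizontal shift of the same point between the
                  left and right images, in pixels
    Returns depth in metres.
    """
    return focal_px * baseline_m / disparity_px

# Hypothetical setup: 1400 px focal length, 6 cm baseline, and a fox
# whose eye shifts 12 px between the two half-images.
print(depth_from_disparity(1400, 0.06, 12))  # 7.0 metres
```

Nearby objects shift a lot (large disparity, small Z); distant objects barely shift at all, which is why a wider baseline helps with far-away animals.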
Camera traps are expensive and sturdy: they need to withstand weather and animals bumping into them. So, instead of modifying camera traps themselves, we plan to develop "3D glasses" for them in the form of stereoscopic lenses. Existing smartphone clip-on lenses are comparatively expensive and do not fit camera traps, so we will create ones that do fit and cost much less for cash-strapped scientists. No need to buy new cameras!
What are the goals of the project?
The main research question is whether images from camera traps wearing 3D glasses provide enough spatial data to estimate wildlife populations.
We'll test this with camera traps of different models/brands, wearing our 3D glasses to capture images in the field. The plan is to set up "scenes", initially with fixed objects at known distances from the camera traps. The raw data will be the 3D photos of these objects, which can be processed with OpenCV to calculate their distances. By comparing these estimates with the real distances we already know, we will know how accurately our 3D glasses can obtain useful spatial data.
Once that's done, we'll also publish a data workflow that processes the 3D images into spatial data to feed into existing models to estimate wildlife populations.
The funding will support Pen's time to test a data workflow that takes the raw 3D images and computes spatial data about the animals within them. The budget will also be used to purchase off-the-shelf camera traps of different makes and models to test the 3D glasses with. Pen knows sellers who offer a discount, allowing us to buy at least 5-6 of them.
Jeremy and Joshua will advise on hardware design and provide additional academic rigour to this work.
We plan to begin this work by the end of 2023, but it's a conservative estimate and we could begin earlier.
Jeremy already has some initial, "back of the envelope" CAD designs for the 3D glasses. Pen is based in the UK and co-founded the MammalWeb citizen science camera-trapping network. He can leverage his extensive network to gain access to sites for field-testing these camera traps with 3D glasses. This will save valuable time. Joshua will provide critical advice along the way.
- Jun 20, 2023
- Dec 14, 2023: Iterate and publish CAD files for camera trap 3D glasses
- Dec 14, 2023: Obtain permission to visit sites to field test camera traps
- Dec 22, 2023: Fabricate 3D glasses in a makerspace
- Jan 14, 2024: Field test camera traps with 3D glasses, ideally with real animals
Meet the Team
Our team members are part of the Gathering for Open Science Hardware, which works to make open source hardware the norm in scientific research. Pen-Yuan Hsing ("Pen") brings 10+ years of wildlife research experience, including with camera traps. Jeremy is an expert engineer who designs and fabricates low-cost tools. Joshua has extensive academic experience developing open source hardware for scientific applications.
Interdisciplinary researcher studying open research and metascience, with a goal of improving how research is done in any discipline.
I act in an advisory capacity to UNESCO and NASA on developing policies to encourage open research. Currently, I am a postdoctoral researcher evaluating the potential of a new open publishing platform called Octopus, employing qualitative research methods.
In addition, I am a co-founder of the MammalWeb project for citizen science wildlife monitoring with camera traps. I am also a board member of the Open Science Hardware Foundation, and in the elected leadership of the Gathering for Open Science Hardware. Both advocate for the role of open source hardware in open science.
Open source software and hardware developer with an interest in developing solutions in open science, appropriate technology and assistive technology. Interested in collaborating with scientists to create open hardware solutions to scientific challenges.
Founder of 7B Industries, a company dedicated to open culture: hardware, software, data, and more. Former board member of the Mach 30 Foundation for Space Development, an open hardware non-profit focused on spaceflight. As a mechanical engineer with 20+ years of experience developing engineering-related software, becoming a core maintainer of the CadQuery parametric CAD project was a natural fit. Continuously working to improve the state of the art in CAD modelling and documentation.
Joshua M. Pearce is the John M. Thompson Chair in Information Technology and Innovation at the Thompson Centre for Engineering Leadership & Innovation. He holds appointments at Ivey Business School, the top-ranked business school in Canada, and the Department of Electrical & Computer Engineering at Western University in Canada, a top 1% global university. He runs the Free Appropriate Sustainability Technology research group. His research concentrates on the use of open source appropriate technology (OSAT) to find collaborative solutions to problems in sustainability and to reduce poverty. His research spans the engineering of solar photovoltaic technology, open hardware, and distributed recycling and additive manufacturing (DRAM) using RepRap 3-D printing, but also includes policy and economics. His research is regularly covered by the international and national press. According to Elsevier's citation metrics last year he was in the top 0.06% most cited scientists globally, and he is continually ranked in the top 0.1% for his accessible research on Academia.edu. He is the editor-in-chief of HardwareX, the first journal dedicated to open source scientific hardware, and the author of the Open-Source Lab: How to Build Your Own Hardware and Reduce Research Costs, Create, Share, and Save Money Using Open-Source Projects, and To Catch the Sun, an open source book of inspiring stories of communities coming together to harness their own solar energy, and how you can do it too!
The 3D glasses will be published as open source hardware, under open source licenses and in the peer-reviewed literature. If successful, the technique and tooling could be useful for many other applications. For example, they could be used to calibrate open source hangprinters (cable robot 3-D printers) placed in an arbitrary environment (e.g. between trees, buildings, or cranes) using whatever camera is at hand, so that they can be used for construction. We might also be able to use the same technique to build single-camera solutions for crop monitoring and fruit/vegetable volume and mass estimation in farming or agrivoltaic research. In other words, if our attempt is successful, the applications of camera traps with 3D glasses are broad and highly multidisciplinary.
- $13,850 total donations
- $989.29 average donation