This experiment is part of the Ocean Solutions Challenge Grant.

Can we identify individual manta rays in real-time using AI and remote technology?

$4,825 raised of $4,825 goal
100% funded on 12/05/21
Successfully Funded

About This Project

This project combines AI, high-resolution cameras, and existing manta behavior research to test a program that provides real-time identification of individual manta rays and streamlines underwater data collection. The results of this experiment will help us better understand manta behavior as it happens and expand the tools and methods available for marine research. Because manta rays exhibit predictable movements and site affinity, a camera placed at a known aggregation site can repeatedly observe the same animals, making it possible to identify and monitor individuals on the spot.

What is the context of this research?

It is widely understood that manta rays demonstrate "site affinity," returning to the same places for cleaning and feeding; yet mantas remain one of the least understood marine megavertebrates. Building on this site-affinity behavior, our project will create a tool that identifies individuals and provides real-time feedback about manta location and behavior, enhancing our understanding of the species.

Organizations around the world have engineered devices to monitor underwater species; however, none has connected existing identification databases with real-time individual identification underwater. With our partners, we will design and deploy a device that identifies individuals and aggregates data on manta patterns, migrations, and behaviors.
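As a rough illustration of what "aggregating data on manta patterns, migrations, and behaviors" could look like in software, here is a minimal sketch of a per-individual sighting log. The field names and the SightingLog helper are hypothetical assumptions made for this write-up, not the project's actual data model.

```python
# Minimal sketch of an aggregated sighting log.
# All field names and structures are illustrative assumptions only.
from collections import defaultdict
from dataclasses import dataclass
from datetime import datetime


@dataclass
class Sighting:
    individual_id: str   # ID matched against an existing identification database
    timestamp: datetime  # when the camera observed the individual
    site: str            # e.g. a named cleaning station
    behavior: str        # e.g. "cleaning", "feeding", "cruising"


class SightingLog:
    """Accumulates sightings so per-individual patterns can be summarized."""

    def __init__(self):
        self._by_individual = defaultdict(list)

    def record(self, sighting: Sighting) -> None:
        self._by_individual[sighting.individual_id].append(sighting)

    def site_visits(self, individual_id: str) -> dict:
        """Count how often one individual has been seen at each site."""
        counts = defaultdict(int)
        for s in self._by_individual[individual_id]:
            counts[s.site] += 1
        return dict(counts)


# Example: log one detection and summarize where "M-042" has been seen.
log = SightingLog()
log.record(Sighting("M-042", datetime.now(), "cleaning_station_1", "cleaning"))
print(log.site_visits("M-042"))  # {'cleaning_station_1': 1}
```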

What is the significance of this project?

This project builds on existing research while also developing and deploying innovative and original technology.

The critical components exist. The Coral City Camera already casts live streams globally in real time, Scubotics uses AI to identify species, and View Into the Blue has outfitted cameras to collect data underwater. Organizations like Flukebook have even built databases for identifying individuals by their unique markings; yet no one has combined these components into one device.

Combining these components will produce an efficient, accurate, and useful method of data collection. This project could change how researchers and conservationists monitor species in oceans around the world, informing species research, tracking, and protection.
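To make the intended combination concrete, the sketch below shows one plausible real-time identification loop: read frames from an underwater live stream, detect mantas, compute an appearance embedding of each animal's spot pattern, and match it against a gallery of known individuals. The detector, embedding model, gallery, and similarity threshold are all placeholders assumed for illustration; they do not describe the team's actual implementation.

```python
# Sketch of a real-time individual-identification loop.
# The detector, embedder, gallery, and threshold are hypothetical placeholders.
from typing import Callable, Dict, Optional

import cv2
import numpy as np

MATCH_THRESHOLD = 0.8  # assumed cosine-similarity cutoff for declaring a match


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def identify(embedding: np.ndarray, gallery: Dict[str, np.ndarray]) -> Optional[str]:
    """Return the best-matching known individual, or None if nothing clears the threshold."""
    best_id, best_score = None, MATCH_THRESHOLD
    for individual_id, known_embedding in gallery.items():
        score = cosine_similarity(embedding, known_embedding)
        if score > best_score:
            best_id, best_score = individual_id, score
    return best_id


def run(stream_url: str, detector: Callable, embedder: Callable,
        gallery: Dict[str, np.ndarray]) -> None:
    """Read frames from a live stream and report identified individuals as they appear."""
    capture = cv2.VideoCapture(stream_url)
    while capture.isOpened():
        ok, frame = capture.read()
        if not ok:
            break
        for crop in detector(frame):        # placeholder: returns cropped manta regions
            embedding = embedder(crop)      # placeholder: returns a spot-pattern feature vector
            match = identify(embedding, gallery)
            if match is not None:
                print(f"Identified individual {match} in the current frame")
    capture.release()
```

In practice the gallery would be backed by an existing photo-identification database and matches could be pushed to researchers or citizen scientists in real time; this sketch only illustrates the control flow under those assumptions.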

What are the goals of the project?

If manta rays exhibit predictable movements at cleaning stations, then it will be possible to identify and track individuals in real-time.

To identify individuals in real time, this project will develop a cost-effective, efficient, and user-friendly device. The first goal is to make real-time data collection possible, identifying individual mantas and tracking their patterns as they happen; this would revolutionize underwater research. The second goal is to incentivize citizen science and engagement with local manta populations.

If successful, manta ray location, movement, and behavior can be documented far more efficiently. This improved understanding can inform actionable responses, policy building, and conservation measures for marine species.

Budget

The items listed fund the engineering of a custom camera and an associated AI prototype that will deliver real-time feedback and identification of individual manta rays. The labor, database licensing, and rig costs are critical to completing the device and coding the AI that makes this possible.

We are currently looking to fund a full location-specific field deployment through other promising sources.

Endorsed by

This project would be a significant data collection win for conservation efforts that rely on underwater camera systems, and it is a strong collaboration with key experts in the field. As an expert in this area, I can attest that the project has a lot of potential, and I am excited to see where it takes underwater real-time data collection. I am confident Kate can co-lead this project: she has experience partnering with people and organizations to make incredible things happen, especially in the ocean conservation space.
The way marine scientists collect data is undergoing a massive transformation. We are seeing an industry long dominated by time-consuming manual data entry leverage advances in artificial intelligence to rapidly assess and respond to environmental changes. As a domain expert in marine remote sensing and artificial intelligence, I see great potential for this project and am confident that the technology available today can meet the technical requirements laid out by Kate and her team.
Environmental data is ripe for disruption, and this effort will lead the charge in that transformation for maritime data collection. I am excited to watch it progress. Because of its strong team of collaborators, including scientists and technologists, I am eager to endorse this project for its use of cutting-edge AI to advance marine conservation and understanding.

Project Timeline

This experiment will take approximately 4 months. If accepted, we will start work immediately with our partners to design, develop and engineer the custom AI algorithms and technology. We estimate this will take between 2 and 3 months.

The design, engineering, and development of the physical device will start concurrently. We anticipate this phase will take between 3 and 4 months, pending material availability.

Oct 14, 2021

Project Launched

Dec 30, 2021

Develop the AI and custom technology

Jan 24, 2022

Develop the device and associated rig

Feb 28, 2022

Deploy the device (proof of concept)

Meet the Team

Kate Sutter

Kate Sutter is an advocate and creative communicator for ocean exploration, marine sciences, and global environmental issues. She is currently an Associate Program Officer for LabX, a program of the National Academy of Sciences, where she develops and produces creative engagement programs that connect people to science. Kate also leads the community engagement and communications strategy for World Ocean Day, an organization that celebrates our ocean and garners hundreds of millions of impressions. Before joining LabX, Kate worked as a Program Associate at The Ocean Agency, managing projects in coordination with the team that produced Google Underwater Street View and the Netflix documentary Chasing Coral. Kate also worked at the American Museum of Natural History, supporting several departments including Exhibitions, Communications, and Conservation.

Kate is a member of the Explorers Club and won the Adventure Canada Young Explorer Scholarship, which allowed her to study citizen science in the Arctic. She is a drone pilot, an avid scuba diver and instructor, and was featured on the original Disney show The Big Fib as a "shipwreck expert."

Taylor Berry

Taylor leads the strategic impact analysis and fundraising efforts for Conservation X Labs. As the Development & Impact Specialist, she is responsible for seeking funders and maintaining relationships, measuring and presenting program impact, and developing and executing fundraising strategy.

Taylor previously worked at the National Marine Sanctuary Foundation, building development strategy, fundraising for critical programs, and creating and maintaining a custom donor CRM. Prior to working at the Foundation, she completed fieldwork in the Dominican Republic, where she designed, implemented, and managed a red mangrove restoration project with an emphasis on building community engagement and support around the wetland system.

Taylor received her bachelor's degree in Political Science from Lycoming College and her master's degree in Environmental Law and Policy from Vermont Law School. She loves diving, biking, and backpacking.

Additional Information

We have been in touch with all of the partners mentioned above; those we have consulted have agreed to participate in or advise on the proposed experiment.


Project Backers

  • 25 backers
  • 100% funded
  • $4,825 total donations
  • $193.00 average donation