This experiment is part of the AI for Interspecies Communication Challenge Grant.

Can we decipher vocalizations of parrots in a complex social network?

Successfully funded on 11/12/23: $6,500 raised of $6,500 goal (100%).

About This Project

Parrots can "talk" like humans. However, whether their complex vocalizations follow syntactic rules and convey semantic meanings remains unclear. We propose to address this question by using the latest wireless methods to simultaneously record the vocalizations, social events, and physiology of all birds in a complex social network. We will then use deep learning to map each bird's vocalizations to its physiology and social experiences, a long-held dream of animal communication researchers.

What is the context of this research?

Parrots can imitate thousands of human words throughout their lives. This extremely rare capacity stems from the complex communication system and social dynamics they use when interacting with each other. Our research team is fascinated by budgerigars, a small parrot, and their warble songs. Superficially, warbles possess many features of human language, e.g., combinatorial syllable construction and non-repetitive improvisation. However, it remains unclear whether warbles follow syntactic rules and convey semantic meanings. We propose that to understand the meanings and functions of warble, we need to observe the birds' natural interactions in a complex social network, record all salient events, and then map the vocalizations of each bird to its physiology and personal experiences.

What is the significance of this project?

Animals evolve sophisticated communication systems to interact with each other. Therefore, to properly understand the meanings of their communication signals, we need to study them within their social network and keep track of all the salient events that happen to each animal. Field researchers adopt this approach, but it is difficult to be comprehensive in the field. Budgerigars present a unique opportunity because they readily form natural and complex social relationships in human environments. With the latest wireless recording methods and deep learning algorithms, it is now possible to simultaneously record the vocalizations, social events, and physiology of all birds in a complex network. This allows us to ask fundamental questions, such as how personal experiences shape social signals.

What are the goals of the project?

We are developing a novel wireless backpack system to resolve the vocalizations produced by each bird in a complex social network and to monitor its physiology at the same time. We will assemble semi-natural groups (5-10 birds) of budgerigars in flight cages, with each bird wearing a wireless backpack to record its vocalizations and physiology. We will use high-resolution cameras to track all social events that happen to each bird in the group, as well as changes in the social dynamics. These efforts will result in a unique and comprehensive dataset in which the vocalizations, social events and experiences, and physiology can be mapped to each other. We will then take advantage of powerful deep learning algorithms to analyze the syntactic structures and semantic information in the vocalizations.
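
As a concrete illustration of the analysis direction (a minimal sketch, not our final pipeline), the example below segments a single warble recording into syllables, clusters them into discrete types, and estimates a syllable-to-syllable transition matrix as a first probe of syntactic structure. The file name, amplitude threshold, and number of syllable types are illustrative assumptions.

```python
# Hypothetical sketch: syllable segmentation + bigram transition matrix.
# File name, threshold, and cluster count are assumptions for illustration.
import numpy as np
import librosa
from sklearn.cluster import KMeans

# Load one backpack recording (hypothetical file name and sample rate).
audio, sr = librosa.load("bird01_warble.wav", sr=32000)

# Segment syllables with a simple amplitude-envelope threshold.
envelope = np.abs(librosa.stft(audio)).mean(axis=0)
active = envelope > 0.1 * envelope.max()  # assumed threshold

# Contiguous active runs -> candidate syllable (onset, offset) frame pairs.
boundaries = np.flatnonzero(np.diff(active.astype(int)))
syllables = list(zip(boundaries[::2], boundaries[1::2]))

# Describe each syllable with a fixed-length spectral feature (mean MFCCs).
mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=13)
features = np.array([mfcc[:, s:e].mean(axis=1) for s, e in syllables if e > s])

# Cluster syllables into discrete types (8 types is an assumption).
labels = KMeans(n_clusters=8, n_init=10).fit_predict(features)

# Bigram transition matrix: how often type i is followed by type j.
transitions = np.zeros((8, 8))
for a, b in zip(labels[:-1], labels[1:]):
    transitions[a, b] += 1
transitions /= transitions.sum(axis=1, keepdims=True) + 1e-9
print(np.round(transitions, 2))
```

In practice, deep learning models would replace the simple thresholding and clustering steps above, but the same mapping from continuous sound to discrete syllable sequences underlies any syntactic analysis.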

Budget


We will use the listed items to build a behavior recording system, where we simultaneously record high-resolution video, audio, and physiology of a group of socially interacting parrots. We will train deep learning models on the recorded videos to track the position and posture of each bird, in order to reconstruct the entire social network and the events that happen to particular birds. Using directional microphones and wireless recording backpacks, we will resolve the vocalizations produced by each animal. We will also record real-time physiology, e.g., body temperature and heart rate, of each bird. With these measurements, we will be able to collect a unique and comprehensive dataset that combines vocalizations, gestures, events, and physiology in a complex social network of parrots.
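
To illustrate how these streams could be combined, here is a minimal sketch that joins vocalization annotations with the nearest physiology sample and social event by timestamp. The column names, tolerance windows, and example values are assumptions, not recorded data.

```python
# Hypothetical sketch: aligning video-derived events, backpack vocalization
# annotations, and physiology samples on a shared session clock.
import pandas as pd

# Each stream carries a timestamp (seconds since the session started).
events = pd.DataFrame({"t": [12.4, 30.1], "event": ["allopreen", "approach"]})
vocal = pd.DataFrame({"t": [12.6, 29.8, 45.0], "syllable": ["A", "B", "A"]})
physio = pd.DataFrame({"t": [12.0, 13.0, 30.0, 45.0],
                       "heart_rate": [310, 340, 355, 320],
                       "body_temp": [41.0, 41.2, 41.4, 41.1]})

# Attach the nearest physiology sample and social event to each vocalization,
# within a tolerance window (1 s and 2 s are assumed values).
table = pd.merge_asof(vocal.sort_values("t"), physio.sort_values("t"),
                      on="t", direction="nearest", tolerance=1.0)
table = pd.merge_asof(table, events.sort_values("t"),
                      on="t", direction="nearest", tolerance=2.0)
print(table)
```

The resulting table is the kind of joint record, one row per vocalization with its physiological and social context, that the deep learning models would be trained on.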

Endorsed by

Zhilei has demonstrated a solid ability to communicate, collaborate, and carry out projects through creative solutions and a solid work ethic. This is an exciting project that is well positioned to help us understand one of the most complex vocal repertoires of the avian song learners. The progressive approach of combining cutting-edge technology with a paradigm in which the animals can interact in a freely-behaving setting should yield a new perspective on how warble facilitates the fascinating social dynamics of this complex species.
Dr. Zhao, a postdoctoral associate in the lab where I am a research support specialist, is a careful experimentalist and an insightful scientist. The project he is proposing has the potential to make exciting leaps in our understanding of animal communication. This team is very well positioned with the tools and experience it needs, such as machine learning and custom electronics and hardware design, to work respectfully with these amazing birds to discover the secrets of their vocalizations.

Project Timeline

This project will start with building the recording system and wireless backpacks in the fall of 2023. Once the functionality of the systems is fully tested, we will assemble groups of budgerigars in the winter of 2023 and start the recording experiments. We expect the recording to last around 3 months. We will likely complete data collection in early 2024 and enter the data analysis stage. We will share project progress and datasets with our backers as we acquire them.

Oct 13, 2023

Project Launched

Nov 15, 2023

Finish building and testing the recording system

Feb 15, 2024

Complete the data collection and share datasets

May 15, 2024

Finish data analysis

Meet the Team

Zhilei Zhao
Postdoctoral fellow

Affiliates

Cornell University, Department of Neurobiology and Behavior

Team Bio

In the Goldberg Lab at Cornell University, we have formed a team of neuroscientists, engineers, physicists, and computer scientists to study the complex social communication signals of parrots.

Zhilei Zhao

Zhilei Zhao is a postdoctoral researcher in the Goldberg Lab at Cornell University. Zhilei is passionate about interesting animal behaviors and the underlying brain mechanisms. He earned his B.S. in life sciences at Peking University in Beijing and Ph.D. in evolutionary biology and neuroscience at Princeton University.

Zhilei has been studying budgerigars for more than two years now and is still amazed by the complexity of their vocalizations.

Zhilei enjoys collaborating with experts in other fields. He has teamed up with engineers to develop wireless recording methods, and with physicists and computer scientists to develop interpretable deep learning models for analyzing complex animal vocalizations.



Project Backers

  • 4 Backers
  • 100% Funded
  • $6,500 Total Donations
  • $1,625.00 Average Donation