About This Project
Parrots can "talk" like humans, yet whether their complex vocalizations follow syntactic rules and convey semantic meaning remains unclear. We propose to address this question by using the latest wireless methods to simultaneously record the vocalizations, social events, and physiology of every bird in a complex social network. We will then use deep learning to map each bird's vocalizations to its physiology and social experiences, a long-held dream of animal communication researchers.
What is the context of this research?
Parrots can learn to imitate thousands of human words over their lifetime. This extremely rare capacity is rooted in the complex communication system and social dynamics they use when interacting with each other. Our research team is fascinated by budgerigars, a small parrot species, and their warble songs. Superficially, warble shares many features with human language, e.g. combinatorial syllable construction and non-repetitive improvisation. However, it remains unclear whether warble follows syntactic rules and conveys semantic meaning. We propose that to understand the meaning and function of warble, we need to observe the birds' natural interactions in a complex social network, record all salient events, and then map the vocalizations of each bird to its physiology and personal experiences.
What is the significance of this project?
Animals evolve sophisticated communication systems to interact with one another. To properly understand the meaning of their communication signals, we therefore need to study the animals within their social network and keep track of all the salient events that happen to each individual. Field researchers take this approach, but it is difficult to do comprehensively in the wild. Budgerigars present a unique opportunity because they readily form natural, complex social relationships in human environments. With the latest wireless recording methods and deep learning algorithms, it is now possible to simultaneously record the vocalizations, social events, and physiology of every bird in a complex network. This allows us to ask fundamental questions, such as how personal experience shapes social signals.
What are the goals of the project?
We are developing a novel wireless backpack system that resolves the vocalizations produced by each bird in a complex social network while also monitoring its physiology. We will assemble semi-natural groups of budgerigars (5-10 birds) in flight cages, with each bird wearing a wireless backpack that records its vocalizations and physiology. We will use high-resolution cameras to track every social event each bird experiences, along with changes in the group's social dynamics. These efforts will produce a unique and comprehensive dataset in which vocalizations, social events and experiences, and physiology can be mapped to one another. We will then use powerful deep learning algorithms to analyze the syntactic structure and semantic information in the vocalizations.
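To give a flavor of the syntactic analysis, here is a minimal sketch (the syllable labels and sequence are hypothetical, not real budgerigar data) of one common first step: estimating a first-order transition matrix between warble syllables before fitting deeper sequence models.

```python
from collections import Counter, defaultdict

def transition_probs(syllables):
    """Estimate first-order transition probabilities between syllable labels."""
    counts = defaultdict(Counter)
    for a, b in zip(syllables, syllables[1:]):
        counts[a][b] += 1
    # Normalize each row of counts into probabilities
    return {a: {b: n / sum(c.values()) for b, n in c.items()}
            for a, c in counts.items()}

# Toy warble sequence (hypothetical labels, for illustration only)
seq = ["A", "B", "A", "B", "C", "A", "B"]
probs = transition_probs(seq)
print(probs["A"])  # {'B': 1.0} — "A" is always followed by "B" in this toy sequence
```

A non-uniform transition matrix like this is weak evidence of sequential structure; the deep learning models we plan to use would go further, capturing longer-range dependencies.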
We will use the listed items to build a behavior recording system that simultaneously captures high-resolution video, audio, and physiology from a group of socially interacting parrots. We will train deep learning models on the recorded videos to track the position and posture of each bird, allowing us to reconstruct the entire social network and the events that happen to particular birds. Using directional microphones and wireless recording backpacks, we will attribute each vocalization to the bird that produced it. We will also record real-time physiology, e.g. body temperature and heart rate, for each bird. Together, these measurements will yield a unique and comprehensive dataset that combines vocalizations, gestures, events, and physiology in a complex social network of parrots.
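Combining these streams requires aligning them on a shared timebase. As a hedged sketch (all timestamps, sampling rates, and values below are illustrative, not from the actual system), each vocalization onset could be paired with the nearest physiological sample:

```python
import bisect

def nearest_sample(timestamps, values, t):
    """Return the value whose (sorted) timestamp is closest to query time t."""
    i = bisect.bisect_left(timestamps, t)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(timestamps)]
    best = min(candidates, key=lambda j: abs(timestamps[j] - t))
    return values[best]

# Illustrative streams: call onset times (s) and heart-rate samples (bpm)
call_onsets = [1.2, 3.8, 7.5]
hr_times = [0.0, 2.0, 4.0, 6.0, 8.0]
hr_values = [180, 195, 210, 200, 190]

# Heart rate closest in time to each call onset
aligned = [nearest_sample(hr_times, hr_values, t) for t in call_onsets]
print(aligned)  # [195, 210, 190]
```

The real pipeline would synchronize many more streams (video frames, posture tracks, social events), but the same nearest-timestamp principle applies.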
This project will start with building the recording system and wireless backpacks in the fall of 2023. Once the system is fully tested, we will assemble groups of budgerigars in the winter of 2023 and begin the recording experiments. We expect recording to last around three months, so data collection should be complete in early 2024, after which we will enter the data analysis stage. We will share project progress and datasets with our backers as we acquire them.
Oct 13, 2023
Nov 15, 2023: Finish building and testing the recording system
Feb 15, 2024: Complete data collection and share datasets
May 15, 2024: Finish data analysis
Meet the Team
In the Goldberg Lab at Cornell University, we have formed a team of neuroscientists, engineers, physicists, and computer scientists to study the complex social communication signals of parrots.
Zhilei Zhao is a postdoctoral researcher in the Goldberg Lab at Cornell University. Zhilei is passionate about interesting animal behaviors and the underlying brain mechanisms. He earned his B.S. in life sciences at Peking University in Beijing and Ph.D. in evolutionary biology and neuroscience at Princeton University.
Zhilei has been studying budgerigars for more than two years and is still amazed by the complexity of their vocalizations.
Zhilei enjoys collaborating with experts in other fields. He has teamed up with engineers to develop wireless recording methods, and with physicists and computer scientists to develop interpretable deep learning models for analyzing complex animal vocalizations.