ITongue: A Smartphone App for Personal Health Monitoring Based on Tongue Image

$339 raised of $20,000 goal · 2% funded · Campaign ended on 1/03/14

About This Project

For over 5,000 years, traditional Chinese medicine has used tongue features to classify health status, a task traditionally performed visually and empirically. In this project, we will develop a tongue image analysis App, ITongue, that will allow a user to automatically analyze and monitor health status and to receive warnings about potential health issues and advice on lifestyle changes, based on a photo of the tongue taken with a smartphone or tablet.


What is the context of this research?

Traditional Chinese medicine (TCM) is not only widely practiced in Asia but is also considered one of the most effective forms of alternative medicine in Western countries. In TCM, the features of the tongue, including its shape, color, coating, and pattern, are used as a basis for diagnosing disease and classifying health status. Recent scientific studies have also confirmed some relationships between tongue features and diseases such as cancer and AIDS.

However, traditional tongue diagnosis has a number of issues. In particular, visual inspection of the tongue can be subjective, depending on the doctor's condition and experience. The traditional method also lacks standards for quantification, and tongue images are not recorded as part of electronic medical records.

Therefore, an electronic system that automatically records and systematically interprets tongue images is highly desirable. In recent years, mobile devices have been explored as a platform for medical applications, and we believe they are ideally suited for tongue diagnosis.

What is the significance of this project?

In this project we will develop a tongue image analysis App, ITongue. ITongue will allow the user to take a photo of the tongue with a smartphone or tablet, such as an Android phone or an iPad. The App will then automatically analyze the image to interpret the person's health status and provide advice accordingly.

Although the App is not intended for professional diagnosis, this household product can provide early warnings, especially of physical and mental weaknesses, and users can follow up with hospital visits or lifestyle changes. Anyone can use the App anywhere. In particular, the App can record tongue images of the same person over time so that abnormal changes can be detected sensitively and in a timely manner. Hence, the proposed App may benefit a broad population and reduce healthcare costs through preventive medicine.

What are the goals of the project?

Our long-term goal is a smartphone App that can be downloaded from an App store and used by millions of people. The objective of this proposed project is to develop a fully functional tongue image analysis App, ITongue. The App will automatically analyze the tongue image to annotate the person's health status and provide advice through (1) image segmentation of the photo to isolate the tongue region, (2) feature retrieval from the tongue image, (3) comparison of the retrieved features with annotated features in the database and their medical annotations of health status, and (4) suggestion of hospital visits and lifestyle changes based on an expert system. Items (1) and (2) have been achieved so far. The funds raised will be used to support the development of items (3) and (4). For this purpose, we will conduct the following work:

(a) We will develop a knowledge base of annotated tongue images. We already have a collection of tongue images and will collect more. For the annotations, we will collaborate with a professional TCM doctor who is a visiting scholar at our university. The annotations will include various TCM features, such as color, coating, and shape, as well as the health status of the person.
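As an illustration only, the Python sketch below shows what a single annotated record in such a knowledge base might look like; the field names (body_color, coating_color, health_status, etc.) are our own assumptions rather than a finalized schema.

    from dataclasses import dataclass, field
    from typing import List, Optional

    # One annotated record in the tongue image knowledge base.
    # Field names and values are illustrative assumptions, not a finalized schema.
    @dataclass
    class TongueRecord:
        image_path: str                       # path to the stored tongue photo
        body_color: str                       # e.g. "pale", "red", "purple"
        coating_color: str                    # e.g. "white", "yellow"
        coating_thickness: str                # e.g. "thin", "thick"
        shape: str                            # e.g. "normal", "swollen", "thin"
        features: List[float] = field(default_factory=list)  # numeric image features
        health_status: Optional[str] = None   # TCM doctor's annotation

    # Example entry with made-up values.
    example = TongueRecord(
        image_path="images/subject_001.jpg",
        body_color="pale",
        coating_color="white",
        coating_thickness="thin",
        shape="swollen",
        features=[0.42, 0.17, 0.88],
        health_status="qi deficiency",
    )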

(b) We will develop an annotation system for health status based on comparing a new tongue photo against each annotated tongue image in the knowledge base. We will first develop a computational algorithm to compare tongue images from the TCM perspective, performing feature selection and exploring a distance measure that best reflects the medical relevance of the comparison. We will then develop a consensus-based method for assessing health status from the top hits of the tongue image comparison.
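The sketch below illustrates, under simplifying assumptions, the kind of consensus scheme we have in mind: tongue features are treated as numeric vectors, a weighted Euclidean distance stands in for the distance measure still to be explored, and the health status is taken as a simple majority vote among the top hits. The function names, weights, and example vectors are hypothetical placeholders.

    from collections import Counter
    from typing import Sequence, Tuple

    import numpy as np

    def weighted_distance(x: np.ndarray, y: np.ndarray, w: np.ndarray) -> float:
        """Placeholder distance between two feature vectors; the actual measure
        (and the feature weights w) would be chosen through feature selection."""
        return float(np.sqrt(np.sum(w * (x - y) ** 2)))

    def annotate_by_consensus(
        query: np.ndarray,
        knowledge_base: Sequence[Tuple[np.ndarray, str]],  # (features, health status)
        weights: np.ndarray,
        k: int = 5,
    ) -> str:
        """Compare the query against every annotated image, keep the k top hits,
        and return the majority health status among them (a simple consensus)."""
        scored = sorted(
            knowledge_base,
            key=lambda entry: weighted_distance(query, entry[0], weights),
        )
        top_statuses = [status for _, status in scored[:k]]
        return Counter(top_statuses).most_common(1)[0][0]

    # Tiny illustrative run with made-up feature vectors.
    if __name__ == "__main__":
        kb = [
            (np.array([0.90, 0.10, 0.30]), "healthy"),
            (np.array([0.20, 0.80, 0.70]), "qi deficiency"),
            (np.array([0.30, 0.70, 0.60]), "qi deficiency"),
            (np.array([0.85, 0.15, 0.35]), "healthy"),
        ]
        w = np.array([1.0, 1.0, 0.5])   # hypothetical feature weights
        print(annotate_by_consensus(np.array([0.25, 0.75, 0.65]), kb, w, k=3))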

(c) We will develop an expert system to advise users based on the tongue image analysis and additional questionnaires. The advice will include lifestyle changes, avoiding or eating certain types of food, seeing a doctor (for severe health problems), and so on.
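As a rough illustration, the expert system could combine the image-based annotation with questionnaire answers through simple rules, as in the hypothetical sketch below; the specific rules, statuses, and advice strings are placeholders, not medical recommendations.

    # Hypothetical rule-based advice step combining the image annotation
    # with questionnaire answers. Rules and wording are placeholders only.
    def advise(health_status: str, questionnaire: dict) -> list:
        advice = []
        if health_status == "qi deficiency":
            advice.append("Consider more rest and a regular sleep schedule.")
        if questionnaire.get("poor_sleep"):
            advice.append("Reduce caffeine intake in the evening.")
        if questionnaire.get("severe_symptoms"):
            advice.append("Please see a doctor for a professional examination.")
        if not advice:
            advice.append("No specific concerns detected; maintain your current lifestyle.")
        return advice

    print(advise("qi deficiency", {"poor_sleep": True, "severe_symptoms": False}))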

(d) We will develop an application programming interface (API). The input of the API is a tongue photo, and the output is the annotation and medical advice. The API will be implemented as a cloud service, which can reduce the computing time to the sub-second level.
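A minimal sketch of such a cloud-hosted API is shown below, using Flask purely as an example web framework; the /analyze route, the JSON field names, and the placeholder results are our own assumptions about how the service could be organized.

    # Sketch of the planned API as a cloud-hosted web service.
    # Flask, the /analyze route, and the JSON field names are illustrative
    # assumptions; the deployed service may differ.
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    @app.route("/analyze", methods=["POST"])
    def analyze():
        photo = request.files["image"]   # tongue photo uploaded by the App
        # Placeholder for the real pipeline: segmentation -> feature retrieval
        # -> comparison against the knowledge base -> expert-system advice.
        annotation = {"image": photo.filename, "health_status": "healthy"}  # placeholder
        advice = ["Maintain your current lifestyle."]                       # placeholder
        return jsonify({"annotation": annotation, "advice": advice})

    if __name__ == "__main__":
        app.run()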

As a token of appreciation, each donor will receive a beta version of the App for free.

Budget


While some preliminary work has been done, significant further development is needed to produce a comprehensive product that many people can use. The only support we have received for this project so far was an internal university fund, which has now expired. We have applied for external funding, for example from the National Institutes of Health, but without success so far. Support from crowdfunding at Microryza will therefore be very timely for our development. We will use the funds to support two graduate students for 8 months. If we receive more than $20,000, we will use the additional funds to conduct further work, such as improving computational accuracy and speed, covering more types of diseases, and carrying out usability tests and market analysis.

Meet the Team

Dong Xu

Team Bio

Dr. Dong Xu is the James C. Dowell Professor and Chair of the Computer Science Department, with appointments in the Christopher S. Bond Life Sciences Center and the Informatics Institute at the University of Missouri-Columbia. He obtained his PhD from the University of Illinois at Urbana-Champaign in 1995 and completed two years of postdoctoral work at the US National Cancer Institute. He was a Staff Scientist at Oak Ridge National Laboratory until 2003, when he joined the University of Missouri. He has published more than 230 papers.

Dr. Ye Duan is an Associate Professor in the Computer Science Department, with appointments in the Thompson Center for Autism and the Informatics Institute at the University of Missouri-Columbia. He obtained his PhD from the State University of New York at Stony Brook. His research focuses on computer graphics, computer vision, and biomedical imaging.

Additional Information

We have carried out significant tongue image analysis research over the past three years. Three research papers have been published, including one in Evidence-Based Complementary and Alternative Medicine, a top journal in alternative and traditional medicine (Ratchadaporn Kanawong, Tayo Obafemi-Ajayi, Tao Ma, Dong Xu, Shao Li, and Ye Duan. Automated Tongue Feature Extraction for ZHENG Classification in Traditional Chinese Medicine. Evidence-Based Complementary and Alternative Medicine, doi:10.1155/2012/912852, 2012). The paper describes a collaboration with hospitals in China in which we carried out a comparative tongue image study of 263 gastritis patients and 48 healthy volunteers. The study demonstrated excellent performance of our method in disease labeling based on tongue features.

Project Backers

  • 6 backers
  • 2% funded
  • $339 total donations
  • $56.50 average donation