Analyzing placentas using AI

Computerized image analysis could lessen barriers to placental analysis and improve health outcomes for mother and child.


The placenta is an organ that develops in the uterus during pregnancy. It provides oxygen and nutrients to the growing baby and removes waste products from the baby’s blood.

Analyzing placentas can provide essential information on the health of both baby and mother. But in the U.S., only 20 percent of placentas are assessed by a pathology exam after delivery; the cost, time, and expertise required make broader analysis prohibitive.

Now scientists from Penn State University have come up with a novel solution: analyzing placentas using AI after delivery. They have developed a digital tool that uses AI to examine an image of each side of the placenta and then produces a report with critical information that could impact the clinical care of the mother and child, such as whether the fetus was getting enough oxygen in the womb or if there is a risk of infection or bleeding.

Alison Gernand, assistant professor of nutritional sciences at Penn State’s College of Health and Human Development, said, “The placenta drives everything to do with the pregnancy for the mom and baby, but we’re missing placental data on 95 percent of births globally. Creating a more efficient process that requires fewer resources will allow us to gather more comprehensive data to examine how placentas are linked to maternal and fetal health outcomes, and it will help us to examine placentas without special equipment and in minutes rather than days.”

Currently, most countries lack the resources to conduct even a baseline placental analysis. The researchers believe their digital tool could fill that gap: a user needs only a smartphone or tablet running the appropriate software.

The scientists developed their tool using 13,000 high-quality images of placentas and their corresponding pathology reports from Northwestern Memorial Hospital. Each training image was labeled with data points critical to understanding the placenta, such as areas of incompleteness and the umbilical cord insertion point.

The scientists then used those labeled images to train neural networks, on CPU and GPU servers, to automatically analyze new placental images and detect features linked to abnormalities and potential health risks.
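The article doesn’t describe the team’s training code or network architecture. As a rough illustration of the underlying idea only (learning to map labeled examples to abnormality predictions), here is a minimal sketch using a tiny logistic-regression classifier on made-up feature vectors; the features, labels, and hyperparameters are all hypothetical:

```python
import math
import random

random.seed(0)

def make_example(abnormal):
    # Hypothetical 3-number feature vector standing in for an image
    # (e.g. color, completeness, cord-insertion offset).
    base = [0.8, 0.2, 0.9] if abnormal else [0.2, 0.7, 0.1]
    return [v + random.uniform(-0.1, 0.1) for v in base], 1 if abnormal else 0

# Synthetic "labeled dataset" of 200 examples, half abnormal.
data = [make_example(i % 2 == 0) for i in range(200)]

# Logistic regression trained with plain gradient descent on log-loss.
w, b, lr = [0.0, 0.0, 0.0], 0.0, 0.5

def predict(x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 / (1 + math.exp(-z))

for _ in range(200):
    for x, y in data:
        g = predict(x) - y  # gradient of log-loss w.r.t. the logit
        w = [wi - lr * g * xi for wi, xi in zip(w, x)]
        b -= lr * g

accuracy = sum((predict(x) > 0.5) == bool(y) for x, y in data) / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

The actual system replaces this toy classifier with deep neural networks operating on full photographs, but the supervised-learning loop (predict, compare to label, adjust) is the same in spirit.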

When tested, the system made predictions on unlabeled images efficiently. Moreover, comparisons with the original pathology reports demonstrated its high accuracy and clinical potential.
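The article doesn’t specify how the comparison with pathology reports was scored. A standard way to quantify such agreement is to count true/false positives and negatives and derive accuracy, sensitivity, and specificity; the labels below are invented purely for illustration:

```python
# Hypothetical example: 1 = abnormality flagged, 0 = no abnormality.
predicted = [1, 0, 1, 1, 0, 0, 1, 0]  # system's output on 8 placentas
reported  = [1, 0, 1, 0, 0, 0, 1, 1]  # ground truth from pathology reports

tp = sum(p == 1 and r == 1 for p, r in zip(predicted, reported))  # true positives
tn = sum(p == 0 and r == 0 for p, r in zip(predicted, reported))  # true negatives
fp = sum(p == 1 and r == 0 for p, r in zip(predicted, reported))  # false positives
fn = sum(p == 0 and r == 1 for p, r in zip(predicted, reported))  # false negatives

accuracy = (tp + tn) / len(reported)   # overall agreement with the reports
sensitivity = tp / (tp + fn)           # share of real abnormalities caught
specificity = tn / (tn + fp)           # share of normal placentas cleared
print(accuracy, sensitivity, specificity)  # → 0.75 0.75 0.75
```

For clinical screening, sensitivity usually matters most: a missed abnormality (false negative) is costlier than an extra referral to a pathologist.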

James Wang, a professor in Penn State’s College of Information Sciences and Technology, said, “Past analyses have typically examined features independently and used a limited number of images. Our tool leverages artificial intelligence and a large and comprehensive dataset to make many decisions at the same time by treating the different parts of the placenta as complementary. To our knowledge, this is the first system for comprehensive, automated placental analysis.”

“Also, the tool could advance pregnancy research and be useful for long-term care by providing clinically meaningful information to patients and practitioners.”

Gernand added, “We’re working to make the placental data accessible by translating it into something that’s both clinician and patient-friendly. We know placental development and function are vital to the health of the pregnancy, but we only know a fraction of how much it can tell us about the health of the mom and baby. This research is a critical first step in building big data to understand better what we can learn from the placenta.”

The team’s study was presented at the International Federation of Placenta Associations meeting held in Buenos Aires, Argentina, in September and at the International Conference on Medical Image Computing and Computer-Assisted Intervention held in Shenzhen, China, in October.
