In the summers of 2019 and 2020, Colgate professors Krista Ingram and Ahmet Ay set course for Casco Bay, Maine, with student researchers Lauren Horstmyer ’22 and Hailey Orff ’24. Together, they captured thousands of images of harbor seals for SealNet, their innovative new seal facial recognition software.
The team’s research was officially published this spring; a second paper by Ingram and Horstmyer, to be published in the near future, will continue to explore the results of their efforts.
The group’s research took them to seven different “haul-out” sites, areas where seals commonly come out of the water to rest on land.
“The majority of sites were accessed via boat, so it was extremely important for us to stay far away from the sites and be as quiet as possible in an effort to create minimal disturbance,” says Orff. “It was also important for us to pay attention to the behavioral changes in the seals as we approached because certain behaviors such as barking or flushing into the water meant that we were too close and needed to back off.”
Once the photos were sent back to campus, student researchers Zachary Birenbaum ’22 and Hieu Do ’23 used them to design and train SealNet.
They entered dozens of photographs from the Brandt Ledges haul-out site in 2019 to grow SealNet’s database of seal faces and features and to train the software to differentiate between individuals. Then, upon the team’s return a year later, Birenbaum and Do entered a new round of photos from the Mitchell Fields haul-out site into SealNet. The software recognized four harbor seals (Armani, Petal, Clove, and Cystine), providing valuable insight into their visitation patterns and behaviors.
SealNet’s success is due to machine learning, specifically face detection and recognition. According to Do, we interact with machine-learning detection and recognition every day: they power Face ID on our iPhones and help self-driving cars detect other vehicles. “For our research,” Do says, “we took these same ideas of detection and recognition and modified them to work for seal faces.”
The software automatically detects the face in a picture and crops around it, saving the researchers time. SealNet then bases its recognition of seal faces on general patterns, including face, eye, and nose shape as well as distances between features.
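The general shape of such a detect-then-recognize pipeline can be sketched in a few lines of Python, written here with the PyTorch and torchvision libraries, which the article does not specify. The sketch is illustrative only, not the team’s code: a generic pretrained object detector stands in for SealNet’s seal-face detector, an off-the-shelf ResNet-18 turns each cropped face into a feature vector, and every model choice, file name, and the Armani example are assumptions for illustration.

```python
import torch
import torch.nn.functional as F
from PIL import Image
from torchvision import models, transforms
from torchvision.models.detection import (
    FasterRCNN_ResNet50_FPN_Weights,
    fasterrcnn_resnet50_fpn,
)

# Stage 1: a generic pretrained detector stands in for SealNet's seal-face detector.
detector = fasterrcnn_resnet50_fpn(
    weights=FasterRCNN_ResNet50_FPN_Weights.DEFAULT
).eval()

# Stage 2: an off-the-shelf CNN turns each cropped face into a feature vector;
# recognition then reduces to comparing those vectors.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()  # drop the ImageNet classifier head, keep features
backbone.eval()

to_tensor = transforms.ToTensor()
preprocess = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])


def detect_face(image: Image.Image) -> tuple[int, int, int, int]:
    """Return the highest-scoring bounding box (assumes at least one detection)."""
    with torch.no_grad():
        boxes = detector([to_tensor(image)])[0]["boxes"]
    return tuple(boxes[0].round().int().tolist())


def embed_face(image: Image.Image) -> torch.Tensor:
    """Detect the face, crop around it, and map the crop to a unit-length vector."""
    crop = image.crop(detect_face(image))
    with torch.no_grad():
        features = backbone(preprocess(crop).unsqueeze(0)).squeeze(0)
    return F.normalize(features, dim=0)


def identify(photo: Image.Image, gallery: dict[str, torch.Tensor]) -> str:
    """Name the known seal whose stored embedding is most similar to the photo."""
    query = embed_face(photo)
    return max(gallery, key=lambda name: float(query @ gallery[name]))


# Hypothetical usage: build a gallery of known individuals, then match new sightings.
# gallery = {"Armani": embed_face(Image.open("armani.jpg"))}
# print(identify(Image.open("new_sighting.jpg"), gallery))
```

The design mirrors the description above: detection finds and crops the face automatically, and recognition comes down to measuring how close two faces’ feature vectors are.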
“For our detection model,” says Do, “we trained the software to focus on seal photos. However, if you had enough good training data of other animals in which they are facing directly into the camera and clearly displaying all of their facial features, then it will work on them as well.”
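Do’s point about other animals is what machine-learning practitioners call transfer learning: start from a network already trained on face images, then fine-tune it on labeled photos of the new species. The generic sketch below illustrates that recipe; the folder layout, the ResNet-18 starting point, and the hyperparameters are assumptions for illustration, not details of how SealNet was actually trained.

```python
# Generic transfer-learning sketch: adapt a pretrained CNN to recognize individuals
# of a new species, assuming cropped face photos sorted one folder per individual
# (e.g., faces/train/<individual_name>/*.jpg). Illustrative only, not SealNet's code.
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
train_set = datasets.ImageFolder("faces/train", transform=transform)
loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Start from an ImageNet-pretrained network, freeze the early layers, and retrain
# only the last block plus a new classification head with one output per individual.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False
for param in model.layer4.parameters():
    param.requires_grad = True
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```

As Do notes, with enough good training photos of another animal facing the camera, the same recipe carries over; only the labeled images change.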
In fact, primate facial recognition software known as PrimNet had been retrained for use on seals before, but this repurposed software operated with only 88% accuracy. Drawing upon 1,752 photos of 408 individual seals, SealNet outperformed PrimNet by 8%.
The implications of the SealNet program are promising and far-reaching. Using the program, researchers can identify individual harbor seals from field photographs with high accuracy and use this information to inform conservation efforts, estimate population growth, identify haul-out site visitation patterns, determine trends in migration, and explore harbor seal social behavior.
The SealNet project also aided students in discovering or strengthening their passions for research.
“I feel incredibly grateful to have been a part of this publication,” Horstmyer says. “From photographing the seals to analyzing their visitation patterns to working on the original draft of the research paper, I know that the skills that I developed throughout this project will serve me well as I continue working toward a career in marine science post-Colgate.”
As for Birenbaum: “Finding out that our research had been accepted for publication gave me a dopamine rush,” he says. “After so much time spent working on SealNet, it felt amazing to finally see the results of our work put out into the world.”