As David Ancalle opened video after video of diarrhea this year, it struck him: This is not what he expected to be doing for his Ph.D.
Ancalle, a mechanical engineering student at Georgia Tech who researches fluid dynamics, is currently working to demystify the acoustics of urination, flatulence, and diarrhea. His team is training AI to recognize and analyze the sound of each bathroom phenomenon, and research suggests that tracking the flow of our excretions could benefit public health.
What’s new — Ancalle and Maia Gatlin, an aerospace engineer at the Georgia Tech Research Institute (GTRI), created a mechanical device loaded with pumps, nozzles, and tubes meant to recreate the physics — and sounds — of human bodily function. They named it the Synthetic Human Acoustic Reproduction Testing machine (yep, S.H.A.R.T.).
S.H.A.R.T. is now helping train an AI algorithm to one day pick up on deadly diseases like cholera and stop an outbreak in its tracks, according to a presentation at the American Physical Society’s annual Fluid Dynamics conference last week. Ancalle and Gatlin’s results haven’t yet been published in a peer-reviewed journal.
Here’s the background — Diarrheal diseases like cholera kill 500,000 children yearly, making them the third leading cause of child mortality worldwide. “There's an outbreak and resurgence in Haiti as we speak,” Gatlin says. Ramping up the detection of diseases would bolster treatment and prevent outbreaks, she explains.
Why it matters — The goal is to combine the machine-learning model with inexpensive sensors and deploy them in regions susceptible to outbreaks of diarrheal disease. “And as we classify those events, we can start to collect that data,” Gatlin says. “It can say, ‘Hey, we’re seeing an outbreak of lots of diarrhea.’ Then we can start to quickly diagnose what’s going on in an area.”
What they did — Until recently, Ancalle wasn’t thinking much about diarrhea. “Our initial focus for that first year was really on flatulence and urination,” he says. He and his colleagues were trying to relate the sound of farts to the internal geometry of a rectum — abnormal changes could mean cancer. “After discussing with gastroenterologists we thought that it would be a good way to try for a non-invasive route.”
But the project soon expanded: Ancalle teamed up with researchers at GTRI who were figuring out ways to passively detect outbreaks of gastrointestinal diseases. Perhaps, they wondered, next-gen toilets could do more than collect excrement — they could also help alert communities to an outbreak.
That’s where acoustics come in. Sound is easier to analyze remotely than video or self-reporting, and it’s less invasive or cumbersome than a medical examination. And the sounds of our outputs — urination, flatulence, solid defecation, and diarrhea — are distinct. The team realized that an inexpensive device and an AI algorithm could organize this toilet information.
They began by sorting through publicly available audio and video of excretions, capturing the frequency spectrum from each, and feeding it to a machine-learning algorithm. Their AI then learned from all that doodoo data until it was primed for S.H.A.R.T. machine testing.
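The pipeline described above — reduce each audio clip to a frequency spectrum, then let a classifier learn which spectra belong to which event — can be sketched in a few lines. The feature extraction, the nearest-centroid classifier, and the synthetic sine-wave "clips" below are all illustrative assumptions; the article doesn't specify which features or model the team actually used.

```python
import numpy as np

SAMPLE_RATE = 16_000  # Hz; assumed recording rate, not from the study

def spectrum_features(clip: np.ndarray, n_bins: int = 64) -> np.ndarray:
    """Reduce a mono clip to a coarse, length-independent magnitude spectrum."""
    mags = np.abs(np.fft.rfft(clip))
    # Pool fine FFT bins into n_bins bands so clips of different lengths
    # yield comparable feature vectors, then normalize out loudness.
    feats = np.array([band.mean() for band in np.array_split(mags, n_bins)])
    return feats / (np.linalg.norm(feats) + 1e-12)

class NearestCentroid:
    """Toy stand-in for the team's (unspecified) machine-learning model."""
    def fit(self, X, y):
        self.labels_ = sorted(set(y))
        self.centroids_ = {
            lab: np.mean([x for x, l in zip(X, y) if l == lab], axis=0)
            for lab in self.labels_
        }
        return self

    def predict(self, x):
        return min(self.labels_,
                   key=lambda lab: np.linalg.norm(x - self.centroids_[lab]))

# Synthetic demo: two "excretion events" with different dominant frequencies,
# standing in for the publicly sourced recordings the team collected.
rng = np.random.default_rng(0)
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
low = [np.sin(2 * np.pi * 120 * t) + 0.1 * rng.standard_normal(t.size) for _ in range(5)]
high = [np.sin(2 * np.pi * 2000 * t) + 0.1 * rng.standard_normal(t.size) for _ in range(5)]

X = [spectrum_features(c) for c in low + high]
y = ["urination"] * 5 + ["diarrhea"] * 5  # hypothetical labels
clf = NearestCentroid().fit(X, y)
print(clf.predict(spectrum_features(np.sin(2 * np.pi * 2000 * t))))
```

On real recordings, the spectra overlap far more than these clean sine tones, which is why the team validated the trained model against the controlled, repeatable sounds of the S.H.A.R.T. machine.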
The S.H.A.R.T. machine is a couple of feet wide and has loads of nozzles and attachments. The team pumps water through the machine and records the sounds. They learned the physics behind the sound of each excretion and designed the device to simulate those same dynamics — tinkering with different attachments for each subsystem. “A lot of thought went into each of the sounds,” Gatlin says. “There was a subsystem for each sound on this little machine.”
“It actually performs pretty well,” she continues. Their algorithm identified the correct “excretion event” up to 98 percent of the time, according to early data.
The team is also exploring the fundamental physics at play. In the conference presentation, Ancalle described how the team modeled the sound of male urination (streams turning into droplets that splash in succession).
If the geometry of the urethra changes, so do the stream and the sound. Now, Ancalle is working with urologists to use the same machine-learning approach to detect irregular changes in urination and flatulence based on this idea.
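A back-of-envelope model shows why geometry maps to sound. In the textbook picture of jet breakup (the Plateau–Rayleigh instability), a stream of diameter d pinches into droplets roughly every 4.5 d along its length, so droplets land at a rate of about v / (4.5 d). This is one standard model, not necessarily the one Ancalle presented, and the numbers below are illustrative guesses rather than measured values.

```python
def droplet_impact_frequency(diameter_m: float, speed_m_s: float) -> float:
    """Approximate droplet impact rate for a stream breaking up via the
    Plateau-Rayleigh instability: one droplet per breakup wavelength,
    lambda ~= 4.5 * d, so f ~= v / (4.5 * d) impacts per second."""
    wavelength = 4.5 * diameter_m
    return speed_m_s / wavelength

# Illustrative values only: a ~4 mm stream at 1.5 m/s versus a
# hypothetically narrowed ~2 mm stream at the same speed.
f_baseline = droplet_impact_frequency(0.004, 1.5)
f_narrowed = droplet_impact_frequency(0.002, 1.5)
print(round(f_baseline), round(f_narrowed))  # narrower stream -> faster patter
```

Halving the stream diameter doubles the impact rate in this model, which is the kind of audible shift a microphone-based screen could plausibly flag.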
“Self-reporting is not very reliable,” Ancalle says. “We're trying to find a non-invasive way where people can get a notification on whether or not they should go get checked out. Like ‘Hey, your urine is not flowing at the rate that it should. Your farts are not sounding the way they should. You should check it out.’” They propose that changes in the tract — from cancer or another condition — would manifest in these acoustics.
“It’s reasonable to assume that you can detect it with microphones,” says Jared Barber, an applied mathematician from Indiana University who chaired the session but isn’t involved in the research. Ancalle has also been developing a female urination model but had completed only the male model in time for his presentation.
What’s next — The researchers are looking to expand their testing and eventually build a deployable device, which could include a tiny Raspberry Pi computer. Gatlin envisions pairing this project with ongoing sustainable toilet projects.
Barber notes that the work is very preliminary, but he was encouraged by the talk. “It seems like it could have a very large impact,” Barber says. “It all seems feasible. They are using techniques that can deliver the hopeful capability for diagnosis.”
It’s still early days, but the team is designing with the end product in mind. “We're not trying to come up with million-dollar equipment,” Ancalle says. “We are trying to make this something that can be afforded by just everyone, particularly since the project is focused on urban areas with weak health systems. The affordability aspect is very important for us.”