Karen Navarra, 67, died alone in her dining room in San Jose, California, the victim of a grisly homicide. There were no witnesses, but new reports say a single line of evidence helped the police department catch her killer. As she died, her Fitbit fitness tracker recorded the exact time of her heart’s panicked, final beats, narrowing the search to the only person who could have been with her when she finally flatlined.
Fitness data, tracked constantly on our wrists and in our pockets and extracted from private databases by police warrant, have become an important part of solving crimes in just a few years. As the San Jose murder shows, “fit leaking,” as University of Toronto senior researcher John Scott-Railton calls it, provides an unprecedentedly intimate window into aspects of human behavior we long assumed were private. That’s useful in a murder case, Scott-Railton cautions Inverse, but there’s so much more it can reveal to whoever can access it.
The Telltale Fitbit
Currently, it takes a police warrant to access user data stored on the servers of fitness tracking companies. After the San Jose Police Department obtained one, the data led to the September 25 arrest of Anthony Aiello, Navarra’s 90-year-old stepfather, for her murder. The New York Times reported that Navarra’s dining room was spattered with blood, and that Navarra herself was found slumped at her dining room table with “lacerations on her head and neck” and a large kitchen knife in her right hand.
But even more crucial was the device on her left wrist, a Fitbit Alta HR, which, upon release of Navarra’s data, revealed a “significant spike” in Navarra’s heart rate at 3:20 pm, followed by a “rapid slowing.” Finally, at 3:28 pm, the device stopped transmitting data. This time window helped police zero in on Aiello, whose car was parked outside during those critical eight minutes.
While this is a cool victory for Fitbit as a crime-fighting tool, Scott-Railton can’t help but ask: What other details about our private lives can fitness data reveal? “Whenever you have a new kind of data, especially one that criminals might not be thinking about, it’s bound to be the case that more and more investigations will want to use that data to solve traditional investigative problems,” he tells Inverse.
“As more and more streams of data exist about human behavior, it empowers investigators in new ways, but it also highlights just how much information exists within that data.”
“A Transcript of Human Behavior”
Navarra’s death isn’t the first murder case to be solved thanks to a wearable, and it won’t be the last. In April 2017, a Fitbit was used to charge Richard Dabate with murdering his wife after his version of events didn’t line up with the movement detected on his wife’s device. This April, a case in Australia used Apple Watch heart rate data to pinpoint the exact seven minutes in which a woman fought for her life and then lost consciousness. The data are useful for solving crimes because they reveal how humans behave in all sorts of circumstances, from the mundane to the morbid, Scott-Railton says.
“So with every successful investigation that is conducted with Fitbit data, what it also shows is that data is generating an extremely revealing transcript of human behavior,” he says. “This data is going to be used more and more, and then the question becomes what kind of oversight will that use have?”
Deciding who has oversight of personal location data — only part of what a fitness tracker has access to — has always been controversial. For example, in the early 2000s, cellphone records could reveal when a person was lying about their whereabouts. A series of contentious court cases culminated in this June’s Supreme Court decision in Carpenter v. United States, which established that police need a warrant to obtain those records. It was a protective layer of privacy in the wake of past decisions in district courts that allowed police to access this data without one.
All that controversy was focused only on data on location. But a fitness tracker knows far more than that: It knows how healthy you are. As such, Scott-Railton cautions, there may be other entities outside of law enforcement that might want — and very well may get — access to this intimate transcript of human behavior.
“It’s not limited to police investigators, it’s just that they can request that data,” Scott-Railton says. “The message to take away here is that data from fitness trackers is a valuable tool, but it’s also clearly a tool that people who collect it are going to be deriving value from because it tells so much about human behavior.”
Can You Misuse Tracker Data?
There are some perks to opting into sharing data. For example, some insurance companies and employers run wellness programs in which users share their fitness tracker data in return for cash rewards for reaching a certain step count each day.
From that perspective, it seems like wearable data is doing its part to tackle some of society’s darker aspects: We have Fitbits catching killers and exercise programs fighting declines in physical activity. But that’s no reason to forget that workout data doesn’t just stay untouched on your watch — and that it can reveal a lot about you, even if your name isn’t attached to it.
The world was reminded to stay cautious this January when Strava, a social network that allows users to share location data from workouts, released a global heat map showing exercise activities from all over the world. Unfortunately, this move also revealed the secret locations of certain military bases because soldiers were tracking workouts done at the base on their wearables and sharing that data with Strava’s servers.
Scott-Railton is cautious about embracing the hype around solving murder cases with fitness data. Catching criminals is just one of many uses for what has become one of the most detailed transcripts of human behavior in history, and where there’s information like that, there’s always the potential that someone might abuse it, despite our best efforts to protect privacy.
“The challenge with any new technology is to imagine the way it will be misused,” he says. “And I think the mistake we make is not to imagine the ways that bad apples may want to misuse that information.”