Facebook recently announced a revamped version of its user research app. Launched on June 11, Study from Facebook is essentially a more private version of Facebook Research, the controversial program that was shut down after TechCrunch revealed it was paying users, including some teens, up to $20 per month in gift cards in exchange for root access to their phones.
Not only was Facebook allowing teens under 18 to sign up for the program, and using targeted ads to entice them to join, but the access Facebook received let the company monitor not just what apps they were using but also their web activity, data potentially belonging to their friends, and even encrypted information. After a vocal backlash, Facebook Research has been transformed into Study from Facebook, which adds a few new privacy guardrails.
The key improvement? It will no longer track your web activity or monitor encrypted information, but it still monitors your app activity.
To sign up, you’ll have to be targeted by a Facebook ad. After clicking the ad, Facebook will go through a number of steps to verify your identity. Once you’re up and running, Facebook says it collects and analyzes information such as:
- The apps installed on a participant’s phone
- The amount of time spent using those apps
- A participant’s country, device, and network type
- App activity names, which may show us the names of app features participants are using
The announcement states that Facebook will not collect user IDs, passwords, photos, videos, or messages. The company says it will not sell information from the app to third parties or use it to target ads. Finally, it will let you know how your data is being used, and intermittently send you an update reminding you the program is running.
Study from Facebook offers the transparency guardrails that were absent from Facebook Research, but still perpetuates a number of troubling dynamics when it comes to how we think about privacy.
Privacy Is a Right, Not a Commodity
Continuing to pay users for their data contributes to the increasingly prevalent standard that consumers should have to pay for their privacy in one way or another. In the case of Study from Facebook, you are forgoing money by refusing to give access to your app activity. Apple does this too, albeit in a different way, by making privacy such an important plank of its marketing and product strategy that it could be reasonably described as a “privacy as a service” company, as TechCrunch did after WWDC.
Individuals could be asked to pay more for a service if they don’t want to be tracked, or pay a discounted price if they agree to give up their data. These are far from hypotheticals: AT&T previously offered users a $30 discount on monthly broadband service if they allowed browser tracking.
Put another way, this is essentially charging customers for un-monitored internet.
Privacy Is a Luxury for Those Who Can Afford It
There are some other disconcerting precedents. Facebook announced that it will release Study from Facebook only on Android, where users can grant apps greater access to their phones than on iOS (Apple has even gone so far as to ban these kinds of apps, no doubt as part of its privacy-as-a-service strategy).
Android phones are disproportionately purchased by individuals of lower socioeconomic status in the US. And in India, the only other country where Study from Facebook has launched, 97% of smartphones are Android, with many of them costing less than $100, according to Quartz.
People with less money have a greater incentive to share their data with Facebook, and Study from Facebook is readily available through the Google Play Store, which comes preloaded on Android phones. For poorer people, then, there are more avenues and more reasons to use this product, and in doing so, to give up their privacy.
Scrutiny Hasn’t Affected Facebook’s Core Behavior
It’s also worth noting that this is how Facebook behaves under scrutiny. Not only has presidential candidate Elizabeth Warren expressly called for the breakup of large tech companies like Facebook, but the Federal Trade Commission is also examining Facebook for evidence of antitrust violations, according to the Wall Street Journal.
After the scandals, Facebook announced a pivot to privacy and discussed integrating the backends of Instagram, Facebook, and WhatsApp, in addition to adding end-to-end encryption (which WhatsApp already featured). Continuing to incentivize users to surrender their privacy clearly undermines these initiatives.
Study from Facebook allows the company to continue doing precisely what got it to its current position of power: monitoring app use at scale so it stays ahead of the curve on potential acquisitions. At the end of the day, Facebook’s core business model is collecting and monetizing data about its users. That doesn’t have to take the form of pervasive data extraction you never consented to; it can happen with your permission. But the goal is still data extraction, even if the company is paying you for it.
This doesn’t even count the ways that Facebook still analyzes and utilizes the data from how you engage with the platform, outside of this research app.
In December 2018, Privacy International released a report examining the data traffic of 34 Android apps with between 10 million and 500 million installs. Twenty-three of those apps were automatically sending data to Facebook. Some have since been updated, but far from all of them, according to an updated version of the report released in March 2019, the day before Zuckerberg announced the company’s pivot to privacy.
Study from Facebook’s changes are a positive development. But they don’t fundamentally change the way our conversation about privacy is happening, and the app still treats user data as the main way Facebook bolsters its business.