The collection of private data is neither a social good nor a social ill. It is a reality of modern life that has facilitated the creation of both experiences consumers embrace and surveillance tools the public fears. The convenience of big data and its role in eroding norms around privacy are not at odds — people allow businesses access to their information in return for free services and streamlined interactions. In fact, the one facilitates the other. Information given away in transactions is added to the pool of information that is surreptitiously taken, potentially allowing companies or agencies to understand crowds on an individual level. This is what experts mean when they call this the “Golden Age of Surveillance,” an era that is likely to continue until the American public makes up its mind about data collection — or comes to understand the social and cultural implications of failing to do so.
Pew polls reveal we’re a mess of contradictions when it comes to privacy. Some 57 percent of Americans think it’s unacceptable for the government to monitor the communications of U.S. citizens, 52 percent are concerned about government surveillance of their data, and 46 percent are not. Given that those numbers imply that at least 3 percent of Americans think surveillance is unacceptable but aren’t concerned by it, it’s fair to say that people aren’t fully getting their arms around the issue. This problem is likely a product of a bad-guys-versus-good-guys, “I have nothing to fear” mentality. But that dialectic veils a major issue: the very real psychological effect of living under surveillance.
Stanford Cyber Initiative Fellow Tara Behrend studies the psychology of information privacy. She says that the rise of surveillance won’t divide criminals from model citizens, but will instead separate and potentially victimize a massive psychographic. Behrend walked Inverse through the four major issues that crop up at the intersection of psychology and surveillance.
Behavior and Beliefs Don’t Line Up
How monitoring is framed and described affects how people react to it. When it comes to how willing someone is to give up their information, Behrend says current research takes less of a psychological approach and more of an economic one. Privacy-related decisions are expected to be determined by a “privacy calculus”: your privacy and your data are treated as a commodity, and you weigh the pros and cons of trading that commodity away. A team of Swiss and German researchers broke down the idea of a privacy calculus in their 2013 paper as a “situation-specific tradeoff of privacy-related risk and benefit perceptions, bounded by dispositional tendencies and irrational behavior.”
In other words, people figure out the pros and cons. But irrationality always has its day. In a study of German university students, 78 percent said in a questionnaire that they knew it was risky to click on a link in an email sent by a stranger. But when the students were sent an anonymous email claiming to contain a link to photos taken of them at a party, 56 percent of those who said they knew better clicked the link anyway. They decided the benefit of seeing the pictures outweighed the risk of a potential hack.
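The tradeoff the researchers describe can be reduced to a toy model: a perceived benefit weighed against a perceived risk, with an immediate, vivid payoff inflating the benefit side. The function and all the weights below are purely illustrative — they come from neither the 2013 paper nor the German study:

```python
# Toy sketch of a "privacy calculus": benefit vs. risk, distorted by
# salience. Weights are illustrative only, not from the cited research.

def will_click(perceived_benefit: float,
               perceived_risk: float,
               salience: float = 1.0) -> bool:
    """Return True if the perceived payoff outweighs the perceived risk.

    A salience multiplier above 1.0 models a vivid, personal benefit
    (e.g. party photos of yourself) inflating the benefit side.
    """
    return perceived_benefit * salience > perceived_risk

# An abstract link from a stranger: the known risk wins.
print(will_click(perceived_benefit=0.3, perceived_risk=0.7))  # False

# Same risk, but the payoff is personal and vivid: many click anyway.
print(will_click(perceived_benefit=0.3, perceived_risk=0.7,
                 salience=3.0))  # True
```

The point of the sketch is that nothing about the risk term changes between the two calls — only the salience of the benefit — which mirrors how the students’ stated knowledge stayed constant while their behavior flipped.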
Social Obligation Triggers Sharing
Impression management is the process — sometimes conscious, sometimes not — people use to determine and manipulate how other people see them. Behrend says that she’s found that people who are “high impression managers,” the ones who care a lot about how people think of them, are generally more comfortable sharing information online.
“It is totally counter-intuitive to me,” Behrend says. “I would have thought that if you score high on impression management, then you would be more worried about the information people might see on, say, your Facebook profile or your LinkedIn. But, in fact, it’s the opposite.”
Behrend says the reason for this remains unclear, but it may have more to do with complying with perceived expectations. “It’s not that they feel more comfortable doing it,” she says, “but that they feel obligated to do so.”
Top Down Solutions Don’t Work
“The objection I have to a lot of technology people who talk about these issues is that they think the solution is that people ‘just have to be smart about privacy’ — you just ‘have to know where your data is,’” says Behrend. “That’s not a reasonable thing to ask, and it also implies that if people did know more, then they would change their behavior. I don’t agree with that.”
This comes back to differences in impression management, in how pros and cons are weighed, and in general psychological outlook. Behrend argues that you can’t hypothesize that a form of technology will affect people in one exact way, because there’s ample evidence that differing attitudes create a spectrum of reactions.
“We all know people who are very paranoid about their email, and other people who don’t care and use it as a diary,” Behrend says. “It’s not really about the email — it’s about the people using it.”
Data Leads to Manipulation Leads to Data
When asked whether private companies and the government leverage psychology to pressure people into sharing information, Behrend didn’t mince words.
“I expect they do this and I view that sort of manipulation as unethical,” she says. “I don’t think we should be encouraging people to share their data. We should hold organizations and government agencies accountable for collecting only the data they need to provide their services, and not treat data like a commodity.”
That puts Americans in an interesting position. Donald Trump has scolded agencies for spying on civilians, but he has also evinced an interest in domestic data collection and surrounded himself with law-and-order types generally supportive of broadly defined government oversight. What does that mean for American citizens? It’s totally unclear. The executive branch can only do so much, and the cultural conversation remains lively. Legislation and governance likely won’t make sense until we make sense of the issue.
“We’re in an important period of change right now,” says Behrend. “The laws and protections we have developed for citizens, employees, and students are outdated. Pervasive data collection presents a real danger: of discrimination, harassment, and just plain bad decisions.”