If you’re concerned about privacy, you might hear advice about using Linux, Tor, Signal, or PGP. To most people, those words mean nothing.

One strategy you probably won’t hear about is replacing your friends’ names with emojis.

Your mom (or a jealous boyfriend) has no idea who 💩 is — but you do. That was the approach taken by one user in a study conducted by Simply Secure, a non-profit that seeks to help experts understand how everyday people are using, or not using, the latest in encryption technology. The small study asked 12 people in the Harlem and Brownsville neighborhoods of New York City about their thoughts on privacy. The greatest concern wasn’t the NSA — government surveillance was taken as a given — but in-person spying by friends, co-workers, family, or others. If a messaging service didn’t offer ephemeral texts, the way Snapchat does, they weren’t interested.

There is a disconnect between the people who develop secure ways to communicate and the people who need ways to communicate securely. That was the thesis put forward by Sara “Scout” Brody, of Simply Secure, in a presentation this week at the O’Reilly Conference in New York City. Brody used several real-world examples to show that until developers and experts listen to the rest of the population, these groups will keep talking past each other. At best, that means no one takes advantage of great new programs. At worst, it means ordinary people will find ways around clunky security measures and could inadvertently introduce all kinds of vulnerabilities.

One example comes from Brody’s own research in health care facilities. She interviewed the IT team at one hospital, which assured her that all security practices were being followed and there were no issues. Later, hanging out at the nurses’ desk, Brody heard nurses talk about how they don’t sign out after shifts because it’s a pain for the next person to sign in. They shared binders with passwords written out in longhand. Some monitors had sticky notes reminding coworkers not to log out, since logging back in slowed down the workflow.

The IT team later installed a proximity sensor to automatically log a user out when they left a terminal. Although there was grumbling at first, the complaints stopped, and the team assumed the new system was working. In reality, the nurses had put a Styrofoam cup over the sensor so it would no longer log them out.

“Security systems that don’t work for users don’t work,” Brody said.

It’s easy to imagine an expert ripping their hair out on hearing about the Styrofoam cup work-around. But in a health-care context, as Brody lays it out, if you’re about to give someone an injection, you need to know what they’re allergic to right away — which means you don’t have time to log in to whatever machine you happen to be standing next to.

For Brody, these problems need to be fixed on the development side, because pushing people to change their everyday habits is a long, slow process. “We need to build systems for people as they actually live,” Brody said.

Photos via Getty Images / Theo Wargo, Flickr / xelipe

John Knefel is a freelance journalist covering national security and civil liberties. He is the co-host of Radio Dispatch, a daily political podcast.