It used to be easy to tell, as your fluttering heart and damp palms gave way to a full-body blush, whether you were attracted to a potential crush. But faced with the glow of a Tinder screen, there’s more time to override our physical impulses, think, and overthink. Our inherent neuroses don’t do us any favors in the realm of digital romance, where the nuances of finding love are distilled into a single choice: Right or left? Nicole He, creator of the True Love Tinder Robot, thinks it’s time we put those decisions in the hands of machines. Robots, at least, don’t psych themselves out.

In her work at NYU’s Interactive Telecommunications Program, He has been pushing the limits of our relationships with computers, exploring just how much we’re willing to put our lives in their hands. “That’s something we do, whether we think about it or not,” she tells Inverse, pointing out how quickly we allowed Fitbits and FuelBands to tell us what we should do with our bodies. Her Tinder Robot essentially does the same thing, only it looks to tell us who we should be doing, as well. Still, it’s just another form of outsourced personal data analysis. Why should we treat it any differently?

Using a sensor that picks up biometric data from a user’s palms as he or she views potential Tinder matches, the Tinder Robot takes on the burden of swiping, putting it in the hands of, well, a mechanical hand He describes as “creepy.” The sensor measures the galvanic skin response — that is, how much sweat your potential mate’s selfie elicits from your helpless palms — and swipes left or right accordingly. While nothing about your initial physiological response changes, interpreting what to do with your lusty perspiration (or lack thereof) becomes the robot’s responsibility, not yours.
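The decision rule He describes can be sketched in a few lines. Everything below is illustrative: the function names, the baseline comparison, and the conductance numbers are invented for the sketch, not taken from her actual build.

```python
# Illustrative sketch of the robot's swipe logic (not He's actual code):
# compare a galvanic skin response (GSR) reading taken while a profile
# is on screen against a resting baseline, and swipe accordingly.

def decide_swipe(gsr_reading, baseline, threshold=1.5):
    """Swipe right if palm conductance rises enough above baseline."""
    return "right" if gsr_reading >= baseline * threshold else "left"

# Simulated readings in arbitrary conductance units (made-up numbers)
baseline = 2.0
for profile, reading in [("profile_a", 3.4), ("profile_b", 2.1)]:
    print(profile, "->", decide_swipe(reading, baseline))
```

The point the sketch makes is the same one the project makes: once the threshold is set, the choice is no longer yours.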

“While you’re swiping on Tinder, instead of giving you that control over making your own decisions, the idea is that the robot reads your body and knows better than you about whether you should swipe left or swipe right,” He says. In a sense, her project posits that both humans and robots are computers being fed the same biometric data. The robots, in this case, are simply better at analyzing it than we are.

How we define better, however, is entirely up to us. What does perfectly executed analysis of biometric data look like? Is it something we’re incapable of doing without a computerized crutch? How much faith should we have in our physiological responses in the first place? He doesn’t have the answers, but she doesn’t doubt that our curiosity, trumping our hesitation about algorithmically selected lovers (which, if you think about it, is not that foreign a concept), will help us find them in the near future.

Her design is, of course, missing other biometric inputs traditionally linked to attraction; she has also considered heart rate, pupil dilation, and facial expression. The real question is why we’re willing to trust computers to interpret our bodies better than our brains can. Our confidence in personal tech is a trend she thinks is only going to continue.

“I bet in two years, this would not be an interesting project because it would just be kind of normal,” He says. “Not in this creepy robot hand form, but allowing the tech to read data from our bodies to do things like this.”

Photos via Nicole He