Buzz off

These insects are annoying, but they might just save your life

By studying how flies react to stimuli in this controlled, enclosed environment, researchers can improve search and rescue robots.


For humans, one reality is never enough. But what about for flies?

Humans have designed different virtual worlds to recover from traumas, learn to be more empathetic, or even just to play video games. And now, researchers have designed a similar virtual experience for these common household pests: flies.

By placing the common apple fly, as well as other flying insects like mosquitoes, in a controlled virtual environment, researchers can better understand how they experience and respond to different sensory stimuli such as wind or smell.

While understanding these most common of insects is a reward unto itself, the research could also change how scientists design search and rescue robots.

Insects (including common flies) are among the most diverse multicellular taxa on Earth, making them an extremely valuable source of knowledge, Shannon Olsson, co-author and director of the National Centre for Biological Sciences' Naturalist-Inspired Chemical Ecology (NICE) research group, tells Inverse.

"We're really interested in insects for a number of reasons," says Olsson. "One is that they're by far the most numerous taxa on the planet.... They're also able to do a lot of really remarkable behaviors. If you think about how a mosquito in your room at night is always able to find you, no matter how you try to avoid it... or how well a pollinator can find flowers long distances away... They're able to do this with really very, very tiny brains.... Because of that, insects have been really inspirational for many scientists who are trying to look at how they can reduce computational requirements but still do very, very complex behaviors."

But researchers tell Inverse that trying to control both these insects and the sensory stimuli in experiments can be extremely challenging.

So, instead of chasing these flies through the lab all day, the researchers designed an entire virtual 3D world for just a single, small fly, Pavan Kaushik, a doctoral researcher in the NICE group and the study's first author, tells Inverse. The study was published Monday in the journal Proceedings of the National Academy of Sciences.

"It's more of a practical issue," says Kaushik. "[Insects] migrate over 100 kilometers... so tracking such a tiny thing over such a large scale is just not possible.... In virtual reality [the fly] is stuck to a tiny needle but you can make the fly, in principle, fly for kilometers [and] also know what its seeing, feeling or smelling."

To immerse the flies in the VR environment, the researchers tethered them in place with a pin inside a semi-enclosed circular VR chamber. The virtual scene was designed to move in sync with the movement of the tethered flies' wings, giving them the sensation of flying naturally through an environment of grass and trees. Once a fly was in the environment, the researchers could carefully control what stimuli it encountered, and they designed experiments to test how flies parse and respond to multiple forms of stimuli at once, or, in other words, how smelling and seeing something causes them to react differently than seeing it alone.
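To get a rough feel for the kind of closed-loop control this involves, here is a minimal Python sketch of how a tethered fly's wingbeat signal might be translated into rotation of a virtual scene. It is an illustration only, not the authors' actual system: every function name and parameter here (read_wingbeat_amplitudes, render_scene, GAIN) is hypothetical.

```python
# Minimal sketch of a closed-loop VR update step, assuming a hypothetical
# setup in which the difference between left and right wingbeat amplitudes
# is read from a sensor and used to rotate the virtual scene around the fly.
# All names and numbers are illustrative, not from the published study.

import random
import time

GAIN = 0.05          # degrees of scene rotation per unit of wingbeat asymmetry
heading_deg = 0.0    # the fly's current heading inside the virtual world


def read_wingbeat_amplitudes():
    """Stand-in for a real wingbeat sensor: returns (left, right) amplitudes."""
    return random.uniform(40, 60), random.uniform(40, 60)


def render_scene(heading):
    """Stand-in for the VR renderer: just report the heading."""
    print(f"rendering virtual grass and trees at heading {heading:6.1f} deg")


for _ in range(5):                      # a few simulated control-loop ticks
    left, right = read_wingbeat_amplitudes()
    # When the fly beats one wing harder it is trying to turn, so the scene
    # is rotated the opposite way to simulate that turn.
    heading_deg = (heading_deg + GAIN * (left - right)) % 360
    render_scene(heading_deg)
    time.sleep(0.01)                    # real systems run this loop much faster
```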

For their behavior studies, the team chose to focus on a well-understood member of the insect taxa: the apple fly, Rhagoletis pomonella. Because apple flies only interact with apple trees (Malus domestica), the team was able to render only these trees in the VR environment and test the fly's capacity for something called motion parallax, or how it judges the depth of objects against complex background environments.

When the researchers rendered apple trees of the same size but had them loom closer to the fly at different rates, the fly chose the "closer" tree, showing that it could extract depth from motion when choosing its target.
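To see why the "closer" tree stands out even when both trees are the same physical size, consider the underlying geometry: an object nearer to a moving observer both looks larger and sweeps across the visual field faster. The short Python sketch below illustrates that relationship with made-up numbers; it is not code or data from the study.

```python
# Minimal sketch of the geometry behind motion parallax, assuming two trees
# of identical physical size at different distances from a fly moving
# sideways at a fixed speed. The numbers are illustrative only.

import math

TREE_SIZE_M = 2.0        # both virtual trees are the same physical size
FLY_SPEED_M_S = 0.5      # sideways speed of the (virtual) fly

for distance_m in (1.0, 4.0):
    # Angular size: how large the tree looks from this distance.
    angular_size_deg = math.degrees(2 * math.atan(TREE_SIZE_M / (2 * distance_m)))
    # Angular velocity across the visual field: closer objects sweep past faster.
    angular_speed_deg_s = math.degrees(FLY_SPEED_M_S / distance_m)
    print(f"tree at {distance_m:.0f} m: "
          f"{angular_size_deg:5.1f} deg wide, "
          f"moves at {angular_speed_deg_s:5.1f} deg/s")
```

Running this prints a nearby tree that appears both wider and faster-moving than a distant tree of the same size, which is exactly the kind of cue a fly could exploit to pick the closer target.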

Scientists exposed the fly subjects to different air flow and scent stimuli to see how they would use this information to interact with the virtual world around them.

Photograph courtesy of Shoot for Science: Deepak Kakara, Dinesh Yadav, Sukanya Olkar, and Parijat Sil.

The researchers were also able to see how stimuli like puffs of air or added smells affected this ability. They found that while flies can orient themselves based on puffs of air alone (without needing visual cues as well), they were unable to do the same when exposed to scent stimuli alone.

Going forward, the researchers say that the information collected from these experiments can be used for slightly more human endeavors as well: search and rescue robots.

"One of the things that came out of our study was that flying insects can utilize motion parallax," says Olsson. "[But] now we need to understand what exactly they're utilizing in the environment to allow them to do that.... We now really have to disentangle those cues and that can give us really fundamental inputs for what they require in order to do things like motion parallax and those are key parameters that could be used for robotics."

Search and rescue robots come in all shapes and sizes, and Kaushik tells Inverse that providing them with naturalistic information from these flies, rather than motion data from insect experiments in unrealistic environments, can improve how the robots interact with the real world, traverse potentially dangerous terrain, and even save our lives.

Abstract: The exemplary search capabilities of flying insects have established them as one of the most diverse taxa on Earth. However, we still lack the fundamental ability to quantify, represent, and predict trajectories under natural contexts to understand search and its applications. For example, flying insects have evolved in complex multimodal three-dimensional (3D) environments, but we do not yet understand which features of the natural world are used to locate distant objects. Here, we independently and dynamically manipulate 3D objects, airflow fields, and odor plumes in virtual reality over large spatial and temporal scales. We demonstrate that flies make use of features such as foreground segmentation, perspective, motion parallax, and integration of multiple modalities to navigate to objects in a complex 3D landscape while in flight. We first show that tethered flying insects of multiple species navigate to virtual 3D objects. Using the apple fly Rhagoletis pomonella, we then measure their reactive distance to objects and show that these flies use perspective and local parallax cues to distinguish and navigate to virtual objects of different sizes and distances. We also show that apple flies can orient in the absence of optic flow by using only directional airflow cues, and require simultaneous odor and directional airflow input for plume following to a host volatile blend. The elucidation of these features unlocks the opportunity to quantify parameters underlying insect behavior such as reactive space, optimal foraging, and dispersal, as well as develop strategies for pest management, pollination, robotics, and search algorithms.