|LED flight arena used by Dr. Michael Reiser and others at HHMI|
This brilliant approach, however, has a drawback. A tethered animal can behave or "think" differently due to the lack of other sensations associated with flight (e.g. inertial forces, air flow, and ambient light variations), so the measured neural or behavioral output can be skewed. For animals with relatively simple nervous systems (e.g. flies) performing stereotypical behaviors, tethered flight is quite representative of real flight. For animals with more complicated brains performing behaviors that involve sensory fusion, tethered experiments may be insufficient.
One new trend in animal flight research is to perform visual guidance experiments on freely flying animals. Instead of controlling the visual cues directly, we control the visual environment in which the animal flies. By tracking the head orientation and the whole-body flight trajectory, one can reconstruct the visual field of a flying animal. With proper calibration, one can actually reconstruct the entire image the animal sees at each instant.
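To make the reconstruction step concrete, here is a minimal sketch of the core geometric operation: given a tracked head position and head orientation, transform a known point in the arena into the animal's head-centered frame and express it as a viewing direction (azimuth and elevation). The function name and the yaw-pitch-roll convention are my own assumptions for illustration, not the conventions of any particular lab's pipeline.

```python
import numpy as np

def head_frame_direction(point_world, head_pos, yaw, pitch, roll):
    """Viewing direction (azimuth, elevation, radians) of a world point
    as seen from the animal's head.  Assumes an aeronautical-style
    convention: yaw about z, pitch about y, roll about x (a hypothetical
    choice for this sketch; real tracking systems define their own)."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    # Head-to-world rotation R = Rz @ Ry @ Rx; world-to-head is its transpose.
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    R = Rz @ Ry @ Rx
    # Vector to the point, expressed in the head frame.
    v = R.T @ (np.asarray(point_world, float) - np.asarray(head_pos, float))
    azimuth = np.arctan2(v[1], v[0])
    elevation = np.arctan2(v[2], np.hypot(v[0], v[1]))
    return azimuth, elevation
```

Applying this to every visible surface point of the arena (or, with a calibrated display, to every pixel) at every tracked frame yields the sequence of retinal images the animal experienced during flight.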
One last thing we must discuss is the philosophy behind this kind of experiment. In tethered experiments, we control the visual input precisely because the animal's visual field is fixed to a specific display screen. We can treat the visual cues as independent variables while measuring the behavioral or neural output as the dependent variables. In free-flight experiments, however, the animal's flight path and head orientation determine the visual input, which in turn guides the animal onto its future trajectory. The reconstructed visual input is therefore no longer a freely manipulated independent variable for the near-future flight trajectory. Since the animal's flight trajectory directly produces the visual sensation that then affects the future trajectory, we face a "chicken and egg" problem. One way around this problem is to construct a navigation model that predicts flight paths. I will explain this approach in the next post.
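The closed loop described above can be sketched as a toy simulation, assuming a deliberately simple, hypothetical steering rule (proportional turning toward a target's azimuth). The point is not the rule itself but the structure of the loop: each step's visual input depends on the current pose, and the next pose depends on that visual input, which is exactly why vision cannot be treated as a clean independent variable in free flight.

```python
import numpy as np

def simulate_closed_loop(target, pos, heading, gain=0.3, speed=1.0, steps=50):
    """Toy 2-D closed-loop simulation.  At each step the 'visual input'
    is the target's azimuth in the animal's head frame; a hypothetical
    proportional steering rule (gain, speed are made-up parameters)
    turns the heading toward it.  Trajectory and visual input thus
    co-determine each other, illustrating the chicken-and-egg loop."""
    traj = [np.asarray(pos, float)]
    for _ in range(steps):
        p = traj[-1]
        bearing = np.arctan2(target[1] - p[1], target[0] - p[0])
        # Visual input: target azimuth relative to the current heading,
        # wrapped to (-pi, pi].
        azimuth = (bearing - heading + np.pi) % (2 * np.pi) - np.pi
        heading += gain * azimuth   # steering depends on the visual input...
        step = speed * np.array([np.cos(heading), np.sin(heading)])
        traj.append(p + step)       # ...and the new pose sets the next view
    return np.array(traj)
```

A navigation model of this general form, fit to real trajectories, lets one predict the flight path from the reconstructed visual input and thereby break the circularity.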