Huai-Ti's motto in biomechanics research


Huai-Ti's motto in biomechanics research

Ask questions like a physicist:
Aim for the most general, fundamental questions that apply across multiple organisms and systems.

Think like a biologist:
Frame hypotheses by considering the evolutionary constraints and the life history of the organisms/systems.

Work like an engineer:
Break the research project down into engineering tasks and clear each technical hurdle one by one.

Wednesday, August 31, 2011

Untethering the visual guidance experiments

Many animals rely on vision to guide their locomotor behaviors. This is especially critical for flying animals, which need to integrate visual cues quickly. A classic approach to questions in visual guidance involves tethering the animal to a stationary mount and presenting a series of carefully designed visual stimuli. By measuring the neural activity or locomotor behavior, scientists can deduce some basic components of flight control.

LED flight arena used by Dr. Michael Reiser and others at HHMI

This brilliant approach, however, has a drawback. A tethered animal can behave or "think" differently due to the lack of the other sensations associated with flight (e.g., inertial forces, airflow, and ambient light variations), so the measured neural or behavioral output can be skewed. For animals with relatively simple nervous systems (e.g., flies) performing stereotyped behaviors, tethered flight is quite representative of real flight. For animals with more complicated brains performing complicated behaviors that involve sensory fusion, tethered experiments may be insufficient.

One new trend in animal flight research is to perform visual guidance experiments on freely flying animals. Instead of controlling the visual cues directly, we control the visual environment the animal flies through. By tracking the head orientation and the whole-body flight trajectory, one can reconstruct the visual field of a flying animal. With proper calibration, one can actually reconstruct the entire image the animal sees at each instant.
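To make the geometry concrete, here is a minimal sketch in Python of how a tracked world point maps into the animal's visual field, given the head position from the cameras and a head orientation. The function names and the head-frame convention (x forward, y left, z up) are my own assumptions for illustration, not part of any particular tracking pipeline:

```python
import numpy as np

def world_to_head_frame(point_w, head_pos_w, R_head_to_world):
    """Express a world-frame point in the head-centered frame.

    point_w, head_pos_w : (3,) world coordinates (e.g., from camera tracking)
    R_head_to_world     : (3, 3) rotation of the head relative to the world
    """
    return R_head_to_world.T @ (point_w - head_pos_w)

def gaze_angles(point_h):
    """Azimuth/elevation (radians) of a point in the head frame.

    Assumed convention: x = forward, y = left, z = up.
    """
    x, y, z = point_h
    azimuth = np.arctan2(y, x)                 # left-positive
    elevation = np.arctan2(z, np.hypot(x, y))  # up-positive
    return azimuth, elevation

# Example: where does an obstacle at (2, 0.5, 1.2) m appear to a bird
# at (0, 0, 1) m whose head is yawed 30 degrees to the left?
yaw = np.deg2rad(30)
R = np.array([[np.cos(yaw), -np.sin(yaw), 0],
              [np.sin(yaw),  np.cos(yaw), 0],
              [0,            0,           1]])
obstacle_h = world_to_head_frame(np.array([2.0, 0.5, 1.2]),
                                 np.array([0.0, 0.0, 1.0]), R)
az, el = gaze_angles(obstacle_h)
print(np.rad2deg(az), np.rad2deg(el))
```

Repeating this mapping over every tracked frame, and over the whole visual scene rather than a single point, is what "reconstructing the visual field" amounts to.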
One major challenge is to expand the 3D tracking volume enough to capture flight trajectories long enough for visual guidance analyses. With a limited number of high-speed cameras, this usually comes at the cost of resolution, and degraded resolution in turn reduces the precision of the 3D reconstruction. Thanks to breakthroughs in sensors and wireless communication, we can now deploy inertial measurement units (IMUs) on flying animals to measure fine rotations, offloading that burden from the optical tracking. For my experiments, the high-speed cameras only need to pin down the overall body position and head position; all the rotations can be sensed by the IMUs on the bird.
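Here is a sketch of that division of labor: the cameras supply positions, while the gyroscope rates are integrated into orientations. The function names and the naive dead-reckoning integration are my own illustration; a real pipeline would also correct gyro drift, e.g., against gravity or magnetometer readings:

```python
import numpy as np

def rotation_from_axis_angle(axis_angle):
    """Rodrigues' formula: rotation matrix for a rotation vector (radians)."""
    theta = np.linalg.norm(axis_angle)
    if theta < 1e-12:
        return np.eye(3)
    k = axis_angle / theta
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def fuse_pose(cam_positions, gyro_rates, dt, R0=np.eye(3)):
    """Naive fusion: cameras supply position, the gyro supplies orientation.

    cam_positions : (N, 3) positions from the optical tracking
    gyro_rates    : (N, 3) body-frame angular velocities (rad/s)
    Returns a list of (position, body-to-world rotation) per frame.
    Drift correction is deliberately omitted in this sketch.
    """
    R = R0
    poses = []
    for p, w in zip(cam_positions, gyro_rates):
        poses.append((p, R))
        R = R @ rotation_from_axis_angle(w * dt)  # dead-reckoning step
    return poses

# Example: a constant yaw rate of 0.5 rad/s while flying along +x.
positions = np.linspace([0, 0, 1], [2, 0, 1], 50)
rates = np.tile([0.0, 0.0, 0.5], (50, 1))
poses = fuse_pose(positions, rates, dt=0.02)
```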

One last thing we must discuss is the philosophy behind this kind of experiment. In tethered experiments, we control the visual input precisely because the animal's visual field is fixed to a specific display screen: we can treat the visual cues as independent variables and the behavioral or neural output as the dependent variables. In free-flight experiments, however, the animal's flight path and head orientation determine the visual input, which then guides the animal onto its future trajectory. The reconstructed visual input can therefore be treated as an independent variable only for the near-future flight trajectory. Since the animal's flight trajectory directly produces the visual sensation, which in turn affects the future trajectory, we face a "chicken and egg" problem. One way around it is to construct a navigation model that predicts flight paths. I will explain this approach in the next post.
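The coupling is easy to see in a toy closed-loop simulation. This is entirely illustrative and not any specific published navigation model: the agent's current state determines its visual input (here reduced to an inverse time-to-contact with a wall ahead), and that input in turn steers the next state:

```python
import numpy as np

def simulate_closed_loop(x0, heading0, wall_x=5.0, speed=1.0,
                         gain=2.0, dt=0.02, steps=200):
    """Toy closed loop: state -> visual input -> steering -> next state.

    The 'visual input' is an inverse time-to-contact with a wall at
    x = wall_x; the steering law turns the agent as looming grows.
    """
    x, y = x0
    heading = heading0
    path = []
    for _ in range(steps):
        path.append((x, y))
        distance = max(wall_x - x, 1e-3)              # range to the wall
        tau_inv = speed * np.cos(heading) / distance  # inverse time-to-contact
        heading += gain * tau_inv * dt                # steer away from looming
        x += speed * np.cos(heading) * dt
        y += speed * np.sin(heading) * dt
    return np.array(path)

path = simulate_closed_loop(x0=(0.0, 0.0), heading0=0.0)
print(path[-1])  # the path bends away from the wall as looming increases
```

Because the "stimulus" on each step is generated by the agent's own motion, no single step can be labeled the independent variable; a navigation model breaks the loop by predicting the trajectory from the reconstructed visual input and comparing it with the observed one.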
        
