Huai-Ti's motto in biomechanics research

Ask questions like a physicist:
Aim for the most general, fundamental questions that apply to multiple organisms and across different systems.

Think like a biologist:
Frame hypotheses by considering the evolutionary constraints and the life history of the organisms/systems.

Work like an engineer:
Break the research project down into engineering tasks and clear each technical hurdle one by one.

Wednesday, August 31, 2011

Untethering the visual guidance experiments

Many animals rely on vision to guide their locomotor behaviors. This is especially critical for flying animals, which must integrate visual cues quickly. A classic approach to visual guidance questions is to tether the animal to a stationary mount and present a series of carefully designed visual stimuli. By measuring the neural activity or locomotor behaviors, scientists can deduce some basic components of flight control.

LED flight arena used by Dr. Michael Reiser and others at HHMI

This brilliant approach, however, has a drawback. A tethered animal can behave or "think" differently due to the lack of the other sensations associated with flight (e.g. inertial forces, air flow, and ambient light variations), so the measured neural or behavioral output may be skewed. For animals with relatively simple nervous systems (e.g. flies) performing stereotypical behaviors, tethered flight is quite representative of real flight. For animals with more complex brains performing behaviors that involve sensory fusion, tethered experiments may be insufficient.

One new trend in animal flight research is to perform visual guidance experiments on freely flying animals. Instead of controlling the visual cues directly, we control the visual environment in which the animal flies. By tracking the head orientation and the whole-body flight trajectory, one can reconstruct the visual field of a flying animal. With proper calibration, one can actually reconstruct the entire image the animal sees at each instant.
One major challenge is to expand the 3D tracking volume to capture flight trajectories long enough for visual guidance analyses. With a limited number of high-speed cameras, this often comes at the cost of resolution, and degraded resolution in turn limits the precision of the 3D reconstruction. Thanks to some breakthroughs in sensors and wireless communication, we can now deploy inertial measurement units (IMUs) on flying animals to measure fine rotations, offloading that part of the tracking from the cameras. For my experiments, the high-speed cameras only need to pin down the overall body position and head position; all the rotations can be sensed by the IMUs on the bird.
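To make the sensor-fusion step concrete, here is a minimal sketch (in Python with NumPy) of how a camera-tracked head position and an IMU orientation could be combined into a gaze ray. The (w, x, y, z) quaternion convention, the choice of the x-axis as the head's forward axis, and the function names are all illustrative assumptions, not the actual pipeline.

```python
import numpy as np

def quat_to_rot(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(y*z + w*x),     2*(y*z + w*x)*0 + 2*(x*z - w*y)*0 + 2*(y*z - w*x)*0 + 2*(x*z - w*y), 1 - 2*(x*x + y*y)],
    ])

def gaze_ray(head_position, imu_quaternion, forward=np.array([1.0, 0.0, 0.0])):
    """Cameras pin down where the head is; the IMU quaternion says which
    way it points. The gaze ray is the rotated 'forward' axis anchored at
    the camera-tracked head position."""
    direction = quat_to_rot(imu_quaternion) @ forward
    return head_position, direction / np.linalg.norm(direction)

# Example: head 1.5 m up, yawed 90 degrees to the left about the z-axis
pos, d = gaze_ray(np.array([0.0, 0.0, 1.5]),
                  np.array([np.cos(np.pi/4), 0.0, 0.0, np.sin(np.pi/4)]))
```

Intersecting this ray with the known arena geometry (or feeding the full rotation matrix into a calibrated camera model) is what would turn a trajectory into a reconstructed visual field.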

One last thing we must discuss is the philosophy behind this kind of experiment. In tethered experiments, we control the visual input precisely because the animal's visual field is fixed to a specific display screen. We can treat the visual cues as independent variables while measuring the behavioral or neural output as the dependent variables. In free-flight experiments, however, the animal's flight path and head orientation determine the visual input, which then guides the animal onto its future trajectory. The reconstructed visual input is therefore not a truly independent variable for the near-future flight path. Since the animal's flight trajectory directly produces the visual sensation, which in turn affects the future trajectory, we face a "chicken and egg" problem. One way around it is to construct a navigation model that predicts flight paths. I will explain this approach in the next post.
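As a toy illustration of that feedback loop (not the actual navigation model), here is a minimal simulation in which the "visual input" is just the bearing error to a goal, the turn rate is proportional to it, and the resulting motion generates the next visual input. The gain, speed, and goal-attraction rule are all invented for illustration.

```python
import numpy as np

def simulate_flight(start, goal, gain=2.0, speed=5.0, dt=0.01, steps=300):
    """Toy closed-loop guidance: at each step the 'visual input' is the
    bearing error to the goal, and the turn rate is proportional to it.
    The turn changes the heading, which changes the next visual input --
    the chicken-and-egg loop in miniature."""
    pos = np.array(start, dtype=float)
    heading = 0.0                      # radians, world frame
    path = [pos.copy()]
    for _ in range(steps):
        bearing = np.arctan2(goal[1] - pos[1], goal[0] - pos[0])
        error = (bearing - heading + np.pi) % (2 * np.pi) - np.pi  # wrap to [-pi, pi]
        heading += gain * error * dt   # steer toward the goal
        pos += speed * dt * np.array([np.cos(heading), np.sin(heading)])
        path.append(pos.copy())
    return np.array(path)

path = simulate_flight(start=(0.0, 0.0), goal=(10.0, 5.0))
```

A navigation model of this general shape (state in, steering command out) is what lets one break the circularity: given the reconstructed visual input, the model predicts a flight path that can be compared against the bird's actual trajectory.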

Tuesday, August 16, 2011


Imagine yourself driving in an unfamiliar city without a map (not to mention a GPS). You can see the skyscraper you are heading toward, but you absolutely have to deal with the urban jungle in front of you. This is the kind of navigation problem we are trying to solve. By "we", I mean a large group of scientists and engineers around the country working under the Multidisciplinary University Research Initiative (MURI) sponsored by the Office of Naval Research. To speed up the R&D of the relevant technologies, we chose to take the bio-inspiration route again. As a biologist, I was recruited onto the team to investigate how birds negotiate obstacles in a cluttered environment.

Most forms of locomotion operate in a semi-open-loop condition: the animal's body runs a stereotypical movement command, with some sensory feedback to tune it. Moving with such a control strategy minimizes computation and sensing and spares the animal's attention for other activities such as foraging, mating, or escaping. We believe that short-range navigation, such as negotiating obstacles, is also handled at this lower neuromechanical level. In other words, flying animals have to plan their immediate future flight with minimal sensory data. Speed is key, and it is doubtful that animals give a second thought to where they are going under such conditions.
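The semi-open-loop idea can be sketched in a few lines: a stereotyped rhythmic drive that needs no sensing at all, plus a weak feedback term that only nudges it. The wingbeat frequency, gain, and function name are illustrative numbers, not measurements from any animal.

```python
import numpy as np

def motor_command(t, sensed_error, wingbeat_hz=8.0, feedback_gain=0.2):
    """Semi-open-loop control sketch: a stereotyped rhythmic drive
    (feedforward, runs without sensing) plus a small sensory correction.
    All parameters are illustrative, not biological measurements."""
    feedforward = np.sin(2 * np.pi * wingbeat_hz * t)  # stereotyped pattern
    feedback = feedback_gain * sensed_error            # light sensory tuning
    return feedforward + feedback

# With no sensed error the command is pure pattern; a perturbation only
# nudges it, so most of the behavior remains "pre-programmed".
u_open = motor_command(0.25, sensed_error=0.0)
u_perturbed = motor_command(0.25, sensed_error=1.0)
```

The design point is that the low feedback gain keeps computation and attention cheap, which is exactly the trade-off the paragraph describes.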

In this blog, I will attempt a monthly update at the very least. It will be heavily graphics-driven and formatted for the general public. This post is only an initial statement to set the site up. You may expect another update by the end of August with my CV and other gadgets.

Monday, August 15, 2011

Huai-Ti's new research and new blog

My name is Huai-Ti Lin, and I am an experimental biologist and bio-inspired roboticist currently based at Harvard University (Concord Field Station). I was formerly involved with the development of soft robotics at Tufts University during my Ph.D. years. You can check out my doctoral research at my former blog.

My primary research interest is emergent intelligence in biomechanics. For example, I am fascinated by how animals achieve robust locomotion in different environments using simple patterns and minimal feedback. I am also intrigued by how different sensory inputs shape animal behaviors and intelligence. Currently I occupy myself with the question of visual guidance in flying animals. In particular, I have the good fortune to work with some pigeons from our animal facility at Harvard.

Pigeons are very robust birds that have adapted to the cluttered urban environment. They are also very capable of carrying weight, and many pigeons perform vertical take-offs routinely. In my current project, I challenge the birds with different obstacle arrays (in the form of an artificial pole forest). By tracking the 3D flight trajectories the pigeons take, we hope to extract some control principles of close-range navigation and obstacle avoidance. Various sensors are mounted on the birds to collect critical information about flight maneuvers. Among the telemetry components, a CMOS camera captures the pigeon's frontal view. We are, indeed, trying to get a bird's eye view of this navigation problem.