Meeting Abstract

P1-288  Friday, Jan. 4 15:30 - 17:30  Decoding the Algorithms for Head and Body Coordination during Visually Guided Flight CELLINI, BO*; MONGEAU, J-M; Penn State University; Penn State University boc5244@psu.edu

In flies, head and wing-driven body movements are coordinated to stabilize gaze, yet the underlying algorithms that permit simultaneous control of the head and body remain elusive. Revealing the algorithms that control the pattern of head and body movement in flight is critical to reverse-engineering gaze stabilization, because these movement patterns actively shape the visual inputs that enter the brain. Furthermore, doing so can help us predict the computations that the brain implements to demultiplex sensory inputs for redundant control of multiple-degree-of-freedom flight systems. We used frequency-domain analysis to elucidate the tuning and interplay between head and wing movement in rigidly tethered Drosophila in virtual reality. Frequency analysis of responses to sinusoidal visual stimuli showed that head yaw movements were tuned to, and strongly phase locked at, low frequencies. Wing-beat amplitude signals were similarly tuned but were more out of phase than head movements. Coherence analysis, together with insensitivity to changes in stimulus amplitude, suggests that head movements can be modeled with linear dynamics. Wing-beat amplitude signals had overall lower coherence with the stimulus. These differences could be attributed to the body being fixed while the head is free to move. Analysis of individual trials revealed that head saccade dynamics were broadly tuned. Measurements of the free response of the head demonstrated that the passive neck-head system is highly damped, which may have important implications for passive head stability in flight. Elucidating the algorithms for efficient gaze control during flight can inspire the development of more agile, vision-based aerial vehicles.
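
The abstract's core method is frequency-domain system identification: estimating gain, phase, and coherence between a sinusoidal visual stimulus and the head yaw response. Below is a minimal sketch of that style of analysis, not the authors' pipeline; the sampling rate, signal names, and the simulated attenuated, phase-lagged head response are all illustrative assumptions. High coherence at the stimulus frequency is the signature used in the abstract to argue that head movements are well described by linear dynamics.

```python
# Hedged sketch of stimulus -> head-yaw frequency-response estimation.
# All data here are simulated stand-ins, not experimental recordings.
import numpy as np
from scipy import signal

fs = 100.0                       # assumed sampling rate [Hz]
t = np.arange(0, 20, 1 / fs)     # one assumed 20 s trial
f_stim = 1.0                     # assumed stimulus frequency [Hz]

stimulus = np.sin(2 * np.pi * f_stim * t)   # sinusoidal visual stimulus [deg]
# Stand-in head response: attenuated, phase-lagged copy plus measurement noise
head_yaw = 0.6 * np.sin(2 * np.pi * f_stim * t - 0.3) \
           + 0.05 * np.random.randn(t.size)

# H1 frequency-response estimate: H(f) = Pxy(f) / Pxx(f)
f, Pxy = signal.csd(stimulus, head_yaw, fs=fs, nperseg=1024)
_, Pxx = signal.welch(stimulus, fs=fs, nperseg=1024)
H = Pxy / Pxx

# Magnitude-squared coherence: values near 1 at the stimulus frequency
# support a linear-dynamics description of the response
_, Cxy = signal.coherence(stimulus, head_yaw, fs=fs, nperseg=1024)

k = np.argmin(np.abs(f - f_stim))   # frequency bin nearest the stimulus
print(f"gain      = {np.abs(H[k]):.2f}")
print(f"phase     = {np.degrees(np.angle(H[k])):.1f} deg")
print(f"coherence = {Cxy[k]:.2f}")
```

Repeating this across stimulus frequencies yields the tuning (gain and phase versus frequency) that the abstract compares between head movements and wing-beat amplitude signals.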
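
The claim that the passive neck-head system is highly damped can be quantified from the head's free response. One common approach, sketched here under stated assumptions, is to fit a second-order free-response model and inspect the damping ratio; the frame rate, initial guesses, and simulated data are hypothetical and not taken from the study.

```python
# Hedged sketch: estimate the damping ratio of the passive neck-head system
# from a (simulated) free response of the head after release.
import numpy as np
from scipy.optimize import curve_fit

def free_response(t, A, zeta, wn, phi):
    """Underdamped second-order free response; zeta near 1 means heavy damping."""
    wd = wn * np.sqrt(max(1.0 - zeta**2, 1e-9))   # damped natural frequency
    return A * np.exp(-zeta * wn * t) * np.cos(wd * t + phi)

fs = 500.0                      # assumed high-speed video frame rate [Hz]
t = np.arange(0, 0.2, 1 / fs)   # 200 ms after releasing the head

# Stand-in data: a heavily damped return to rest (zeta = 0.8) plus noise
theta = free_response(t, 10.0, 0.8, 2 * np.pi * 15, 0.0) \
        + 0.2 * np.random.randn(t.size)

popt, _ = curve_fit(free_response, t, theta,
                    p0=[10.0, 0.5, 2 * np.pi * 10, 0.0])
A, zeta, wn, phi = popt
print(f"estimated damping ratio zeta = {zeta:.2f}")   # near 1 -> highly damped
```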