Spatiotemporal receptive field of the elementary motion detector in Drosophila
Abstract/Contents
- Abstract
- Visual motion cues are used by many animals to guide a diverse set of behaviors across a wide range of environments. What algorithm does the brain use to extract these cues? What neural circuit elements implement that algorithm? Using a combination of two-photon imaging, genetic techniques, and modeling, my work has explored these questions in the visual system of the fruit fly Drosophila. My work demonstrates that the algorithmic basis of motion detection in flies is strikingly similar to that used in the vertebrate visual cortex. In the past decade, Drosophila has emerged as a powerful model for understanding neural circuit function. The strength of this system reflects a unique combination of a rich repertoire of innate and learned behaviors, a highly stereotyped wiring diagram, and, critically, the development of a wealth of genetic tools. In Chapter 1, I review how these factors have interacted to advance this field at a remarkable pace. A central goal in the dissection of any neural circuit is the identification of the neuron types central to the computation of interest. Taking advantage of many of the tools outlined in Chapter 1, in Chapter 2 I characterize Tm9, a neuron type that this work demonstrates provides essential input to a specific local motion detector. Surprisingly, using a reverse-correlation technique adapted for two-photon imaging, I demonstrate that the spatial receptive field of this cell is much larger than one would have predicted from either the anatomy of Tm9 or its functional role as input to a local motion detector. This observation both underscores the importance of combining functional and anatomical studies and enriches our understanding of local motion detectors. Motion vision relies on neurons that respond preferentially to stimuli moving in one, preferred direction over the opposite, null direction.
Long-standing theoretical models have made predictions about the computations that compare light signals across space and time to detect motion. In Chapter 3, I describe quantitative measurements of the spatiotemporal receptive fields of T4 and T5, the first direction-selective cell types in the fly, and use these measurements to test alternative theoretical models underpinning this computation. These results argue strongly that flies implement a "motion-energy" model for detecting local motion signals, and that the particular algorithmic implementation used by the fly achieves high selectivity for moving edges by capturing all two- and three-point spacetime correlations relevant to motion in this stimulus class. These studies set the stage for future investigation of the cellular and molecular implementation of this computation.
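The reverse-correlation measurement mentioned in the Chapter 2 summary can be illustrated with a minimal sketch: a linear spatiotemporal receptive field is estimated by averaging preceding stimulus frames, weighted by the measured response. This is a generic textbook construction, not the thesis's actual analysis pipeline; the function name, array shapes, and toy stimulus below are all illustrative assumptions.

```python
import numpy as np

def reverse_correlation(stimulus, response, n_lags=10):
    """Estimate a linear spatiotemporal receptive field by
    response-weighted averaging of preceding stimulus frames.

    stimulus : (T, n_pixels) array of stimulus contrast values
    response : (T,) array of measured responses (e.g. dF/F)
    n_lags   : number of time lags to include in the filter
    """
    T, n_pixels = stimulus.shape
    strf = np.zeros((n_lags, n_pixels))
    for lag in range(n_lags):
        # correlate the response at time t with the stimulus at time t - lag
        strf[lag] = response[lag:] @ stimulus[:T - lag] / (T - lag)
    return strf

# Toy example: a white-noise stimulus driving a known linear filter.
rng = np.random.default_rng(0)
stim = rng.standard_normal((5000, 8))
true_filter = np.zeros(8)
true_filter[3] = 1.0
resp = np.roll(stim @ true_filter, 2)  # response lags the stimulus by 2 frames
resp[:2] = 0.0                         # discard the wrapped-around samples
strf = reverse_correlation(stim, resp, n_lags=5)
# The recovered filter should peak at lag 2, pixel 3.
print(np.unravel_index(np.abs(strf).argmax(), strf.shape))
```

With white-noise stimuli this response-weighted average converges to the cell's linear filter, which is what makes the technique attractive for mapping receptive fields from imaging data.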
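The "motion-energy" computation referred to in the Chapter 3 summary can likewise be sketched in a few lines. The sketch below is the standard Adelson–Bergen-style construction (quadrature pairs of oriented space-time filters, squared, summed, and subtracted in an opponent stage), not the specific model fitted in the thesis; the filter parameters and test gratings are assumptions chosen for illustration.

```python
import numpy as np

def motion_energy(stimulus, sf=0.1, tf=0.1):
    """Opponent motion energy of a space-time stimulus (t, x), using
    quadrature pairs of oriented space-time Gabor filters."""
    T, X = stimulus.shape
    t = np.arange(T)[:, None]
    x = np.arange(X)[None, :]
    # Gaussian envelope restricting the filters in space and time
    env = np.exp(-((t - T / 2) ** 2 / (2 * (T / 6) ** 2)
                   + (x - X / 2) ** 2 / (2 * (X / 6) ** 2)))
    energies = []
    for direction in (+1, -1):  # +1: rightward-tuned, -1: leftward-tuned
        phase = 2 * np.pi * (tf * t - direction * sf * x)
        even = (env * np.cos(phase) * stimulus).sum()
        odd = (env * np.sin(phase) * stimulus).sum()
        # Squaring and summing the quadrature pair makes the energy
        # invariant to the stimulus phase.
        energies.append(even ** 2 + odd ** 2)
    return energies[0] - energies[1]  # opponent stage

# A rightward-drifting grating should yield positive opponent energy,
# a leftward-drifting one negative energy.
T, X = 60, 60
t = np.arange(T)[:, None]
x = np.arange(X)[None, :]
right = np.cos(2 * np.pi * (0.1 * t - 0.1 * x))
left = np.cos(2 * np.pi * (0.1 * t + 0.1 * x))
print(motion_energy(right) > 0, motion_energy(left) < 0)
```

The opponent subtraction is what produces the preferred-versus-null-direction asymmetry described in the abstract: a detector of this form responds with opposite sign to the two directions of motion.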
Description
Type of resource | text |
---|---|
Form | electronic; electronic resource; remote |
Extent | 1 online resource. |
Publication date | 2017 |
Issuance | monographic |
Language | English |
Creators/Contributors
Associated with | Leong, Jonathan C. S |
---|---|
Associated with | Stanford University, Neurosciences Program. |
Primary advisor | Clandinin, Thomas R. (Thomas Robert), 1970- |
Thesis advisor | Clandinin, Thomas R. (Thomas Robert), 1970- |
Thesis advisor | Boxer, Steven G. (Steven George), 1947- |
Thesis advisor | Ganguli, Surya, 1977- |
Thesis advisor | Smith, Stephen J |
Thesis advisor | Wandell, Brian A |
Subjects
Genre | Theses |
---|---|
Bibliographic information
Statement of responsibility | Jonathan C. S. Leong. |
---|---|
Note | Submitted to the Neurosciences Program. |
Thesis | Thesis (Ph.D.)--Stanford University, 2017. |
Location | electronic resource |
Access conditions
- Copyright
- © 2017 by Jonathan Chit Sing Leong
- License
- This work is licensed under a Creative Commons Attribution-NonCommercial 3.0 Unported license (CC BY-NC 3.0).