Relating large-scale population activity to movement using machine learning
Abstract/Contents
- Abstract
- Neural representations in cognitive tasks are typically considered in terms of abstract cognitive performance, yet cognitive and emotional states are known to be associated with characteristic movements, e.g., pacing when nervous. This makes it difficult to attribute neural representations to movement, internal state, or other factors. To better understand this problem, we developed and applied a series of methods to directly relate neural activity to high-speed video of mice performing a short-term memory task. Our methods span from interpretable but less expressive tracking of manually defined animal features, through projection of videos to low-dimensional embedding vectors with convolutional auto-encoders, to expressive but less interpretable end-to-end learning with neural networks. We dissect motor-related and sensory-related neurons by measuring the time offsets between the embedding and neural activity. We map the brain regions and locations of motor or sensory time offsets, and demonstrate that this map is consistent with previous findings in the literature on sensory vs. motor processing. Further, we revisit the long-suggested but relatively unexplored question of micro-movements directly associated with specific task contingencies. We demonstrate that single-trial animal choices can be predicted directly from video embedding vectors. Based on this prediction, we analyze preparatory micro-movements and divide neurons into those whose activity more closely follows micro-movements (motor-type) and those whose activity is linked to trial type independently of movement. This suggests an initial direction for separating internal-state information from motor-movement information. We also introduce other computational methods for analyzing the connection between videos and neural activity, including canonical correlation analysis (CCA), a session-nonspecific auto-encoder, a DeepLabCut marker corrector, and a soft GLM-HMM.
Beyond the mouse short-term memory dataset, we analyzed a Drosophila walking dataset and developed inferred-dynamics models that transform joint angles into a simpler, low-dimensional representation, with potential application to understanding how the ventral nerve cord controls walking. These methods are important steps in linking neural representations to movement.
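As a rough illustration of the time-offset idea described in the abstract (not the thesis's actual pipeline; all names and the data are hypothetical), one could cross-correlate a neuron's activity with a single video-embedding dimension over a range of lags and label the neuron motor-like or sensory-like by the sign of the best lag:

```python
import numpy as np

def best_lag(neural, embedding, max_lag=20):
    """Return the lag (in frames) at which a neuron's activity best
    correlates with a video-embedding dimension. Negative lag: the
    neuron leads the movement (motor-like); positive lag: it follows
    the movement (sensory-like). Illustrative sketch only."""
    lags = np.arange(-max_lag, max_lag + 1)
    corrs = []
    for lag in lags:
        if lag < 0:
            a, b = neural[:lag], embedding[-lag:]
        elif lag > 0:
            a, b = neural[lag:], embedding[:-lag]
        else:
            a, b = neural, embedding
        corrs.append(np.corrcoef(a, b)[0, 1])
    return int(lags[np.argmax(np.abs(corrs))])

# Toy example: a synthetic neuron whose rate anticipates the
# embedding by 5 frames, so it should come out motor-like.
rng = np.random.default_rng(0)
emb = rng.standard_normal(500)
neu = np.roll(emb, -5)  # neu[t] == emb[t + 5]: the neuron leads
print(best_lag(neu, emb))  # → -5 (neuron leads: motor-like)
```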
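Similarly, reducing joint angles to a low-dimensional representation can be sketched under the simplifying assumption of a plain linear (PCA) projection, rather than the inferred-dynamics models the thesis actually develops; the data here are synthetic:

```python
import numpy as np

def reduce_joint_angles(angles, n_components=3):
    """Project a joint-angle time series (T frames x J joints) onto its
    top principal components via SVD. A generic linear stand-in for the
    thesis's inferred-dynamics models, for illustration only."""
    centered = angles - angles.mean(axis=0)
    # Rows of Vt are principal axes, ordered by explained variance.
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ Vt[:n_components].T

# Synthetic walking data: two hidden "gait drivers" linearly mixed
# into 18 joint angles (e.g. 3 angles per leg x 6 legs) plus noise.
rng = np.random.default_rng(1)
T, J = 1000, 18
latent = rng.standard_normal((T, 2))
mixing = rng.standard_normal((2, J))
angles = latent @ mixing + 0.05 * rng.standard_normal((T, J))

low_dim = reduce_joint_angles(angles, n_components=2)
print(low_dim.shape)  # → (1000, 2)
```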
Description
Type of resource | text
---|---
Form | electronic resource; remote; computer; online resource
Extent | 1 online resource.
Place | California
Place | [Stanford, California]
Publisher | [Stanford University]
Copyright date | 2022; ©2022
Publication date | 2022
Issuance | monographic
Language | English
Creators/Contributors
Author | Wang, Ziyue
---|---
Degree supervisor | Druckmann, Shaul
Degree supervisor | Ganguli, Surya, 1977-
Thesis advisor | Druckmann, Shaul
Thesis advisor | Ganguli, Surya, 1977-
Thesis advisor | Haroush, Keren
Degree committee member | Haroush, Keren
Associated with | Stanford University, Department of Applied Physics
Subjects
Genre | Theses
---|---
Genre | Text
Bibliographic information
Statement of responsibility | Ziyue Wang.
---|---
Note | Submitted to the Department of Applied Physics.
Thesis | Thesis (Ph.D.)--Stanford University, 2022.
Location | https://purl.stanford.edu/rq302fd7808
Access conditions
- Copyright
- © 2022 by Ziyue Wang
- License
- This work is licensed under a Creative Commons Attribution-NonCommercial 3.0 Unported license (CC BY-NC 3.0).