I’m currently a first-year PhD student in Maneesh Sahani’s group at the Gatsby Computational Neuroscience Unit. Broadly, I’m interested in the interplay between biological and artificial intelligence, and in leveraging the algorithms used by the brain to build better machine learning models. Before arriving at Gatsby, I worked with the Horizons and Deep Collective teams at Uber AI Labs on optimization techniques for deep learning. Previously, I was a master’s student in computer science at Columbia, where I worked with Larry Abbott, Ashok Litwin-Kumar, and Ken Miller at the Center for Theoretical Neuroscience. I did my undergrad at Princeton, where I majored in neuroscience with minors in computer science and linguistics, and was advised by Jonathan Pillow.
Li WK, Moskovitz T, Kanagawa H, Sahani M (2020). Amortised Learning by Wake-Sleep. International Conference on Machine Learning. arXiv.
Moskovitz T, Wang R, Lan J, Kapoor S, Yosinski J, Rawal A (2019). Learned First-Order Preconditioning. Beyond First Order Methods in ML Workshop, Neural Information Processing Systems. arXiv.
Lindsay G, Moskovitz T, Yang G, Miller K (2019). Do Biologically-Plausible Architectures Produce Biologically-Realistic Models? Conference on Cognitive Computational Neuroscience. link.
Sun M, Li J, Moskovitz T, Lindsay G, Miller K, Dipoppa M, Yang G (2019). Understanding the Functional and Structural Differences Across Excitatory and Inhibitory Neurons. Conference on Cognitive Computational Neuroscience. link.
Moskovitz T, Litwin-Kumar A, Abbott LF (2018). Feedback alignment in deep convolutional networks. Preprint. arXiv.
Moskovitz T, Roy NA, Pillow JW (2018). A comparison of deep learning and linear-nonlinear cascade models to neural encoding. Preprint. bioRxiv.
Hsu E, Fowler E, Staudt L, Greenberg M, Moskovitz T, Shattuck DW, Joshi SH (2016). DTI of corticospinal tracts pre- and post-physical therapy in children with cerebral palsy. Proceedings of the Organization of Human Brain Mapping. link.
Course Projects and Theses
Moskovitz T, Krone J, Brand R (2018). Toward Improved Meta-Imitation Learning. Final Project for Humanoid Robotics course at Columbia. link.
Moskovitz T (2018). Assessing the Resistance of Biologically-Inspired Neural Networks to Adversarial Attack. Final Project for Security & Robustness of ML Systems course at Columbia. link.
Moskovitz T (2017). Deep Transfer Learning for Language Generation from Limited Corpora. Final Project for Advanced Topics in Deep Learning course at Columbia. link.
Moskovitz T (2017). Deep Learning Models for Neural Encoding in the Early Visual System. Princeton Senior Thesis. link.