The broad motivation of my research is to understand mathematically why neural networks and other machine learning models work so well in practice, and to apply those mathematical foundations to make ML a more equitable tool. To that end, my research focuses on the representational capacities of neural networks (see HSSV21 and SC22) and on the generalization properties and inductive biases of over-parameterized ML models (see ASH21, SAH22, and BBSS22).
I am also interested in applied work at the intersection of AI and climate change. During summer 2022, I interned with the Allen Institute for AI's climate modeling team, where I worked on improving the stability of ML-corrected climate models (see SKW+22).
As an undergraduate, I did a variety of research projects on mathematical modeling, dynamical systems, molecular biology, and ML under the supervision of Bjorn Sandstede, William Fairbrother, and Eli Upfal.
In my "gap year" between Brown and Columbia, I lived in San Francisco as a data scientist at LinkedIn and a software engineering intern at Lumi Labs.
I love teaching. At Brown, I TA'd five computer science and applied math classes: CSCI 190 (accelerated intro to CS), CSCI 220 (discrete math), CSCI 1010 (CS theory), APMA 1360 (intro to dynamical systems), and CSCI 1570 (algorithms), for which I was Head TA. At Columbia, I was a graduate TA for COMS 4252 (computational learning theory) in spring 2021 with Rocco Servedio. I am currently co-teaching, with Sam Deng, a supplemental lab for Natural and Artificial Neural Networks, which is taught by Christos Papadimitriou and John Morrison. We developed the lab from scratch, and all materials are available.
I have a habit of over-committing.
I am grateful for funding from an NSF Graduate Research Fellowship (GRFP), which I received in March 2021.
Find me online: [linkedin] [github] [arxiv] [google scholar] [dblp]
[SAH22] Clayton Sanford*, Navid Ardeshir*, Daniel Hsu. "Intrinsic dimensionality and generalization properties of the R-norm inductive bias." Preprint. [arxiv]
[BBSS22] Alberto Bietti*, Joan Bruna*, Clayton Sanford*, Min Jae Song*. "Learning single-index models with shallow neural networks." NeurIPS 2022. [arxiv]
[CPSS22] Vaggos Chatziafratis*, Ioannis Panageas*, Clayton Sanford, Stelios Stavroulakis*. "On Scrambling Phenomena for Randomly Initialized Recurrent Networks." NeurIPS 2022. [arxiv]
[HSSV22] Daniel Hsu*, Clayton Sanford*, Rocco Servedio*, Emmanouil-Vasileios Vlatakis-Gkaragkounis*. "Near-Optimal Statistical Query Lower Bounds for Agnostically Learning Intersections of Halfspaces with Gaussian Marginals." COLT 2022. [paper] [arxiv] [NYU seminar slides] [conference talk]
[SC22] Clayton Sanford, Vaggos Chatziafratis. "Expressivity of Neural Networks via Chaotic Itineraries beyond Sharkovsky's Theorem." AISTATS 2022. [paper] [arxiv] [conference talk]
[ASH21] Navid Ardeshir*, Clayton Sanford*, Daniel Hsu. "Support vector machines and linear regression coincide with very high-dimensional features." NeurIPS 2021. [paper] [arxiv] [blog post] [reviews] [conference talk] [Brown seminar slides]
[HSSV21] Daniel Hsu*, Clayton Sanford*, Rocco Servedio*, Emmanouil-Vasileios Vlatakis-Gkaragkounis*. "On the Approximation Power of Two-Layer Networks of Random ReLUs." COLT 2021. [paper] [arxiv] [blog post] [conference talks] [Columbia DSI poster session] [MIT and BU seminar slides] [UW seminar slides]
[SKW+22] Clayton Sanford, Anna Kwa, Oliver Watt-Meyer, Spencer Clark, Noah Brenowitz, Jeremy McGibbon, Christopher Bretherton. "Improving the predictions of ML-corrected climate models with novelty detection." NeurIPS 2022 "Tackling Climate Change with Machine Learning" workshop; journal submission in progress. [workshop arxiv] [workshop slides]
[CRSSCS22] Tracy Chin*, Jacob Ruth*, Clayton Sanford*, Rebecca Santorella*, Paul Carter*, Bjorn Sandstede*. "Enabling equation-free modeling via diffusion maps." Journal of Dynamics and Differential Equations, 2022. [journal] [arxiv]
[S18] Clayton Sanford. "Applying Rademacher-Like Bounds to Combinatorial Samples and Function Selection." Honors Thesis, Brown Department of Computer Science, 2018. [thesis]
[CSF17] Kamil Cygan*, Clayton Sanford*, William Fairbrother. "Spliceman2 - A Computational Web Server That Predicts Sequence Variations in Pre-mRNA Splicing." Bioinformatics 33 (18), 2017. [paper]
[GSK16] Julia Gross*, Clayton Sanford*, Geoff Kocks*. "Projected Water Needs and Intervention Strategies in India." Undergraduate Mathematics and its Applications 37 (2), 2016. [paper] [article]