The broad motivation of my research is to understand mathematically why neural networks and other machine learning models work so well in practice, and to apply those mathematical foundations to make ML a more equitable tool. To that end, my research focuses on the representational capacities of neural networks (see HSSV21 and SC22) and the generalization properties of over-parameterized ML models (see ASH21). I intend to keep working in these areas, and I am also interested in pursuing more applied projects at the intersection of AI and climate.

As an undergraduate, I did a variety of research projects on mathematical modeling, dynamical systems, molecular biology, and ML under the supervision of Bjorn Sandstede, William Fairbrother, and Eli Upfal.

In my "gap year" between Brown and Columbia, I lived in San Francisco as a data scientist at LinkedIn and a software engineering intern at Lumi Labs.

I love teaching. At Brown, I TA'd five computer science and applied math classes: CSCI 190 (accelerated intro to CS), CSCI 220 (discrete math), CSCI 1010 (CS theory), APMA 1360 (intro to dynamical systems), and CSCI 1570 (algorithms), for which I was Head TA. At Columbia, I was a graduate TA for COMS 4252 (computational learning theory) with Rocco Servedio in spring 2021. In spring 2022, I will develop course materials for and TA the first iteration of Christos Papadimitriou and John Morrison's class on Natural and Artificial Neural Networks.

I started the CS Theory Student Seminar at Columbia (now coordinated by Shivam Nadimpalli) and co-organized the CS theory retreat for fall 2021 with Tim Randolph. I was a mentor for the Undergrad TCS Student Seminar on ML theory during summer and fall 2021.

I am grateful to be supported by an NSF Graduate Research Fellowship (GRFP), awarded in March 2021.

Find me online: [linkedin] [github] [arxiv] [google scholar] [dblp]

Conference Publications


[SC22] Clayton Sanford, Vaggos Chatziafratis. "Expressivity of Neural Networks via Chaotic Itineraries beyond Sharkovsky's Theorem." Appearing at AISTATS 2022. [arxiv]

[ASH21] Navid Ardeshir*, Clayton Sanford*, Daniel Hsu. "Support vector machines and linear regression coincide with very high-dimensional features." NeurIPS 2021. [paper] [arxiv] [blog post] [reviews]

[HSSV21] Daniel Hsu*, Clayton Sanford*, Rocco Servedio*, Emmanouil-Vasileios Vlatakis-Gkaragkounis*. "On the Approximation Power of Two-Layer Networks of Random ReLUs." COLT 2021. [paper] [arxiv] [blog post] [conference talks]


Journal Publications


[CRSSCS22] Tracy Chin*, Jacob Ruth*, Clayton Sanford*, Rebecca Santorella*, Paul Carter*, Bjorn Sandstede*. "Enabling equation-free modeling via diffusion maps." Appearing in Journal of Dynamics and Differential Equations, 2022. [arxiv]

Undergraduate Publications


[Sanford18] Clayton Sanford. "Applying Rademacher-Like Bounds to Combinatorial Samples and Function Selection." Honors Thesis, Brown Department of Computer Science, 2018. [thesis]

[CSF17] Kamil Cygan*, Clayton Sanford*, William Fairbrother. "Spliceman2 - A Computational Web Server That Predicts Sequence Variations in Pre-mRNA Splicing." Bioinformatics 33 (18), 2017. [paper]

[GSK16] Julia Gross*, Clayton Sanford*, Geoff Kocks*. "Projected Water Needs and Intervention Strategies in India." Undergraduate Mathematics and its Applications 37 (2), 2016. [paper] [article]

A few random things...