Research Interests

I am currently working on efficient verification of neural network robustness. As an offshoot of this work, I'm also exploring the use of universal adversarial perturbations to identify neurons activated by particular objects, as well as approaches to pruning networks that improve their verifiability without significantly degrading their performance.


I'm currently at Google, building systems that enable programmers to automatically optimize parameters in their code.

Before Google, I was at Cruise for 2+ years, working on simulation infrastructure. Some of the problems I worked on included:

  • Designing an expressive and user-friendly system for specifying car data schema migrations, reducing the risk of user error by eliminating verbosity.
  • Disentangling complex dependencies and establishing clear interfaces between sub-modules, allowing for code re-use and more effective caching of simulation output.

Outside of my core responsibilities, I also spent time on:

  • Making information accessible: My dashboards summarizing simulation results are among the twenty most used at the company, and the cost estimates I added to simulation runtime data have helped catalyze work to improve simulation efficiency.
  • Improving code quality: Among other things, I applied our lint rules to hundreds of files that had previously been manually excluded, and deleted more than 1,000,000 lines of dead code across the codebase.
  • Improving developer tooling: My custom settings for VSCode are now the default for our main repository used by hundreds of engineers.

Before Cruise, I interned at Palantir, Benchling and Jane Street.


I received my B.S. in Electrical Engineering and Computer Science from MIT in 2017, and my M.Eng. with a concentration in Artificial Intelligence in 2018.

My two favorite computer science courses at MIT were Inference and Information (6.437) by Greg Wornell and Advanced Algorithms (6.854) by Ankur Moitra. 6.437 provided a framework for me to properly understand the formalization underpinning ideas such as maximum likelihood estimation and the expectation-maximization algorithm, while 6.854 developed my mathematical maturity by demanding a high degree of rigor.

In addition to those two, here are some other classes that I really enjoyed: