I am a PhD student in Applied Mathematics at the University of Arizona, where I work with Dr. Jason Pacheco in the Stochastic Systems and Learning Group. My work blends math, machine learning and a healthy respect for uncertainty.
Before Arizona, I earned a Master’s in Mathematics at Brigham Young University, where I worked with Dr. Jared Whitehead on the TsunamiBayes project, developing Bayesian tools for analyzing tsunamis from historical records.
I study how to make good decisions under uncertainty: how to ask the right questions, gather the right data, and take the right actions. My research focuses on Bayesian optimal experimental design, information-theoretic goals, and inference via MCMC, with applications in geophysics and political science.
Outside of research, you’ll find me in the mountains, backpacking or skiing, making music, or off-grid with my family.
Expected information gain (EIG) is a crucial quantity in Bayesian optimal experimental design (BOED), quantifying how useful an experiment is by the amount we expect the posterior to differ from the prior. However, evaluating the EIG can be computationally expensive since it generally requires estimating the posterior normalizing constant. In this work, we leverage two idiosyncrasies of BOED to improve the efficiency of EIG estimation via sequential Monte Carlo (SMC). First, in BOED we simulate the data and thus know the true underlying parameters. Second, we ultimately care about the EIG, not the individual normalizing constants. Often we observe that the Monte Carlo variance of standard SMC estimators for the normalizing constant of a single dataset is significantly lower than the variance of the normalizing constants across datasets; the latter thus contributes the majority of the variance for EIG estimates. This suggests the potential to slightly increase variance while drastically decreasing computation time by reducing the SMC population size, which leads us to an EIG-specific SMC estimator that starts with only a single sample from the posterior and tempers backwards towards the prior. Using this single-sample estimator, which we call reverse-annealed SMC (RA-SMC), we show that it is possible to estimate EIG with orders of magnitude fewer likelihood evaluations in three models: a four-dimensional spring-mass model, a six-dimensional Johnson-Cook model, and a four-dimensional source-finding problem.
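For context, the baseline that methods like RA-SMC aim to improve on is the nested Monte Carlo EIG estimator, which averages the log-likelihood minus a fresh Monte Carlo estimate of the evidence. Below is a minimal sketch on a toy conjugate-Gaussian model (the model, prior, and sample sizes are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def log_likelihood(y, theta, sigma=0.5):
    # Toy Gaussian likelihood: y ~ N(theta, sigma^2)
    return -0.5 * ((y - theta) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

def eig_nested_mc(n_outer=2000, n_inner=2000):
    # Outer loop: draw theta from the prior N(0, 1) and simulate data y
    theta = rng.standard_normal(n_outer)
    y = theta + 0.5 * rng.standard_normal(n_outer)
    # Inner loop: fresh prior samples to estimate the evidence p(y)
    theta_in = rng.standard_normal((n_inner, 1))
    log_lik = log_likelihood(y, theta)  # log p(y | theta) at the true theta
    # log p(y) ~= log mean_j p(y | theta_j), computed stably in log space
    log_evidence = np.logaddexp.reduce(log_likelihood(y, theta_in), axis=0) - np.log(n_inner)
    # EIG ~= E[log p(y | theta) - log p(y)]
    return float(np.mean(log_lik - log_evidence))

print(eig_nested_mc())
```

For this toy model the true EIG is 0.5·log(1 + 1/0.25) ≈ 0.80; the sketch makes plain why the method is expensive, since every outer sample pays for a full inner evidence estimate.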
Monitoring networks increasingly aim to assimilate data from a large number of diverse sensors covering many sensing modalities. Bayesian optimal experimental design (OED) seeks to identify data, sensor configurations, or experiments which can optimally reduce uncertainty and hence increase the performance of a monitoring network. Information theory guides OED by formulating the choice of experiment or sensor placement as an optimization problem that maximizes the expected information gain (EIG) about quantities of interest given prior knowledge and models of expected observation data. Therefore, within the context of seismo-acoustic monitoring, we can use Bayesian OED to configure sensor networks by choosing sensor locations, types, and fidelity in order to improve our ability to identify and locate seismic sources. In this work, we develop the framework necessary to use Bayesian OED to optimize a sensor network’s ability to locate seismic events from arrival time data of detected seismic phases at the regional scale.
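Once an EIG estimate is available, the outer OED loop is conceptually simple: score each candidate design and keep the best. A toy sketch, using a hypothetical distance-dependent noise model and the closed-form linear-Gaussian EIG (all numbers illustrative, not from the seismic application):

```python
import math

# Toy setup: a 1-D Gaussian source parameter with prior variance 1.0.
# Each candidate sensor location d has measurement noise that grows
# with distance from a nominal source at 0 (hypothetical noise model).
prior_var = 1.0
candidates = [-2.0, -0.5, 0.25, 1.0, 3.0]

def noise_var(d):
    return 0.1 + 0.2 * d ** 2

def eig(d):
    # Closed-form EIG for a linear-Gaussian model:
    # 0.5 * log(1 + prior_var / noise_var)
    return 0.5 * math.log(1.0 + prior_var / noise_var(d))

best = max(candidates, key=eig)
print(best)
```

In the real problem the closed form is replaced by an estimator over simulated arrival-time data, and the design space is the joint choice of sensor locations, types, and fidelity rather than a single coordinate.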
We discuss the difficulties of evaluating partisan gerrymandering in Utah’s congressional districts and the failure of many common metrics in this setting. We explain why the Republican vote share in the least-Republican district (LRVS) is a good indicator of the advantage or disadvantage each party has in the Utah congressional districts. Although the LRVS only makes sense in settings with at most one competitive district, in that setting it directly captures the extent to which a given redistricting plan gives advantage or disadvantage to the Republican and Democratic parties. We use the LRVS to evaluate the most common measures of partisan gerrymandering in the context of Utah’s 2011 congressional districts. We do this by generating large ensembles of alternative redistricting plans using Markov chain Monte Carlo methods. We also discuss the implications of this new metric and of our results for the question of whether the 2011 Utah congressional plan was gerrymandered.
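The metric itself is deliberately simple; a sketch with hypothetical vote shares (not Utah data) shows the computation:

```python
def lrvs(district_vote_shares):
    """Republican vote share in the least-Republican district (LRVS).

    `district_vote_shares` holds the Republican two-party vote share
    for each district under a given redistricting plan.
    """
    return min(district_vote_shares)

# A hypothetical 4-district plan: three safe Republican districts
# and one competitive district.
plan = [0.64, 0.58, 0.71, 0.47]
print(lrvs(plan))
```

Comparing a plan’s LRVS against the distribution of LRVS values over an MCMC ensemble of alternative plans is what indicates whether the enacted plan is an outlier.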
- 2023-Present: Ph.D. in Applied Mathematics, University of Arizona
- 2021-2023: M.Sc. in Mathematics, Brigham Young University. Advisor: Jared Whitehead
- 2016-2020: B.Sc. in Mathematics, Applied and Computational Emphasis
- Sandia National Laboratories, Livermore, CA (May 2021 - Present): Sandia is a U.S. national lab conducting science-based technology development to support national security.
- University of Arizona, Tucson, AZ (Aug 2023 - Present): The University of Arizona is a tier-1 research university known for strengths in applied math, geosciences, and computational modeling.
- Brigham Young University, Provo, UT (Jun 2020 - Apr 2023 and Jan 2021 - Apr 2023): BYU is a major research university with a strong applied mathematics program focused on computation and modeling.
- OrderBoard, Orem, UT (May 2019 - Jun 2021): OrderBoard was a recruiting tech startup focused on optimizing job placement in difficult-to-source markets.
- Honeywell, Charlotte, NC (Jun 2020 - Aug 2020): Honeywell is a global technology company working across aerospace, automation, and software sectors.