Mark Transtrum > Research Interests
Research Interests

The unifying theme of my research is the development and application of novel mathematical and computational methods for solving problems in physics and biology. I am especially interested in problems that lie at the interface between these fields. I synthesize ideas that are superficially unrelated, such as modeling and differential geometry, to probe more deeply the mathematical methods that unify the sciences. Some of the specific projects I have worked on include:
1. Multi-Parameter Models, Differential Geometry, and Interpolation. Models with many tunable parameters often have strikingly similar statistical properties although they describe very different phenomena. Specifically, models often exhibit an insensitivity to large fluctuations in many of their parameter combinations, a phenomenon sometimes known as "sloppiness". Using differential geometry, we can interpret models as manifolds embedded in the space of all possible predictions. The geometric properties of these manifolds reflect the universal statistical properties of the models. Specifically, models are often bounded with a hierarchy of widths (described as a hyper-ribbon) and a hierarchy of curvatures that are often much smaller than the widths. We can explain these observations using theorems from interpolation theory. Whenever the observed behavior has fewer effective degrees of freedom than the number of parameters, the model behavior is largely controlled by only a few parameter combinations, while remaining insensitive to the others. This approach gives quantitative predictions about the geometric properties of models and leads us to think of complex models as generalized interpolation schemes.

2. Numerical Methods for Data Fitting and Bayesian Posterior Sampling. The universal geometric properties of multi-parameter models have a number of very useful applications. In particular, understanding the properties of the model manifold in data space allows us to make very general statements about the cost surface in parameter space when the model is fit to data. Exploiting the small curvature, we have developed a data-fitting algorithm known as the geodesic Levenberg-Marquardt algorithm. We also propose Markov chain Monte Carlo techniques to efficiently explore the cost surface in Bayesian posterior sampling.
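The hierarchy of parameter sensitivities described above can be seen in a minimal sketch (my own illustration, not code from this work), using an assumed toy model of two decaying exponentials with nearly degenerate rates: the singular values of the Jacobian of predictions with respect to parameters span a wide range, the hallmark of a sloppy model.

```python
import numpy as np

# Toy "sloppy" model (assumed for illustration): y(t) = exp(-k1 t) + exp(-k2 t).
# With nearly degenerate rates, the two parameter directions are almost
# redundant, so the Jacobian's singular values are widely separated.
def model(params, t):
    k1, k2 = params
    return np.exp(-k1 * t) + np.exp(-k2 * t)

def jacobian(params, t, eps=1e-6):
    """Forward-difference Jacobian of the predictions w.r.t. the parameters."""
    base = model(params, t)
    J = np.empty((t.size, params.size))
    for i in range(params.size):
        dp = params.copy()
        dp[i] += eps
        J[:, i] = (model(dp, t) - base) / eps
    return J

t = np.linspace(0, 5, 50)
params = np.array([1.0, 1.1])            # nearly degenerate decay rates
sv = np.linalg.svd(jacobian(params, t), compute_uv=False)
print("singular values:", sv)            # a hierarchy: stiff vs. sloppy
print("stiff/sloppy ratio:", sv[0] / sv[-1])
```

The large ratio between the first and last singular values is a two-parameter caricature of the hierarchy of manifold widths; in realistic systems-biology models the spread typically covers many orders of magnitude.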
Especially for large models that are computationally intensive, these methods expand our ability to numerically analyze and understand models.

3. Optimal Experimental Design. One of the barriers to understanding complex systems is the amount of information necessary to fully specify the model. This problem is exacerbated because most experimental data explore redundant degrees of freedom, i.e., they can be interpolated from previous observations. Optimally designing experiments requires that one probe novel degrees of freedom with each new experiment in order to estimate all of the microscopic parameters most efficiently. On the other hand, if one is only interested in a few predictions of the model, these predictions can typically be tightly constrained with only a few optimally chosen experiments. In either case, the initially large uncertainty in the parameters renders optimal experiment selection a computationally intensive task, one that can be accurately and efficiently approximated using the universal geometric features of multi-parameter fits.

4. Coarse-Graining Complex Models. Complex systems with many microscopic parameters often exhibit collective behavior that is surprisingly comprehensible. Since the collective behavior is insensitive to many parameter combinations, we have developed a method to coarse-grain away the unnecessary degrees of freedom. The boundaries of the model manifold (described in point 1) represent coarse-grained models with one fewer degree of freedom. These boundary models also have a hierarchy of widths. By repeatedly replacing a model by its boundary, we can systematically coarse-grain away all of the unnecessary parameters, producing an effective model that describes the emergent physics or biology of the system. The method naturally produces such features as modular units, with parameters describing, for example, effective rates of information flow through protein signaling networks.
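The boundary limit behind this coarse-graining can be illustrated with a small sketch (an assumed toy example, not the authors' code): in a two-exponential model, sending the fast rate to infinity removes that term entirely, collapsing the model onto a boundary with one fewer parameter. When the fast mode is effectively unobservable, the reduced model reproduces the full one almost exactly.

```python
import numpy as np

# Assumed toy model: y = A1*exp(-k1 t) + A2*exp(-k2 t).
# In the limit k2 -> infinity the second term vanishes, leaving a
# boundary model y = A1*exp(-k1 t) with one fewer parameter.
def full_model(t, A1, k1, A2, k2):
    return A1 * np.exp(-k1 * t) + A2 * np.exp(-k2 * t)

def boundary_model(t, A1, k1):
    # The k2 -> infinity limit of full_model.
    return A1 * np.exp(-k1 * t)

t = np.linspace(0.5, 5, 20)                       # observations start at t = 0.5
y_full = full_model(t, 1.0, 1.0, 0.3, 20.0)       # fast mode decays before t = 0.5
y_red = boundary_model(t, 1.0, 1.0)
err = np.max(np.abs(y_full - y_red))
print("max deviation after removing the fast mode:", err)
```

Here the fast mode has already decayed by the first observation, so removing it changes the predictions negligibly; iterating such limits is the spirit of reducing a model to its boundaries.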
The method also bears a striking similarity to the Renormalization Group, an important method in statistical physics. We find that multi-parameter models exhibit behavior similar to phase transitions and critical points.

5. Superheating Field of Superconductors. Superconductors in an external magnetic field expel the magnetic flux, a phenomenon known as the Meissner effect. In a sufficiently large magnetic field, the superconductor undergoes a phase transition to either a normal metal or a superconductor with an array of magnetic flux tubes. This transition does not necessarily occur at the field strength at which the normal metal becomes energetically favorable. Instead, the superconductor can persist in a metastable "superheated" state up to a larger magnetic field, known as the superheating field. This field turns out to be important in the design of superconducting resonant cavities for particle accelerators, as it sets a fundamental limit on their operating efficiency. We have used linear stability theory to study this transition in both the Ginzburg-Landau and Eilenberger theories of superconductivity in order to predict the fundamental limits of cavity performance.

Last Modified: 13 September 2012