Selected publications and preprints; a full list is available on Google Scholar. Manuscripts labeled "In Progress" are expected to become available, at least as preprints, within the next 2-3 months (not guaranteed).
2024
arXiv
Data-Driven Estimation of Failure Probabilities in Correlated Structure-Preserving Stochastic Power System Models
Hongli Zhao, Tyler E. Maltba, D. Adrian Maldonado, and 2 more authors
We introduce a data-driven and physics-informed framework for propagating uncertainty in stiff, multiscale random ordinary differential equations (RODEs) driven by correlated (colored) noise. Unlike systems subjected to Gaussian white noise, a deterministic equation for the joint probability density function (PDF) of RODE state variables does not exist in closed form. Moreover, such an equation would require as many phase-space variables as there are states in the RODE system. To alleviate this curse of dimensionality, we instead derive exact, albeit unclosed, reduced-order PDF (RoPDF) equations for low-dimensional observables/quantities of interest. The unclosed terms take the form of state-dependent conditional expectations, which are directly estimated from data at sparse observation times. However, for systems exhibiting stiff, multiscale dynamics, data sparsity introduces regression discrepancies that compound during RoPDF evolution. This is overcome by introducing a kinetic-like defect term to the RoPDF equation, which is learned by assimilating sparse, low-fidelity RoPDF estimates. Two assimilation methods are considered, namely nudging and deep neural networks, which are successfully tested against Monte Carlo simulations.
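The central data-driven ingredient is the regression estimate of the state-dependent conditional expectations appearing in the unclosed RoPDF terms. Below is a minimal sketch of that step using Nadaraya-Watson kernel regression on Monte Carlo samples; the function names, the toy two-state model, and the bandwidth choice are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def conditional_expectation(q_samples, f_samples, q_grid, h=0.1):
    """Nadaraya-Watson estimate of E[f(X) | q(X) = q] from samples.

    q_samples : (N,) sampled values of the low-dimensional observable
    f_samples : (N,) sampled values of the unclosed flux term
    q_grid    : (M,) points where the RoPDF equation is discretized
    h         : Gaussian kernel bandwidth (tuning parameter)
    """
    # Gaussian kernel weights between each grid point and each sample
    w = np.exp(-0.5 * ((q_grid[:, None] - q_samples[None, :]) / h) ** 2)
    return (w @ f_samples) / np.maximum(w.sum(axis=1), 1e-12)

# Toy usage: two correlated states, observable q = x1, unclosed term f = x2
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 2))
X[:, 1] += 0.5 * X[:, 0]                      # correlate the states
q_grid = np.linspace(-3, 3, 61)
cond = conditional_expectation(X[:, 0], X[:, 1], q_grid)
# For this toy model the exact answer is E[x2 | x1 = q] = 0.5 q
err = np.max(np.abs(cond[20:41] - 0.5 * q_grid[20:41]))
print(f"max error on central grid: {err:.3f}")
```

In the paper's setting the regression is performed at sparse observation times, and the resulting discrepancies are what the assimilated defect term corrects.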
arXiv
Model discovery for nonautonomous translation-invariant problems
Discovery of mathematical descriptors of physical phenomena from observational and simulated data, as opposed to from first principles, is a rapidly evolving research area. Two factors, time-dependence of the inputs and hidden translation invariance, are known to complicate this task. To ameliorate these challenges, we combine Lagrangian dynamic mode decomposition with a locally time-invariant approximation of the Koopman operator. The former component of our method yields the best linear estimator of the system’s dynamics, while the latter deals with the system’s nonlinearity and non-autonomous behavior. We provide theoretical bounds on prediction accuracy and perturbation error to guide the selection of both rank truncation and temporal discretization. We demonstrate the performance of our approach on several non-autonomous problems, including two-dimensional Navier-Stokes equations.
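The "best linear estimator" component builds on standard rank-truncated (exact) dynamic mode decomposition. A minimal sketch of that building block under the usual snapshot-matrix conventions follows; the Lagrangian reformulation and the locally time-invariant Koopman approximation from the paper are not shown, and the toy advecting-wave example is an assumption for illustration.

```python
import numpy as np

def dmd(X, Xp, r):
    """Rank-r exact dynamic mode decomposition.

    X, Xp : (n, m) snapshot matrices, Xp[:, k] = state one step after X[:, k]
    r     : truncation rank
    Returns eigenvalues, DMD modes, and the rank-r one-step propagator.
    """
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    U, s, V = U[:, :r], s[:r], Vh[:r].conj().T
    A_tilde = U.conj().T @ Xp @ V / s      # reduced operator: best linear fit
    evals, W = np.linalg.eig(A_tilde)
    modes = Xp @ V / s @ W                 # exact DMD modes
    return evals, modes, U @ A_tilde @ U.conj().T

# Toy usage: a wave advecting on a periodic grid (rank-2 dynamics)
x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
snaps = np.array([np.sin(x - 0.1 * k) for k in range(50)]).T
evals, modes, A = dmd(snaps[:, :-1], snaps[:, 1:], r=2)
print(np.abs(evals))  # close to 1: neutrally stable, purely advective dynamics
```

The rank truncation r and the time step between snapshots are exactly the quantities the paper's accuracy and perturbation bounds are meant to help select.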
arXiv
Tensorizing flows: a tool for variational inference
We propose the tensorizing flow method for estimating high-dimensional probability density functions from observed data. Our method combines the optimization-free construction of the tensor-train format with the flexibility of flow-based generative models, providing an accurate and efficient approach for density estimation. Specifically, our method first constructs an approximate density in the tensor-train form by efficiently solving for the tensor cores from a linear system based on kernel density estimators of low-dimensional marginals. Subsequently, a continuous-time flow model is trained from this tensor-train density to the observed empirical distribution using maximum likelihood estimation. Numerical results are presented to demonstrate the performance of our method.
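A minimal sketch of the first ingredient, kernel density estimates of low-dimensional (here, neighboring pairwise) marginals, is given below, assuming scipy's Gaussian KDE. The toy chain-correlated data, grid, and dimensions are illustrative assumptions; the linear solve for the tensor cores and the subsequent flow training are beyond a short sketch.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
d, N = 4, 2000
# Toy data with a chain-like correlation structure
X = rng.normal(size=(N, d))
for k in range(1, d):
    X[:, k] += 0.7 * X[:, k - 1]

# KDE of each neighboring pairwise marginal p(x_k, x_{k+1}); in the
# tensor-train construction these estimates feed the linear systems
# that determine the TT cores (not shown here).
grid = np.linspace(-4, 4, 41)
G = np.stack(np.meshgrid(grid, grid, indexing="ij"))   # shape (2, 41, 41)
pts = G.reshape(2, -1)
pair_marginals = []
for k in range(d - 1):
    kde = gaussian_kde(X[:, [k, k + 1]].T)             # KDE expects (dims, N)
    pair_marginals.append(kde(pts).reshape(41, 41))
print(len(pair_marginals), pair_marginals[0].shape)    # 3 (41, 41)
```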
2022
Springer CN
Autonomous learning of nonlocal stochastic neuron dynamics
Tyler E. Maltba, Hongli Zhao, and Daniel M. Tartakovsky
Neuronal dynamics is driven by externally imposed or internally generated random excitations/noise, and is often described by systems of random or stochastic ordinary differential equations. Such systems admit a distribution of solutions, which is (partially) characterized by the single-time joint probability density function (PDF) of system states. It can be used to calculate such information-theoretic quantities as the mutual information between the stochastic stimulus and various internal states of the neuron (e.g., membrane potential), as well as various spiking statistics. When random excitations are modeled as Gaussian white noise, the joint PDF of neuron states exactly satisfies a Fokker-Planck equation. However, most biologically plausible noise sources are correlated (colored). In this case, the resulting PDF equations require a closure approximation. We propose two methods for closing such equations: a modified nonlocal large-eddy-diffusivity closure and a data-driven closure relying on sparse regression to learn relevant features. The closures are tested for the stochastic non-spiking leaky integrate-and-fire and FitzHugh-Nagumo (FHN) neurons driven by sine-Wiener noise. Mutual information and total correlation between the random stimulus and the internal states of the neuron are calculated for the FHN neuron.
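The test problems are driven by sine-Wiener (bounded, colored) noise. A minimal Euler-Maruyama Monte Carlo sketch for a non-spiking leaky integrate-and-fire neuron under one common parameterization of that noise is below; all parameter values, and the specific sine-Wiener form used, are illustrative assumptions rather than the paper's setup. Such Monte Carlo PDF estimates are the kind of reference solution the closures are tested against.

```python
import numpy as np

rng = np.random.default_rng(2)
n_paths, n_steps, dt = 10_000, 2_000, 1e-3
tau, A, T_noise = 0.02, 0.5, 0.1     # membrane time scale, noise amplitude, noise scale

V = np.zeros(n_paths)                # membrane potentials (non-spiking regime)
W = np.zeros(n_paths)                # driving Wiener paths
for _ in range(n_steps):
    W += np.sqrt(dt) * rng.standard_normal(n_paths)
    eta = A * np.sin(2 * np.pi * W / T_noise)   # sine-Wiener (bounded) noise
    V += dt * (-V + eta) / tau                  # leaky integrate-and-fire drift
# Monte Carlo estimate of the single-time PDF of V at the final time
pdf, edges = np.histogram(V, bins=80, density=True)
print(pdf.max(), edges[0], edges[-1])
```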