Speaker: Liu Liu (UT Austin)
Title: A Bi-fidelity method for multiscale kinetic equations with uncertainties
In this talk, we introduce a bi-fidelity numerical method for solving high-dimensional parametric kinetic equations. We first briefly discuss the Boltzmann equation and its fluid dynamic limit, then introduce a bi-fidelity stochastic collocation method for the associated uncertainty quantification problem. By combining the computational efficiency of the low-fidelity model, chosen as the compressible Euler system, with the high accuracy of the high-fidelity (Boltzmann) model, our bi-fidelity approximation successfully captures the macroscopic quantities of the solution to the Boltzmann equation in the random space. A uniform error estimate of the bi-fidelity method, based on a series of our theoretical works on hypocoercivity for uncertain kinetic equations, will be shown. Lastly, we present numerical results to validate the efficiency and accuracy of the proposed method.
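The core bi-fidelity idea (run the cheap model everywhere, the expensive model only at a few well-chosen parameter points, and transfer the expansion coefficients) can be sketched in a few lines. The "solvers" below are toy analytic stand-ins, not the Euler/Boltzmann solvers of the talk, and the greedy point selection is one common choice, not necessarily the one used by the speaker:

```python
import numpy as np

# Toy parametric models u(x; z): u_low plays the cheap (Euler-like) role,
# u_high the expensive (Boltzmann-like) role. Both are illustrative only.
x = np.linspace(0, 1, 64)

def u_low(z):
    return ((1 + 0.5 * z) * np.sin(np.pi * x)
            + 0.3 * z**2 * np.cos(np.pi * x)
            + 0.1 * z**3 * np.sin(2 * np.pi * x))

def u_high(z):
    # high-fidelity model = low-fidelity plus an extra fine-scale feature
    return u_low(z) + 0.05 * z**2 * np.sin(3 * np.pi * x)

# 1. Sample the cheap model densely in the random space and greedily pick
#    r "important" parameter points (Gram-Schmidt-style residual maximization).
zs = np.linspace(-1, 1, 33)
UL = np.array([u_low(z) for z in zs])
r, chosen, residual = 3, [], UL.copy()
for _ in range(r):
    k = int(np.argmax(np.linalg.norm(residual, axis=1)))
    chosen.append(k)
    v = residual[k] / np.linalg.norm(residual[k])
    residual = residual - np.outer(residual @ v, v)

# 2. Run the expensive model only at the r selected points.
UH_sel = np.array([u_high(zs[k]) for k in chosen])
UL_sel = np.array([u_low(zs[k]) for k in chosen])

# 3. Bi-fidelity surrogate: expand u_low(z) in the selected low-fidelity
#    snapshots, then reuse the same coefficients with high-fidelity snapshots.
def u_bifi(z):
    c, *_ = np.linalg.lstsq(UL_sel.T, u_low(z), rcond=None)
    return c @ UH_sel

z_test = 0.37
err = np.linalg.norm(u_bifi(z_test) - u_high(z_test)) / np.linalg.norm(u_high(z_test))
```

Because the toy low-fidelity model here tracks the parametric structure of the high-fidelity one, three collocation points already reproduce the expensive solution at unseen parameters.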
Speaker: Ying Wang (University of Oklahoma)
Title: Mathematical Analysis and Numerical Methods for an Underground Oil Recovery Model
In this talk, I will discuss a new class of entropy solutions of the modified Buckley-Leverett equation, which models underground oil recovery. This model includes a third-order mixed-derivative term resulting from dynamic effects in the pressure difference between the two phases. An analytical study of the reduction of the computational domain will be provided, and a strong-stability-preserving operator splitting method will be introduced. A variety of numerical examples will be given; they show that the solutions may have many different profiles depending on the initial conditions, the diffusion parameter, and the third-order mixed-derivative parameter. The results are consistent with the study of traveling wave solutions and their bifurcation diagrams.
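The operator-splitting idea mentioned above can be illustrated on a toy problem. The sketch below uses generic second-order Strang splitting on a small linear system with two non-commuting operators (a rotation, standing in for the transport part, and a damping term, standing in for the regularizing high-order part); it is not the speaker's SSP scheme for the Buckley-Leverett model, only a demonstration of the splitting mechanism and its second-order convergence:

```python
import numpy as np

# Strang splitting for u' = (A + B) u with non-commuting A, B.
# Each sub-step is solved exactly, so all error comes from the splitting.
A = np.array([[0.0, 1.0], [-1.0, 0.0]])   # rotation generator ("transport")
B = np.diag([-0.3, -0.1])                 # damping ("high-order term")

def expA(t):                              # exact flow of u' = A u
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, s], [-s, c]])

def expB(t):                              # exact flow of u' = B u
    return np.diag(np.exp(np.diag(B) * t))

def expm(M, t):                           # reference exponential via eigendecomposition
    w, V = np.linalg.eig(M)
    return (V @ np.diag(np.exp(w * t)) @ np.linalg.inv(V)).real

u0, T = np.array([1.0, 0.0]), 1.0
exact = expm(A + B, T) @ u0

def strang(n):
    dt = T / n
    step = expB(dt / 2) @ expA(dt) @ expB(dt / 2)   # half-B, full-A, half-B
    u = u0
    for _ in range(n):
        u = step @ u
    return u

errs = [np.linalg.norm(strang(n) - exact) for n in (16, 32, 64)]
rates = [np.log2(errs[i] / errs[i + 1]) for i in range(2)]
print("observed convergence orders:", rates)
```

Halving the step size should cut the error by roughly a factor of four, confirming the second-order accuracy of the symmetric splitting.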
Speaker: Sui Tang (Johns Hopkins University)
Title: Machine learning on dynamic data
Abstract: High-dimensional dynamical data arise in many fields of modern science and introduce new challenges in statistical learning and data recovery. In this talk, I will present two sets of problems. The first concerns the data-driven discovery of dynamics in systems of interacting agents. Such systems are ubiquitous in science, from particle models in physics to predator-prey dynamics in biology and opinion dynamics in the social sciences. Given only observed trajectories of the system, we are interested in estimating the interaction laws between the agents using tools from statistical and machine learning. We show that, at least in circumstances where the interactions are governed by (unknown) functions of pairwise distances, the high dimensionality of the state space of the system does not affect the learning rates: we achieve an optimal learning rate for the interaction kernel, equal to that of a one-dimensional regression problem. The second concerns dynamical sampling, a new area in sampling theory that deals with processing a linear time series of evolving signals and aims to recover the initial state and the forward operator from coarsely sampled evolving states. We provide mathematical theory showing how the dynamics can inform feasible space-time sampling locations, and the fundamental limits of the space-time trade-off.
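The first problem (learning a distance-based interaction kernel from trajectories) can be sketched as a least-squares regression. The toy setup below simulates a first-order model x_i' = (1/N) Σ_j φ(|x_j − x_i|)(x_j − x_i) with a known piecewise-constant kernel and recovers it from noiseless trajectory data; the model form, bins, and parameters are illustrative choices, not the speaker's estimator:

```python
import numpy as np

rng = np.random.default_rng(0)
N, d, dt, steps = 20, 2, 0.01, 50
bins = np.array([0.0, 1.0, 2.0, 4.0])     # radial bins for the hypothesis space
phi_coef = np.array([1.0, 0.5, 0.0])      # true kernel, piecewise constant on bins

def phi_true(r):
    idx = np.clip(np.digitize(r, bins) - 1, 0, len(bins) - 2)
    return phi_coef[idx]

def velocity(X):
    D = X[None, :, :] - X[:, None, :]     # D[i, j] = x_j - x_i
    R = np.linalg.norm(D, axis=2)
    W = phi_true(R)
    np.fill_diagonal(W, 0.0)              # no self-interaction
    return (W[:, :, None] * D).mean(axis=1)

# Simulate trajectories, recording positions and velocities.
X, data = rng.uniform(-1, 1, (N, d)), []
for _ in range(steps):
    V = velocity(X)
    data.append((X.copy(), V))
    X = X + dt * V                        # forward Euler

# Regression: the velocity is linear in the (unknown) bin values of phi,
# so each observed velocity component yields one least-squares equation.
nb = len(bins) - 1
rows, targets = [], []
for Xs, Vs in data:
    D = Xs[None, :, :] - Xs[:, None, :]
    R = np.linalg.norm(D, axis=2)
    idx = np.clip(np.digitize(R, bins) - 1, 0, nb - 1)
    for i in range(N):
        for k in range(d):
            row = np.zeros(nb)
            for j in range(N):
                if j != i:
                    row[idx[i, j]] += D[i, j, k] / N
            rows.append(row)
            targets.append(Vs[i, k])
coef, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
```

Note that the regression unknowns live on the one-dimensional distance variable regardless of the ambient dimension d, which is the structural reason the learning rate can match one-dimensional regression.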
The logic seminar today will be given by David Webb. A title and abstract are below.
Title: On The Levin-V’yugin Degrees
Abstract: I will define and discuss the Levin-V’yugin degrees, a measure algebra defined on collections of reals closed under Turing equivalence. Roughly speaking, in this ordering A < B holds for collections A and B if, for any probabilistic algorithm, the probability that it produces an element of A that is not in B is 0. Time permitting, I will prove that the computable reals and the random reals each form an atom in this Boolean algebra, and discuss other degrees and their positions in the lattice.
The paper this talk is based on is here: https://arxiv.org/pdf/1907.
Speaker: Farzana Nasrin (U. Tennessee)
Title: Bayesian Topological Learning for Complex Data Analysis
Abstract: Classification is an important problem with applications ranging from materials science and chemistry to biology and neuroscience. In this talk, we will approach the problem of classification by focusing on the shape of data and summarizing it with topological descriptors called persistence diagrams. Viewing persistence diagrams through the lens of point processes, one can define a pertinent probabilistic framework and quantify the uncertainty present in these summaries. Building on this framework and historical data, we will present a novel generalized Bayesian framework for persistent homology, which provides an effective, flexible, and noise-resilient scheme for analyzing and classifying complex datasets. A closed-form solution for the posterior distributions of persistence diagrams based on a family of conjugate priors will be provided, and Bayes factors on the space of persistence diagrams will be computed to yield robust classification results. An example of classifying high-entropy alloy materials will demonstrate the applicability of this novel Bayesian classifier.
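For readers unfamiliar with persistence diagrams, the simplest case (0-dimensional homology, i.e. connected components) can be computed with nothing but a minimum spanning tree: each component is born at scale 0 and dies at the edge length where its cluster merges into another. The sketch below is a generic illustration of the descriptor itself, not the Bayesian machinery of the talk:

```python
import numpy as np

# 0-dimensional persistence diagram of a point cloud: components of the
# Vietoris-Rips filtration are born at 0 and die at the Euclidean
# minimum-spanning-tree edge lengths (single-linkage merge heights).
def h0_diagram(points):
    n = len(points)
    dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    in_tree = np.zeros(n, dtype=bool)
    in_tree[0] = True
    best = dist[0].copy()          # cheapest connection to the growing tree
    deaths = []
    for _ in range(n - 1):         # Prim's algorithm for the MST
        best[in_tree] = np.inf
        j = int(np.argmin(best))
        deaths.append(float(best[j]))
        in_tree[j] = True
        best = np.minimum(best, dist[j])
    # one component lives forever; the rest give (birth, death) pairs
    return np.array([(0.0, d) for d in sorted(deaths)])

# Two well-separated clusters: expect many short bars (noise within each
# cluster) and one long bar recording the merge of the two clusters.
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0.0, 0.1, (15, 2)),
                 rng.normal([3.0, 0.0], 0.1, (15, 2))])
dgm = h0_diagram(pts)
```

The long-lived bar is exactly the kind of robust shape summary the Bayesian framework above treats as data: its death time changes little under small perturbations of the points.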
Speaker: Jeremy Hoskins (Yale U)
Title: Elliptic PDEs on regions with corners
Abstract: Many of the boundary value problems frequently encountered in the simulation of physical problems (electrostatics, wave propagation, fluid dynamics in small devices, etc.) can be solved by reformulating them as boundary integral equations. This approach reduces the dimensionality of the problem and enables high-order accuracy in complicated geometries. Unfortunately, in domains with sharp corners the solutions to both the original governing equations and the corresponding boundary integral equations develop singularities at the corners. This poses significant challenges to many existing integral equation methods, typically requiring the introduction of many additional degrees of freedom. In this talk I show that the solutions to the Laplace, Helmholtz, and biharmonic equations in the vicinity of corners can be represented by series of elementary functions. Knowledge of these representations can be leveraged to construct accurate and efficient Nyström discretizations for solving the resulting integral equations. The performance of these methods will be illustrated with several numerical examples.
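To make the boundary-integral reformulation concrete, here is a standard Nyström discretization for the smooth case only: the interior Laplace Dirichlet problem on an ellipse, solved via the double-layer equation −φ/2 + Kφ = f with the periodic trapezoidal rule. This textbook sketch is precisely what breaks down at corners; the talk's corner representations are what repair it:

```python
import numpy as np

# Interior Laplace Dirichlet problem on an ellipse via the double-layer
# boundary integral equation  -phi/2 + K phi = f,  Nystrom-discretized
# with the periodic trapezoidal rule (spectrally accurate on smooth curves).
a, b, n = 2.0, 1.0, 64
t = 2 * np.pi * np.arange(n) / n
y = np.stack([a * np.cos(t), b * np.sin(t)], axis=1)        # boundary nodes
dy = np.stack([-a * np.sin(t), b * np.cos(t)], axis=1)      # y'(t)
sp = np.linalg.norm(dy, axis=1)                             # speed |y'(t)|
nrm = np.stack([dy[:, 1], -dy[:, 0]], axis=1) / sp[:, None] # outward normal
kappa = a * b / (a**2 * np.sin(t)**2 + b**2 * np.cos(t)**2) ** 1.5  # curvature

# Double-layer kernel (1/2pi) n(y).(x - y)/|x - y|^2, with its smooth
# diagonal limit -kappa/(4 pi) on the boundary.
diff = y[:, None, :] - y[None, :, :]
r2 = np.sum(diff**2, axis=2)
np.fill_diagonal(r2, 1.0)                                   # placeholder; overwritten below
K = np.sum(nrm[None, :, :] * diff, axis=2) / (2 * np.pi * r2)
np.fill_diagonal(K, -kappa / (4 * np.pi))

# Nystrom matrix: identity jump term plus trapezoidal quadrature of K.
A = -0.5 * np.eye(n) + K * (sp * 2 * np.pi / n)[None, :]

u_exact = lambda p: p[..., 0] ** 2 - p[..., 1] ** 2         # harmonic test solution
phi = np.linalg.solve(A, u_exact(y))

# Evaluate the double-layer potential at an interior point and compare.
x0 = np.array([0.5, 0.2])
kern = np.sum(nrm * (x0 - y), axis=1) / (2 * np.pi * np.sum((x0 - y) ** 2, axis=1))
u0 = np.sum(kern * phi * sp) * 2 * np.pi / n
err = abs(u0 - u_exact(x0))
```

On this smooth boundary the error at interior points decays spectrally in n; replacing the ellipse with a polygon makes φ singular at the corners, which is the regime the talk's elementary-function representations address.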
Speaker: Tingran Gao (U. Chicago)
Title: Manifold Learning on Fibre Bundles
Spectral geometry has played an important role in modern geometric data analysis, where the technique is widely known as Laplacian eigenmaps or diffusion maps.
In this talk, we present a geometric framework that studies graph representations of complex datasets, where each edge of the graph is equipped with a non-scalar transformation or correspondence.
This new framework models such a dataset as a fibre bundle with a connection, and interprets the collection of pairwise functional relations as defining a horizontal diffusion process on the bundle driven by its projection on the base.
The eigenstates of this horizontal diffusion process encode the “consistency” among objects in the dataset, and provide a lens through which the geometry of the dataset can be revealed. We demonstrate an application of this geometric framework in evolutionary anthropology.
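The scalar diffusion process that this fibre-bundle framework generalizes can be sketched directly. The example below runs plain diffusion maps on a noisy circle: build a Gaussian affinity graph, normalize it to a Markov operator, and embed with the top nontrivial eigenvectors. In the talk's framework the scalar edge weights would be replaced by transformations between fibres; the parameters here (bandwidth, sample size) are arbitrary illustrative choices:

```python
import numpy as np

# Diffusion maps on a noisy circle.
rng = np.random.default_rng(2)
theta = rng.uniform(0, 2 * np.pi, 300)
pts = (np.stack([np.cos(theta), np.sin(theta)], axis=1)
       + 0.01 * rng.normal(size=(300, 2)))

eps = 0.1
d2 = np.sum((pts[:, None] - pts[None, :]) ** 2, axis=2)
W = np.exp(-d2 / eps)                        # Gaussian affinity matrix
d = W.sum(axis=1)                            # degrees

# Eigen-decompose the symmetric conjugate S = D^{-1/2} W D^{-1/2}; its
# eigenvectors, rescaled by D^{-1/2}, are the eigenfunctions of the
# row-stochastic diffusion operator P = D^{-1} W.
S = W / np.sqrt(d[:, None] * d[None, :])
vals, vecs = np.linalg.eigh(S)
order = np.argsort(vals)[::-1]
psi = vecs[:, order] / np.sqrt(d)[:, None]

# Diffusion-map embedding: skip the trivial constant eigenfunction and
# use the next two, scaled by their eigenvalues.
embed = psi[:, 1:3] * vals[order][1:3]
radii = np.linalg.norm(embed, axis=1)
```

The embedded points lie on a near-perfect circle (the first two nontrivial eigenfunctions recover cos and sin of the intrinsic angle), which is the scalar analogue of the "consistency" encoded by the horizontal diffusion eigenstates above.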