Computational and Applied Mathematics Seminar

Location and Time

University of Wyoming Ross Hall 247, Fridays from 3:10-4:00 (unless otherwise stated).

Support

The CAM seminar series is currently supported by volunteers and by financial contributions from the UW Mathematics Department, MGNet.org, and an energy grant from ExxonMobil. A speaker's expenses are paid in full by the sponsor(s) when noted.

Schedule

For Spring 2017, the speakers are as follows:

Date          Speaker                             From/Note
February 10   Prof. Craig Douglas and Xiukun Hu   University of Wyoming
February 17   Bang Huang                          University of Wyoming
February 28*  Russell Johnson                     University of Wyoming
April 7       Eugenie Jackson                     University of Wyoming
April 14      Prof. Hakima Bessaih                University of Wyoming
April 21      Grigorii Sarnitskii                 University of Wyoming
April 28      Xiukun Hu                           University of Wyoming
May 5         Dr. Li Deng Douglas                 University of Wyoming

* Tuesday (2:00-3:00) in EN 3114

I am still looking for speakers for Spring 2017! Topics can be original research, a survey of an area, or a discussion of a paper or papers of interest to the CAM community.

The schedule, titles, and abstracts from Fall 2016 are here.

Titles and Abstracts

February 10

Big Data and Seismic Imaging
Prof. Craig Douglas and Xiukun Hu, Department of Mathematics, University of Wyoming

A generation ago, a typical dataset of seismic traces filled enough computer tapes (each roughly a foot in diameter) to load two standard eighteen-wheel trucks to capacity. Processing the tapes down to a single tape took 100 processors approximately 6 months. The data size was part of the problem, but the computation time was the bottleneck. Service companies ran standalone data centers that did nothing but process seismic data for other companies. Today, a typical dataset of traces fits on at most a few high capacity solid state drives (which currently top out at 4 Terabytes in 2.5 inch devices). Processing a dataset still takes weeks on a laptop scale device, but it can be done far more quickly on a typical parallel cluster built from commodity computer nodes. Seismology and Big Data have gone hand in hand since the beginning of data collection for oil and gas exploration. A typical dataset is on the order of multiple Terabytes, from which an image of the subsurface can be calculated. These datasets are no longer state of the art in size by Big Data standards, but they are large enough to provide interesting problems for solution. In this talk, we review a collection of problems encountered and the data intensive scientific discovery techniques that are commonly applied to them.

February 17

On Solving Ill-Conditioned Linear Systems by a Deflation Preconditioner with Multigrid Methods
Bang Huang, Department of Mathematics, University of Wyoming

The talk will focus on solving ill-conditioned linear systems with a deflation preconditioner. Ill-conditioned linear systems, i.e., systems with large condition numbers, are difficult to solve by numerical methods. For an ill-conditioned linear system, slight changes in the coefficient matrix or the right-hand side cause large changes in the solution, and round-off error in computer arithmetic can cause instability when such a system is solved either directly or iteratively. The convergence of iterative methods for ill-conditioned problems, however, can be improved by preconditioning. This research studies a preconditioning strategy that deflates a few isolated extremal eigenvalues. Deflation removes the influence of a subspace of the eigenspace on the iterative process. The most expensive part of the deflation process presented here is solving the linear systems that arise from approximating the eigenvectors. We propose to use geometric multigrid (GMG) methods, as well as abstract multigrid with Krylov subspace methods as smoothers, to solve all of these linear systems.
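As a rough illustration of the deflation idea (not the specific preconditioner or multigrid solvers of the talk), the following Python/NumPy sketch applies the common coarse-correction form of deflation to the conjugate gradient method for a symmetric positive definite system; the routine name deflated_cg and the matrix Z of approximate eigenvectors are our own illustrative choices.

import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def deflated_cg(A, b, Z):
    """Solve A x = b (A symmetric positive definite) by CG deflated with the
    columns of Z, which approximate the eigenvectors of a few isolated
    extremal eigenvalues."""
    n = b.shape[0]
    E = Z.T @ (A @ Z)                      # small coarse matrix E = Z^T A Z
    Einv = np.linalg.inv(E)
    Q = lambda v: Z @ (Einv @ (Z.T @ v))   # coarse correction Q = Z E^{-1} Z^T
    P = lambda v: v - A @ Q(v)             # deflation projector P = I - A Q

    PA = LinearOperator((n, n), matvec=lambda v: P(A @ v))  # deflated operator
    y, info = cg(PA, P(b))                 # CG on the deflated system P A y = P b
    x = Q(b) + (y - Q(A @ y))              # recombine: x = Q b + P^T y
    return x, info

The deflated eigenvalues no longer influence the iteration, so CG converges at a rate governed by the remaining, better conditioned, part of the spectrum.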

February 28 (2:00-3:00 in EN 3114)

Variational Integration Schemes For 2nd-Order IVPs
Russell Johnson, Department of Mathematics, University of Wyoming

We describe a method of numerically approximating solutions to a second-order nonlinear initial value problem. The approximation techniques are based on the Galerkin Finite Element Method and are cast as various single- and multi-step numerical integration schemes. Well-known families of integration schemes are also derived in this variational setting. Preliminary convergence results are presented for a selection of problems. A means of computing adjoint-based a posteriori error estimates for the numerical integration schemes is presented as well.
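As a small, concrete example of an integration scheme that can be derived variationally (from a discrete approximation of the action integral), here is the Störmer-Verlet method applied to a second-order nonlinear IVP; this is an illustration only and is not taken from the talk.

import numpy as np

def stormer_verlet(f, q0, v0, dt, nsteps):
    """Integrate q'' = f(q) with Stoermer-Verlet, a classical variational
    integrator (it arises from a trapezoidal discretization of the action)."""
    q = np.empty(nsteps + 1)
    v = np.empty(nsteps + 1)
    q[0], v[0] = q0, v0
    for k in range(nsteps):
        v_half = v[k] + 0.5 * dt * f(q[k])           # half step in velocity
        q[k + 1] = q[k] + dt * v_half                # full step in position
        v[k + 1] = v_half + 0.5 * dt * f(q[k + 1])   # second half step in velocity
    return q, v

# Nonlinear pendulum q'' = -sin(q) with q(0) = 1, q'(0) = 0
q, v = stormer_verlet(lambda q: -np.sin(q), 1.0, 0.0, dt=0.01, nsteps=1000)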

April 7

Methods for Longitudinal Analysis of Human Microbiome Data
Eugenie Jackson, Department of Statistics, University of Wyoming

The study of the microbial communities that inhabit human body sites has been an important area of research since the beginning of this century. Sequencing technologies have been developed that allow culture-free identification of community members. We now recognize the importance of these communities in the maintenance of good health and in the development of disease. Human Microbiome Projects 1 and 2, supported by the National Institutes of Health, have been instrumental in advancing these studies.

Applying traditional ecological analysis methods to the data has proven problematic. The data are characterized as sparse, high-dimensional, compositional, typically contaminated, and frequently involving more taxa than observations. These features necessitate the development of new methods for exploration and inference.
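To make the compositional difficulty concrete: per-taxon counts carry only relative information, so a standard preprocessing step in the field (shown here purely as an illustration, not a method from the talk) is the centered log-ratio transform, with a pseudocount to cope with the many zeros in sparse microbiome tables.

import numpy as np

counts = np.array([[120, 0, 30, 850],     # rows: samples, columns: taxa
                   [ 40, 5,  0, 955]])
p = counts + 0.5                           # pseudocount so zeros can be logged
p = p / p.sum(axis=1, keepdims=True)       # convert to proportions (compositions)
logp = np.log(p)
clr = logp - logp.mean(axis=1, keepdims=True)   # centered log-ratio transform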

My research explores three related areas. First, the high dimensionality of the data, coupled with the tendency of a single taxon to dominate an observation, influenced the development of a new visualization method for data exploration. I present this method with several examples, including a longitudinal application, since our interest in the human microbiome extends beyond the static case. Second, I present a literature survey of longitudinal analyses that have been performed on human (vaginal) microbiome data, discussing methods, pitfalls, and results. Finally, I present a method based on a hierarchical Bayesian model for inferring a co-occurrence structure (BioMiCo; Shafiei et al., 2015) and my efforts to modify the model to account for within-subject variability over time.

April 14

Homogenization of Stochastic and Heterogeneous Models in Porous Media
Prof. Hakima Bessaih, Department of Mathematics, University of Wyoming

We study models with microscale properties that are highly heterogeneous in space and time. The time variations are controlled by a stochastic particle dynamics described by a stochastic differential equation (SDE). Our main results include the derivation of macroscale equations and a proof that the macroscale equations are deterministic. The macroscale equations are derived through an averaging principle for the slow motion (fluid velocity) with respect to the fast motion (particle dynamics), together with an averaging of certain coefficients with respect to the space variable. Our results cover some Brinkman flows and can be extended to more general nonlinear diffusion equations with heterogeneous coefficients.
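In schematic form (for a generic slow-fast system, not the specific Brinkman model of the talk), the averaging principle can be stated as follows:

\[
dX^\varepsilon_t = f(X^\varepsilon_t, Y^\varepsilon_t)\,dt, \qquad
dY^\varepsilon_t = \frac{1}{\varepsilon}\, g(X^\varepsilon_t, Y^\varepsilon_t)\,dt
                 + \frac{1}{\sqrt{\varepsilon}}\, \sigma(X^\varepsilon_t, Y^\varepsilon_t)\,dW_t,
\]

and as \(\varepsilon \to 0\) the slow component converges to the deterministic averaged equation

\[
\frac{d\bar{X}_t}{dt} = \bar{f}(\bar{X}_t), \qquad
\bar{f}(x) = \int f(x, y)\, \mu_x(dy),
\]

where \(\mu_x\) is the invariant measure of the fast dynamics with the slow variable frozen at \(x\).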

April 21

Calculating Coefficients of the Generalized Langevin Model for Turbulent Flows from DNS Data
Grigorii Sarnitskii, Department of Mathematics, University of Wyoming

The talk will focus on the inverse problem for a stochastic turbulence model. The model at hand, the Generalized Langevin Model, describes the motion of fluid particles (the Lagrangian point of view), with the velocity of a fluid particle governed by a stochastic differential equation. We propose a method to calculate the coefficients of the model from numerical solutions of the Navier–Stokes equations. This is the topic of my ongoing research, so only partial results will be presented.
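A toy version of this kind of inverse problem (a one-dimensional Ornstein-Uhlenbeck velocity model rather than the full Generalized Langevin Model, with made-up coefficients) looks like the following: take a velocity time series and recover the drift and diffusion coefficients by regression on the increments.

import numpy as np

rng = np.random.default_rng(0)

# Toy model: du = -a u dt + b dW (a crude one-dimensional stand-in)
a_true, b_true, dt, n = 2.0, 0.7, 1e-3, 100_000
u = np.empty(n)
u[0] = 0.0
for k in range(n - 1):
    u[k + 1] = u[k] - a_true * u[k] * dt + b_true * np.sqrt(dt) * rng.standard_normal()

# Estimate the drift coefficient a by least squares of the increments on u,
# and the diffusion coefficient b from the variance of the residuals.
du = np.diff(u)
a_hat = -np.sum(u[:-1] * du) / (np.sum(u[:-1] ** 2) * dt)
b_hat = np.sqrt(np.mean((du + a_hat * u[:-1] * dt) ** 2) / dt)
print(a_hat, b_hat)   # should be close to a_true and b_true

In practice the increments would come from fluid-particle trajectories extracted from the Navier–Stokes (DNS) solutions rather than from a simulated toy model.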

April 28

GPU Accelerated Computing and Magnetic Resonance Imaging
Xiukun Hu, Department of Mathematics, University of Wyoming

The term GPU stands for graphics processing unit. A GPU is designed to accelerate the creation of images on a computer, mobile phone, or even a wristwatch. People have been trying to do general purpose computing on GPUs since late last century. After 2001, the advent of programmable shaders, followed by general purpose programming frameworks (CUDA, OpenCL, etc.) and IEEE floating point support on GPUs, made GPU computing popular and considerably more practical. Due to huge improvements in GPU hardware capabilities and programmability, GPUs are now used in government labs, universities, enterprises, and small and medium sized businesses around the world.

In this talk, we will describe the hardware architecture of a GPU, explore the conditions under which GPU computing becomes preferable, and discuss the hardware reasons behind them. We will study a specific example from advanced magnetic resonance imaging reconstruction algorithms to see how GPU computing is implemented in practice.
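As a simplified illustration of the flavor of GPU computing (not the talk's reconstruction algorithm), basic Cartesian MRI reconstruction is, at its core, an inverse 2-D FFT of the sampled k-space data; with a NumPy-like GPU library such as CuPy, the transform runs on data resident in device memory.

import numpy as np
import cupy as cp   # GPU arrays with a NumPy-like interface

# Synthetic stand-in for sampled k-space data
kspace_cpu = (np.random.standard_normal((256, 256))
              + 1j * np.random.standard_normal((256, 256)))

kspace_gpu = cp.asarray(kspace_cpu)                                      # host -> device
image_gpu = cp.fft.fftshift(cp.fft.ifft2(cp.fft.ifftshift(kspace_gpu)))  # FFT on the GPU
image = cp.asnumpy(cp.abs(image_gpu))                                    # device -> host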

May 5

A Finite Element Method (FEM) Perturbation Expansion for a Coupled Structural-Acoustic System
Dr. Li Deng Douglas, Department of Mathematics, University of Wyoming

The structural-acoustic coupled vibration problem is very important in many engineering applications, such as quality control of vehicles. Formulating the problem using the FEM leads to a nonsymmetric generalized eigenvalue problem. We show that the problem can be reformulated into uncoupled structural and acoustic problems by introducing a coupling strength parameter ε as a multiplier on the off-diagonal coupling terms. The discretized uncoupled problems then lead to a pair of symmetric generalized eigenvalue problems which can be solved efficiently and independently. The solutions of the uncoupled problems are then used to compute the coupled solution via the perturbation method in the coupling strength parameter. We confirm the adequacy of the method with numerical examples on a two-dimensional uniform mesh, whose exact solution is known, as well as on arbitrary meshes for a car-like example.
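The following Python/NumPy sketch illustrates the perturbation idea on a small symmetric toy problem (it is not the nonsymmetric structural-acoustic FEM formulation of the talk, and the matrices are random stand-ins for FEM blocks): the uncoupled symmetric pencils are solved independently, and a coupled eigenvalue is then approximated from those solutions by a second-order expansion in the coupling strength ε.

import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(1)

def spd(n):
    """Small random symmetric positive definite matrix (toy FEM block)."""
    X = rng.standard_normal((n, n))
    return X @ X.T + n * np.eye(n)

ns, na = 4, 3                       # "structural" and "acoustic" block sizes
Ks, Ms = spd(ns), spd(ns)           # structural stiffness and mass
Ka, Ma = spd(na), spd(na)           # acoustic stiffness and mass
C = rng.standard_normal((ns, na))   # coupling block

# epsilon = 0: two independent symmetric generalized eigenproblems
lam_s, Xs = eigh(Ks, Ms)            # eigh returns mass-orthonormal eigenvectors
lam_a, Xa = eigh(Ka, Ma)

# Full pencil with the coupling confined to the off-diagonal blocks (scaled by eps)
Zsa, Zas = np.zeros((ns, na)), np.zeros((na, ns))
K0 = np.block([[Ks, Zsa], [Zas, Ka]])
M0 = np.block([[Ms, Zsa], [Zas, Ma]])
dK = np.block([[np.zeros((ns, ns)), C], [C.T, np.zeros((na, na))]])

lam0 = np.concatenate([lam_s, lam_a])       # all uncoupled eigenvalues
X0 = np.block([[Xs, Zsa], [Zas, Xa]])       # uncoupled eigenvectors, zero-padded

# Second-order expansion of a simple eigenvalue i:
#   lam_i(eps) ~ lam_i + eps * g_i + eps^2 * sum_{j != i} g_j^2 / (lam_i - lam_j),
# where g = X0^T dK x_i (the first-order term vanishes here because dK is
# purely off-diagonal and x_i lives in a single block).
i, eps = 0, 0.1
g = X0.T @ (dK @ X0[:, i])
lam2 = sum(g[j] ** 2 / (lam0[i] - lam0[j]) for j in range(len(lam0)) if j != i)
lam_approx = lam0[i] + eps * g[i] + eps ** 2 * lam2

# Reference: solve the coupled pencil directly at the same eps
lam_full = eigh(K0 + eps * dK, M0, eigvals_only=True)
print(lam_approx, lam_full[np.argmin(np.abs(lam_full - lam_approx))])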


This web page is maintained by Prof. Craig C. Douglas

Last modified: