Title 1: Markov processes with darning and their approximations

Speaker: Zhenqing Chen, University of Washington

Abstract: This talk is concerned with darning of general symmetric Markov processes by shorting some parts of the state space into singletons. We will present a natural way to construct such processes using a Dirichlet form approach. When the initial processes have discontinuous sample paths, the processes constructed in this way are genuine extensions of those studied in Chen and Fukushima (2012). We further show that, up to a time change, these Markov processes with darning can be approximated, in the finite-dimensional sense, by adding to the original Markov processes jumps of large intensity among the compact sets that are to be collapsed into singletons. For diffusion processes, it is also possible to obtain, up to a time change, diffusions with darning by increasing the conductance on these compact sets to infinity. To accomplish this, we extend the semigroup characterization of Mosco convergence to closed symmetric forms whose domain of definition may not be dense in the $L^2$-space. The latter result is of independent interest and is potentially useful in the study of convergence of Markov processes having different state spaces. Based on joint work with Jun Peng.

Title 2: Distances between Random Orthogonal Matrices and Independent Normals

Speaker: Tiefeng Jiang, University of Minnesota

Title 3: Regularization of deterministic systems by different types of noise

Speaker: Yingchao Xie, Jiangsu Normal University. Professor Yingchao Xie is a professor in the School of Mathematics and Statistics at Jiangsu Normal University and a doctoral supervisor in the School of Mathematical Sciences at Nankai University. Professor Xie has received numerous honors and awards, including the Special Government Allowance of the People's Republic of China, the title of National Model Teacher, leadership of the Jiangsu provincial priority discipline "Statistics", and a Third Prize of the Jiangsu Science and Technology Progress Award. Professor Xie's research focuses on stochastic analysis and its applications, stochastic partial differential equations, and limit theorems for stochastic processes, and has produced a series of important results, with more than 60 papers published in leading domestic and international journals such as Stochastic Processes and Their Applications, Stochastic Analysis and Applications, and Science China, along with multiple consecutive General Program grants from the National Natural Science Foundation of China.

We show the existence and uniqueness of strong solutions for stochastic differential equations driven partly by $\alpha$-stable noise and partly by Brownian noise, with singular coefficients. The proof is based on the regularity of a degenerate Kolmogorov equation of mixed type. Based on joint work with Yueling Li and Longjie Xie.
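As a rough numerical illustration of this kind of mixed driving noise (a sketch under our own assumptions, not the construction in the talk: the drift `b`, dimensions, and parameter values below are all made up for the example), one can simulate a two-dimensional Euler-Maruyama scheme in which the first component is driven by Brownian noise and the second by symmetric $\alpha$-stable noise:

```python
import numpy as np

# Hedged sketch: toy 2-d system with a smooth drift, first component driven
# by Brownian noise, second by symmetric alpha-stable noise (the talk treats
# singular coefficients; here we use a smooth drift just to show the setup).

rng = np.random.default_rng(0)

def stable_increment(alpha, dt, size, rng):
    """Symmetric alpha-stable increments via the Chambers-Mallows-Stuck method."""
    U = rng.uniform(-np.pi / 2, np.pi / 2, size)
    W = rng.exponential(1.0, size)
    X = (np.sin(alpha * U) / np.cos(U) ** (1 / alpha)
         * (np.cos(U - alpha * U) / W) ** ((1 - alpha) / alpha))
    return dt ** (1 / alpha) * X      # self-similarity scaling over a step dt

alpha, T, N = 1.5, 1.0, 1000
dt = T / N
x = np.zeros((N + 1, 2))
b = lambda y: np.array([-y[0] + y[1], -y[1]])   # hypothetical toy drift

dB = np.sqrt(dt) * rng.normal(size=N)           # Brownian increments
dL = stable_increment(alpha, dt, N, rng)        # alpha-stable increments
for k in range(N):
    x[k + 1] = x[k] + b(x[k]) * dt + np.array([dB[k], dL[k]])
```

The heavy tails of the $\alpha$-stable component show up as occasional large jumps in the second coordinate of the simulated path, while the first coordinate moves diffusively.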

Title 4: The Censored Markov Chain and the Best Augmentation

Speaker: Yiqiang Q. Zhao, Carleton University

Abstract: Computationally, when we solve for the stationary probabilities of a countable-state Markov chain, the transition probability matrix of the Markov chain has to be truncated, in some way, into a finite one. Various augmentation methods may be valid, in the sense that the stationary probability distribution of the truncated Markov chain approaches that of the countable Markov chain as the truncation size gets large. In this talk, we introduce the censored Markov chain, one such truncation method, and prove that the censored (watched) Markov chain provides the best approximation in the sense that, for a given truncation size, the sum of errors is minimal. We also show, by examples, that the method of augmenting the last column only is not always the best.
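For a finite chain the censored transition matrix can be written down explicitly, which makes the construction easy to experiment with. The following sketch (our own illustration, not from the talk: the birth-death chain and its parameters are invented for the example) censors a small birth-death chain on a subset $E$ and checks the classical fact that the censored chain's stationary distribution is the original stationary distribution restricted to $E$ and renormalized:

```python
import numpy as np

# Illustrative sketch: censor a finite birth-death chain on E = {0,...,m-1}.
# Censored transition matrix: P_E = P(E,E) + P(E,E^c) (I - P(E^c,E^c))^{-1} P(E^c,E).

n, p, q = 8, 0.4, 0.6
P = np.zeros((n, n))
for i in range(n):
    if i + 1 < n:
        P[i, i + 1] = p
    if i > 0:
        P[i, i - 1] = q
    P[i, i] = 1.0 - P[i].sum()     # reflect remaining mass at the boundaries

def stationary(M):
    """Solve pi M = pi with sum(pi) = 1 by least squares."""
    k = M.shape[0]
    A = np.vstack([M.T - np.eye(k), np.ones(k)])
    b = np.zeros(k + 1); b[-1] = 1.0
    return np.linalg.lstsq(A, b, rcond=None)[0]

pi = stationary(P)

m = 4                               # truncation size
E, Ec = np.arange(m), np.arange(m, n)
P_E = P[np.ix_(E, E)] + P[np.ix_(E, Ec)] @ np.linalg.inv(
    np.eye(n - m) - P[np.ix_(Ec, Ec)]) @ P[np.ix_(Ec, E)]

pi_E = stationary(P_E)
print(np.allclose(pi_E, pi[:m] / pi[:m].sum()))  # True
```

For a genuinely countable-state chain the matrix $(I - P(E^c, E^c))^{-1}$ is not directly computable, which is exactly why the various finite augmentation schemes compared in the talk are needed in practice.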