Past Probability Seminars Spring 2020
Spring 2014
Thursdays in 901 Van Vleck Hall at 2:25 PM, unless otherwise noted.
If you would like to sign up for the email list to receive seminar announcements, please send an email to
Thursday, January 23, CANCELED--NO SEMINAR
Thursday, February 6, Jay Newby, Mathematical Biosciences Institute
Title: Applications of large deviation theory in neuroscience
Abstract: The membrane voltage of a neuron is modeled with a piecewise-deterministic stochastic process. The membrane voltage changes deterministically while the population of open ion channels, which allow current to flow across the membrane, is constant. Ion channels open and close randomly, and the transition rates depend on voltage, making the process nonlinear. In the limit of infinite transition rates, the process becomes deterministic; this deterministic limit is the well-known Morris-Lecar model. Under certain conditions, the deterministic process has one stable fixed point and is excitable. An excitable event, called an action potential, is a single large transient spike in voltage that eventually returns to the stable steady state. I will discuss recent developments in large deviation theory used to study noise-induced action potentials.
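To make the modeling framework concrete, here is a minimal sketch of a piecewise-deterministic process in the same spirit (a toy of my own, not the Morris-Lecar model from the talk; the drift, the switching rates, and all parameters are hypothetical): the voltage drifts deterministically while a single channel state is fixed, and the channel flips at voltage-dependent rates.

<syntaxhighlight lang="python">
import numpy as np

# Toy piecewise-deterministic process (illustrative only, not Morris-Lecar):
# the voltage v drifts deterministically while a single channel state n stays
# fixed, and n flips 0 <-> 1 at voltage-dependent rates.  The drift, the rate
# functions, and all parameters below are hypothetical.

def simulate(T=200.0, dt=0.01, seed=0):
    rng = np.random.default_rng(seed)
    v, n = -1.0, 0
    trace = []
    for _ in range(int(T / dt)):
        v += dt * (-v + 3.0 * n)                      # leak toward rest, extra current when open
        rate = np.exp(v) if n == 0 else np.exp(-v)    # hypothetical voltage-dependent switching rates
        if rng.random() < rate * dt:                  # small-dt approximation of the jump clock
            n = 1 - n
        trace.append(v)
    return np.array(trace)

voltage = simulate()
print(voltage.min(), voltage.max())
</syntaxhighlight>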
Thursday, February 13, Diane Holcomb, UW-Madison
Title: Large deviations for point process limits of random matrices.
Abstract: The Gaussian Unitary Ensemble (GUE) is one of the most studied Hermitian random matrix models. When appropriately rescaled, the eigenvalues in the bulk of the spectrum converge to a translation-invariant limiting point process called the Sine process. On large intervals one expects the Sine process to have a number of points that is roughly the length of the interval times a fixed constant (the density of the process). We solve the large deviation problem, which asks for the asymptotic probability of seeing a different density in a large interval as the size of the interval tends to infinity. Our proof works for a one-parameter family of models called beta-ensembles, which contains the Gaussian orthogonal, unitary and symplectic ensembles as special cases.
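As a quick numerical illustration of the rescaling described above (my own sketch; the centering at 0 and the normalising factor sqrt(n)/pi are assumptions tied to the particular variance convention used below), one can sample a GUE matrix, rescale the eigenvalues near the middle of the spectrum so the mean spacing is one, and check that an interval of length L contains roughly L points. The talk's large deviation result quantifies how unlikely a noticeably different density is as the interval grows.

<syntaxhighlight lang="python">
import numpy as np

def gue_eigs(n, rng):
    """Eigenvalues of an n x n GUE matrix (one common normalisation)."""
    a = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    h = (a + a.conj().T) / 2
    return np.linalg.eigvalsh(h)

rng = np.random.default_rng(1)
n = 1000
lam = gue_eigs(n, rng)

# Rescale the bulk near 0 so that the mean spacing is one.  The factor
# sqrt(n)/pi is the semicircle density at 0 for this normalisation
# (an assumption; other variance conventions change the constant).
bulk = lam * np.sqrt(n) / np.pi

for L in (10, 20, 40):
    count = np.sum(np.abs(bulk) < L / 2)   # points in an interval of length L
    print(L, count)                        # roughly L points expected
</syntaxhighlight>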
Thursday, February 20, Philip Matchett Wood, UW-Madison
Title: The empirical spectral distribution (ESD) of a fixed matrix plus small random noise.
Abstract: A fixed matrix has a distribution of eigenvalues in the complex plane. Small random noise can be formed by a random matrix with iid mean-0, variance-1 entries scaled by [math]\displaystyle{ n^{-\gamma - 1/2} }[/math] for [math]\displaystyle{ \gamma > 0 }[/math], which by itself has eigenvalues collapsing to the origin. What happens to the eigenvalues when you add such a small random noise matrix to the fixed matrix? There are interesting cases where the eigenvalue distribution is known to change dramatically when small Gaussian random noise is added, and this talk will focus on what happens when the noise is not Gaussian.
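A small experiment in this direction (the fixed matrix below, a nilpotent shift whose eigenvalues are all 0, is my own illustrative choice, and the noise here is Gaussian, i.e. the already-understood case mentioned in the abstract): adding noise at scale n^(-gamma-1/2) moves the spectrum far away from the origin rather than leaving it collapsed there.

<syntaxhighlight lang="python">
import numpy as np

n, gamma = 500, 0.5
rng = np.random.default_rng(2)

A = np.diag(np.ones(n - 1), k=1)        # nilpotent shift matrix: all eigenvalues are 0
E = rng.normal(size=(n, n))             # iid mean-0, variance-1 (Gaussian) noise

eigs = np.linalg.eigvals(A + n ** (-gamma - 0.5) * E)
print(np.abs(eigs).min(), np.abs(eigs).max())   # spectrum spreads out, far from the origin
</syntaxhighlight>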
Thursday, February 27, Jack Hanson, Indiana University Bloomington
Title: Subdiffusive Fluctuations in First-Passage Percolation
Abstract: First-passage percolation is a model consisting of a random metric t(x,y) generated by random variables associated to edges of a graph. Many questions and conjectures in this model revolve around the fluctuating properties of this metric on the graph Z^d. In the early 1990s, Kesten showed an upper bound of Cn for the variance of t(0,nx); this was improved to Cn/log(n) by Benjamini-Kalai-Schramm and Benaim-Rossignol for particular choices of distribution. I will discuss recent work (with M. Damron and P. Sosoe) extending this upper bound to general classes of distributions.
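For concreteness, here is a toy simulation of the model (my own sketch; the exponential edge-weight distribution and the restriction of paths to a finite box are arbitrary choices, not from the talk): iid weights sit on the nearest-neighbour edges of a box in Z^2, and the passage time t(0,(n,0)) is the lightest total weight over paths, computed with Dijkstra's algorithm. Estimating how the variance of this quantity grows with n is exactly what the bounds above concern.

<syntaxhighlight lang="python">
import heapq
import numpy as np

def passage_time(n, rng):
    """Dijkstra on the box {0,...,n}^2 with iid exponential edge weights."""
    m = n + 1
    right = rng.exponential(size=(m, m))   # right[i, j]: weight of edge (i, j)-(i+1, j); last row unused
    up = rng.exponential(size=(m, m))      # up[i, j]:    weight of edge (i, j)-(i, j+1); last column unused
    dist = np.full((m, m), np.inf)
    dist[0, 0] = 0.0
    heap = [(0.0, 0, 0)]
    while heap:
        d, i, j = heapq.heappop(heap)
        if d > dist[i, j]:
            continue
        nbrs = []
        if i + 1 < m: nbrs.append((i + 1, j, right[i, j]))
        if i > 0:     nbrs.append((i - 1, j, right[i - 1, j]))
        if j + 1 < m: nbrs.append((i, j + 1, up[i, j]))
        if j > 0:     nbrs.append((i, j - 1, up[i, j - 1]))
        for a, b, w in nbrs:
            if d + w < dist[a, b]:
                dist[a, b] = d + w
                heapq.heappush(heap, (d + w, a, b))
    return dist[n, 0]   # t(0, (n, 0)), restricted to paths inside the box

rng = np.random.default_rng(0)
samples = [passage_time(30, rng) for _ in range(20)]
print(np.mean(samples), np.var(samples))   # the talk concerns how this variance grows with n
</syntaxhighlight>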
Thursday, March 6, TBA
Thursday, March 13, TBA
Thursday, March 20, No Seminar due to Spring Break
Thursday, March 27, Cécile Ané, UW-Madison Department of Statistics
Title: Application of a birth-death process to model gene gains and losses on a phylogenetic tree
Thursday, April 3, TBA
Thursday, April 10, Dan Romik, UC Davis
Thursday, April 17, TBA
Thursday, April 24, TBA
Thursday, May 1, Antonio Auffinger, University of Chicago
Thursday, May 8, TBA