Probability Seminar

Fall 2020

Thursdays in 901 Van Vleck Hall at 2:30 PM, unless otherwise noted. We usually end for questions at 3:20 PM.

IMPORTANT: In Fall 2020 the seminar is being run online. ZOOM LINK

If you would like to sign up for the email list to receive seminar announcements, please join our group.

September 17, 2020, Boris Hanin (Princeton and Texas A&M)

Pre-Talk: (1:00pm)

Neural Networks for Probabilists

Deep neural networks are a centerpiece in modern machine learning. They are also fascinating probabilistic models, about which much remains unclear. In this pre-talk I will define neural networks, explain how they are used in practice, and give a survey of the big theoretical questions they have raised. If time permits, I will also explain how neural networks are related to a variety of classical areas in probability and mathematical physics, including random matrix theory, optimal transport, and combinatorics of hyperplane arrangements.

Talk: (2:30pm)

Effective Theory of Deep Neural Networks

Deep neural networks are often considered to be complicated "black boxes," for which a full systematic analysis is not only out of reach but also impossible. In this talk, which is based on ongoing joint work with Sho Yaida and Daniel Adam Roberts, I will make the opposite claim. Namely, that deep neural networks with random weights and biases are exactly solvable models. Our approach applies to networks at finite width n and large depth L, the regime in which they are used in practice. A key point will be the emergence of a notion of "criticality," which involves a fine-tuning of model parameters (weight and bias variances). At criticality, neural networks are particularly well-behaved but still exhibit a tension between large values for n and L, with large values of n tending to make neural networks more like Gaussian processes and large values of L amplifying higher cumulants. Our analysis at initialization also has many consequences for networks during and after training, which I will discuss if time permits.
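
As a rough numerical companion to the notion of criticality mentioned above (a toy sketch with an i.i.d. Gaussian ReLU network, assuming numpy; it is not the effective theory of the talk, and the tuning $\sigma_w^2 = 2$, $\sigma_b^2 = 0$ is simply the standard critical choice for ReLU): one can watch how the typical activation size behaves with depth at, below, and above the critical weight variance.

import numpy as np

def relu_norms(width, depth, sigma_w2, sigma_b2=0.0, seed=0):
    # Random fully connected ReLU network at initialization; returns the
    # root-mean-square activation after each layer.
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(width)
    norms = []
    for _ in range(depth):
        W = rng.standard_normal((width, width)) * np.sqrt(sigma_w2 / width)
        b = rng.standard_normal(width) * np.sqrt(sigma_b2)
        x = np.maximum(W @ x + b, 0.0)          # ReLU layer
        norms.append(float(np.linalg.norm(x)) / np.sqrt(width))
    return norms

# sigma_w2 = 2.0 (with sigma_b2 = 0) is the usual critical tuning for ReLU:
# the activation scale stays roughly constant in depth, instead of decaying
# or growing exponentially as it does off criticality.
for sigma_w2 in (1.5, 2.0, 2.5):
    print(sigma_w2, round(relu_norms(500, 50, sigma_w2)[-1], 4))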

September 24, 2020, Neil O'Connell (Dublin)

Some new perspectives on moments of random matrices

The study of 'moments' of random matrices (expectations of traces of powers of the matrix) is a rich and interesting subject, with fascinating connections to enumerative geometry, as discovered by Harer and Zagier in the 1980s. I will give some background on this and then describe some recent work which offers some new perspectives (and new results). This talk is based on joint work with Fabio Deelan Cunden, Francesco Mezzadri and Nick Simm.
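
As a concrete sanity check of what 'moments' means here (a minimal Monte Carlo sketch assuming numpy, not material from the talk): for the GUE normalized so that the spectrum fills $[-2,2]$, the normalized moment $\frac{1}{N}\mathbb{E}\,\mathrm{Tr}\,M^{2k}$ converges to the Catalan number $C_k$ as $N$ grows, with the Harer-Zagier formula describing the finite-$N$ corrections; the sketch only checks the leading term.

import numpy as np
from math import comb

def gue(n, rng):
    # GUE matrix normalized so its empirical spectrum is close to the
    # semicircle law on [-2, 2].
    a = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return (a + a.conj().T) / (2.0 * np.sqrt(n))

rng = np.random.default_rng(1)
n, samples = 200, 50
moments = np.zeros(4)
for _ in range(samples):
    eig = np.linalg.eigvalsh(gue(n, rng))
    for k in range(1, 5):
        moments[k - 1] += np.mean(eig ** (2 * k)) / samples   # (1/N) Tr M^{2k}

catalan = [comb(2 * k, k) // (k + 1) for k in range(1, 5)]
print("Monte Carlo moments:", np.round(moments, 3))
print("Catalan numbers    :", catalan)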

October 1, 2020, Marcus Michelen (UIC)

Roots of random polynomials near the unit circle

It is a well-known (but perhaps surprising) fact that a polynomial with independent random coefficients has most of its roots very close to the unit circle. Using a probabilistic perspective, we understand the behavior of roots of random polynomials exceptionally close to the unit circle and prove several limit theorems; these results resolve several conjectures of Shepp and Vanderbei. We will also discuss how our techniques provide a heuristic, probabilistic explanation for why random polynomials tend to have most roots near the unit circle. Based on joint work with Julian Sahasrabudhe.
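
A quick way to see the clustering described in the first sentence (a minimal sketch assuming numpy; the techniques of the talk are different): sample a polynomial with i.i.d. Gaussian coefficients, compute its roots, and look at how far their moduli are from 1.

import numpy as np

rng = np.random.default_rng(0)
n = 300                                     # degree of the polynomial
coeffs = rng.standard_normal(n + 1)         # i.i.d. Gaussian coefficients
roots = np.roots(coeffs)
dist = np.abs(np.abs(roots) - 1.0)          # distance of |root| from 1

print("median distance of |root| from 1:", np.median(dist))
print("fraction of roots with | |root| - 1 | < 5/n:", np.mean(dist < 5.0 / n))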

October 8, 2020, Subhabrata Sen (Harvard)

Large deviations for dense random graphs: beyond mean-field

In a seminal paper, Chatterjee and Varadhan derived a large deviation principle (LDP) for the Erdős-Rényi random graph, viewed as a random graphon. This directly provides LDPs for continuous functionals such as subgraph counts, spectral norms, etc. In contrast, very little is understood about this problem if the underlying random graph is inhomogeneous or constrained.

In this talk, we will explore large deviations for dense random graphs, beyond the “mean-field” setting. In particular, we will study large deviations for uniform random graphs with given degrees, and a family of dense block model random graphs. We will establish the LDP in each case, and identify the rate function. In the block model setting, we will use this LDP to study the upper tail problem for homomorphism densities of regular subgraphs. Our results establish that this problem exhibits a symmetry/symmetry-breaking transition, similar to the one observed for Erdős-Rényi random graphs.

Based on joint works with Christian Borgs, Jennifer Chayes, Souvik Dhara, Julia Gaudio and Samantha Petti.

October 15, 2020, Philippe Sosoe (Cornell)

Title: Concentration in integrable polymer models

I will discuss a general method, applicable to all known integrable stationary polymer models, to obtain nearly optimal bounds on the central moments of the partition function and the occupation lengths for each level of the polymer system. The method was developed for the O'Connell-Yor polymer, but was subsequently extended to discrete integrable polymers. As an application, we obtain localization of the OY polymer paths along a straight line on the scale O(n^{2/3+o(1)}).

Joint work with Christian Noack.
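
For readers unfamiliar with the objects in this abstract, here is a minimal sketch of a related but simpler model (a discrete directed lattice polymer, assuming numpy; it is not the semi-discrete O'Connell-Yor polymer of the talk): the point-to-point partition function $Z_n$ sums $e^{\beta \cdot (\text{path weight})}$ over up-right paths, its logarithm can be computed by dynamic programming, and the spread of $\log Z_n$ across samples is the kind of fluctuation that concentration results of this type control.

import numpy as np

def log_partition(n, beta, rng):
    # log Z(n, n) for a directed polymer with i.i.d. Gaussian site weights,
    # summing over up-right paths from (0, 0) to (n, n).
    w = rng.standard_normal((n + 1, n + 1))
    logZ = np.full((n + 1, n + 1), -np.inf)
    logZ[0, 0] = beta * w[0, 0]
    for i in range(n + 1):
        for j in range(n + 1):
            if i == j == 0:
                continue
            prev = []
            if i > 0:
                prev.append(logZ[i - 1, j])
            if j > 0:
                prev.append(logZ[i, j - 1])
            logZ[i, j] = np.logaddexp.reduce(prev) + beta * w[i, j]
    return logZ[n, n]

rng = np.random.default_rng(2)
samples = [log_partition(100, 0.5, rng) for _ in range(20)]
print("mean of log Z:", np.mean(samples), "  std of log Z:", np.std(samples))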

October 22, 2020, Balint Virag (Toronto)

Title: The heat and the landscape

Abstract: The directed landscape is the conjectured universal scaling limit of the most common random planar metrics. Examples are planar first passage percolation, directed last passage percolation, distances in percolation clusters, random polymer models, and exclusion processes. The limit laws of distances of objects are given by the KPZ fixed point.

We show that the KPZ fixed point is characterized by the Baik-Ben Arous-Péché statistics well-known from random matrix theory.

This provides a general and elementary method for showing convergence to the KPZ fixed point. We apply this method to two models related to random heat flow: the O'Connell-Yor polymer and the KPZ equation.

Note: there will be a follow-up talk with details about the proofs at 11am, Friday, October 23.

October 29, 2020, Yun Li (UW-Madison)

Title: Operator level hard-to-soft transition for β-ensembles

Abstract: It was shown that the soft and hard edge scaling limits of β-ensembles can be characterized as the spectra of certain random Sturm-Liouville operators. By tuning the parameter of the hard edge process one can obtain the soft edge process as a scaling limit. In this talk, I will present the corresponding limit on the level of the operators. This talk is based on joint work with Laure Dumaz and Benedek Valkó.

November 5, 2020, Sayan Banerjee (UNC at Chapel Hill)

Title: Persistence and root detection algorithms in growing networks

Abstract: Motivated by questions in Network Archaeology, we investigate statistics of dynamic networks that are persistent, that is, they fixate almost surely after some random time as the network grows. We consider generalized attachment models of network growth where at each time $n$, an incoming vertex attaches itself to the network through $m_n$ edges attached one-by-one to existing vertices with probability proportional to an arbitrary function $f$ of their degree. We identify the class of attachment functions $f$ for which the maximal degree vertex persists and obtain asymptotics for its index when it does not. We also show that for tree networks, the centroid of the tree persists and use it to devise polynomial-time root-finding algorithms and quantify their efficacy. Our methods rely on an interplay between dynamic random networks and their continuous time embeddings.

This is joint work with Shankar Bhamidi.
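
A minimal simulation of the growth model described above, in the tree case $m_n = 1$ and with an illustrative choice of $f$ (a sketch assuming numpy, not the authors' algorithms): each incoming vertex picks its parent with probability proportional to $f(\text{degree})$, and one can track whether the identity of the maximal-degree vertex stops changing.

import numpy as np

def grow_tree(n, f, seed=0):
    # Generalized attachment tree: vertex v attaches to an existing vertex
    # chosen with probability proportional to f(degree).  Setting the root's
    # initial degree to 1 is a convention.  Returns the index of the
    # maximal-degree vertex after each attachment step.
    rng = np.random.default_rng(seed)
    degree = np.zeros(n, dtype=float)
    degree[0] = 1.0
    leaders = []
    for v in range(1, n):
        weights = f(degree[:v])
        parent = rng.choice(v, p=weights / weights.sum())
        degree[parent] += 1.0
        degree[v] = 1.0
        leaders.append(int(np.argmax(degree[: v + 1])))
    return leaders

# Illustrative superlinear choice of f; in simulations the leader settles on
# one vertex early.  The talk characterizes exactly which f give persistence.
leaders = grow_tree(5000, lambda d: d ** 1.2)
print("maximal-degree vertex over the last 5 steps:", leaders[-5:])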

November 12, 2020, Alexander Dunlap (NYU Courant Institute)

Title: A forward-backward SDE from the 2D nonlinear stochastic heat equation

Abstract: I will discuss a two-dimensional stochastic heat equation in the weak noise regime with a nonlinear noise strength. I will explain how pointwise statistics of solutions to this equation, as the correlation length of the noise is taken to 0 but the noise is attenuated by a logarithmic factor, can be related to a forward-backward stochastic differential equation (FBSDE) depending on the nonlinearity. In the linear case, the FBSDE can be explicitly solved and we recover results of Caravenna, Sun, and Zygouras. Joint work with Yu Gu (CMU).

November 19, 2020, Jian Ding (University of Pennsylvania)

Title: Correlation length of two-dimensional random field Ising model via greedy lattice animal

Abstract: In this talk, I will discuss the two-dimensional random field Ising model where the disorder is given by i.i.d. mean-zero Gaussian variables with small variance. In particular, I will present a recent joint work with Mateo Wirth on (one notion of) the correlation length, which is the critical size of the box at which the influences on the spin magnetization from the boundary conditions and from the random field are comparable. Our work draws a connection to the greedy lattice animal normalized by the boundary size.

December 3, 2020, Tatyana Shcherbina (UW-Madison)

Title: SUSY transfer matrix approach for the real symmetric 1d random band matrices

Abstract: Random band matrices (RBM) are natural intermediate models to study eigenvalue statistics and quantum propagation in disordered systems, since they interpolate between mean-field type Wigner matrices and random Schrödinger operators. In particular, RBM can be used to model the Anderson metal-insulator phase transition. The conjecture states that the eigenvectors of $N\times N$ RBM are completely delocalized and the local spectral statistics are governed by Wigner-Dyson statistics for large bandwidth $W$ (i.e. the local behavior is the same as for Wigner matrices), and by Poisson statistics for small $W$ (with exponentially localized eigenvectors). The transition is conjectured to be sharp and for RBM in one spatial dimension occurs around the critical value $W=\sqrt{N}$. Recently, we proved the universality of the correlation functions for the whole delocalized region $W\gg \sqrt{N}$ for a certain type of Hermitian Gaussian RBM. This result was obtained by an application of the supersymmetric method (SUSY) combined with the transfer matrix approach. In this talk I am going to discuss how these techniques can be adapted to the real symmetric case.
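
To fix the model concretely (a sketch assuming numpy, with a simple variance normalization; it does not touch the SUSY transfer matrix machinery of the talk): a real symmetric 1d band matrix has independent Gaussian entries $H_{jk}$ for $|j-k|\le W$ and zeros elsewhere, and the inverse participation ratio of an eigenvector is a crude diagnostic for comparing localization at small and large $W$.

import numpy as np

def band_matrix(n, w, rng):
    # Real symmetric matrix with independent Gaussian entries inside the band
    # |j - k| <= w, rescaled so the spectrum stays of order one.
    a = rng.standard_normal((n, n))
    h = (a + a.T) / np.sqrt(2.0)
    mask = np.abs(np.subtract.outer(np.arange(n), np.arange(n))) <= w
    return np.where(mask, h, 0.0) / np.sqrt(2 * w + 1)

rng = np.random.default_rng(3)
n = 1000
for w in (5, 2 * int(np.sqrt(n))):            # well below and above W ~ sqrt(N)
    vals, vecs = np.linalg.eigh(band_matrix(n, w, rng))
    bulk = vecs[:, n // 2]                    # an eigenvector from the bulk
    ipr = np.sum(bulk ** 4)                   # ~1/N if delocalized, much larger if localized
    print(f"W = {w:4d}   IPR = {ipr:.4f}   1/N = {1.0 / n:.4f}")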

December 10, 2020, Erik Bates (UW-Madison)

Title: Empirical measures, geodesic lengths, and a variational formula in first-passage percolation

Abstract: We consider the standard first-passage percolation model on $\mathbb{Z}^d$, in which each edge is assigned an i.i.d. nonnegative weight, and the passage time between any two points is the smallest total weight of a nearest-neighbor path between them. Our primary interest is in the empirical measures of edge-weights observed along geodesics from $0$ to $n\mathbf{e}_1$. For various dense families of edge-weight distributions, we prove that these measures converge weakly to a deterministic limit as $n$ tends to infinity. The key tool is a new variational formula for the time constant. In this talk, I will derive this formula and discuss its implications for the convergence of both empirical measures and lengths of geodesics.
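
The model in the abstract is straightforward to simulate, which may help fix intuition (a sketch on a finite box with Exponential(1) edge weights, assuming numpy and networkx; the talk's variational formula is not used here): compute the geodesic from $0$ to $n\mathbf{e}_1$ with Dijkstra's algorithm and record its length and the empirical measure of edge weights along it.

import numpy as np
import networkx as nx

rng = np.random.default_rng(4)
n, pad = 60, 20                                     # endpoint distance and box padding
# Finite box around the segment from (0, 0) to (n, 0); the padding is meant to
# keep the geodesic away from the boundary in typical samples.
G = nx.grid_2d_graph(range(-pad, n + pad), range(-pad - n // 2, pad + n // 2))
for u, v in G.edges():
    G.edges[u, v]["weight"] = rng.exponential(1.0)  # i.i.d. nonnegative edge weights

path = nx.shortest_path(G, (0, 0), (n, 0), weight="weight")
geodesic_weights = [G.edges[path[i], path[i + 1]]["weight"]
                    for i in range(len(path) - 1)]

print("geodesic length (number of edges):", len(geodesic_weights))
print("passage time:", sum(geodesic_weights))
print("mean edge weight along geodesic:", np.mean(geodesic_weights),
      " vs. unconditioned mean 1.0")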


Past Seminars