SIAM Student Chapter Seminar



==Fall 2022==

{| class="wikitable"
! Date (1 PM unless otherwise noted) !! Location !! Speaker !! Title
|-
| 9/23 || Virtual and 911 Van Vleck || Thomas Anderson (University of Michigan) || A few words on potential theory in modern applied math
|-
| 9/30 (11 AM) || Virtual and 911 Van Vleck || Jeff Hammond (Principal Engineer at NVIDIA) || Industry talk
|-
| 10/7 || Virtual and 911 Van Vleck || Jie Wang (Georgia Institute of Technology) || Sinkhorn Distributionally Robust Optimization
|-
| 10/14 || Virtual and 911 Van Vleck || Matt Reuter (Stony Brook University) ||
|-
| 10/19 (Wednesday at 4 PM) || Virtual and 911 Van Vleck || Ying Li ||
|-
| 10/28 || 911 Van Vleck || Yinling Zhang (UW-Madison) ||
|-
| 11/4 || 911 Van Vleck || Haley Colgate (UW-Madison) ||
|-
| 11/11 || 911 Van Vleck || Zinan Wang (UW-Madison) ||
|-
| 11/18 || 911 Van Vleck || Parvathi Kooloth (UW-Madison) ||
|-
| 11/25 || colspan="3" | NO TALK (Thanksgiving week)
|-
| 12/2 || Virtual and 911 Van Vleck || Jenny Yeon (Applied Scientist at Amazon) ||
|}



==Abstracts==

'''9/23 Thomas Anderson:''' I'll talk a bit about potential theory as it is used today in the solution of linear PDEs via boundary integral equations and the boundary element method. These aren't only a numerical approach: I'll also say a few words about how they can be used to do analysis on problems. Then I may say a few things about volumetric potential theory: the problems there that I've been thinking about, and the application studies, in mixing for example, that they enable. Finally, I'll be happy to talk a bit about my experience so far in academia.
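For readers new to the boundary-integral viewpoint, here is a minimal sketch (not from the talk; the function name and discretization choices are illustrative) that evaluates the single-layer potential for the 2D Laplace equation on a circle, using the periodic trapezoid rule:

<syntaxhighlight lang="python">
import numpy as np

def single_layer_potential(targets, density, n=256, radius=1.0):
    """Evaluate the 2D Laplace single-layer potential
        u(x) = -(1/(2 pi)) * integral over Gamma of log|x - y| sigma(y) ds(y)
    over a circle of the given radius, at target points off the boundary,
    using the periodic trapezoid rule (spectrally accurate for smooth data)."""
    t = 2 * np.pi * np.arange(n) / n                         # parameter nodes
    ys = radius * np.stack([np.cos(t), np.sin(t)], axis=1)   # boundary points y
    ds = 2 * np.pi * radius / n                              # arc-length weights
    sigma = density(t)                                       # density sigma at the nodes
    r = np.linalg.norm(targets[:, None, :] - ys[None, :, :], axis=2)  # |x - y|
    return -(1.0 / (2 * np.pi)) * ds * (np.log(r) * sigma).sum(axis=1)

# Example: evaluate the potential of the density sigma(t) = cos(t)
# at two points inside the unit circle.
pts = np.array([[0.3, 0.0], [0.0, 0.5]])
print(single_layer_potential(pts, lambda t: np.cos(t)))
</syntaxhighlight>

For target points on the boundary itself the log kernel is weakly singular and plain trapezoid quadrature no longer suffices; handling that singularity accurately is one of the core concerns of boundary element methods.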

'''9/30 Jeff Hammond:''' Jeff Hammond is a principal engineer with NVIDIA based in Helsinki, Finland, where his focus is developing better ways to write software for numerical algorithms. From 2014 to 2021, Jeff worked for Intel in Portland, Oregon; he started in the research organization and moved to the data center business group. Prior to that he worked for Argonne National Laboratory, first as a postdoc and then as a scientist in the supercomputing facility. Jeff was a graduate student at the University of Chicago and focused on developing open-source chemistry simulation software with Karol Kowalski at Pacific Northwest National Laboratory. He majored in chemistry and mathematics at the University of Washington. Details can be found on Jeff's home page: <nowiki>https://jeffhammond.github.io/</nowiki>.

'''10/7 Jie Wang:''' We study distributionally robust optimization with the Sinkhorn distance, a variant of the Wasserstein distance based on entropic regularization. We derive convex programming dual reformulations when the nominal distribution is an empirical distribution and a general distribution, respectively. Compared with Wasserstein DRO, Sinkhorn DRO is computationally tractable for a larger class of loss functions, and its worst-case distribution is more reasonable. We propose an efficient stochastic mirror descent algorithm to solve the dual reformulation with provable convergence guarantees. Finally, we provide various numerical examples using both synthetic and real data to demonstrate its competitive performance and light computation cost.
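As background for the entropic regularization mentioned above, here is a minimal sketch (illustrative only, not code from the talk) of the classical Sinkhorn matrix-scaling iteration, which computes the entropy-regularized transport cost between two discrete distributions:

<syntaxhighlight lang="python">
import numpy as np

def sinkhorn(mu, nu, C, eps=0.1, iters=500):
    """Entropy-regularized optimal transport between discrete distributions
    mu and nu with cost matrix C:
        min_P  <P, C> - eps * H(P)   s.t.  P 1 = mu,  P^T 1 = nu,
    solved by the classical Sinkhorn matrix-scaling iteration."""
    K = np.exp(-C / eps)              # Gibbs kernel
    u = np.ones_like(mu)
    for _ in range(iters):
        v = nu / (K.T @ u)            # scale columns to match marginal nu
        u = mu / (K @ v)              # scale rows to match marginal mu
    P = u[:, None] * K * v[None, :]   # transport plan with the right marginals
    return (P * C).sum(), P           # regularized transport cost and plan

# Example: two random distributions on 5 points with squared-distance cost.
rng = np.random.default_rng(0)
mu = rng.random(5); mu /= mu.sum()
nu = rng.random(5); nu /= nu.sum()
x = np.linspace(0.0, 1.0, 5)
C = (x[:, None] - x[None, :]) ** 2
cost, plan = sinkhorn(mu, nu, C)
print(cost)
</syntaxhighlight>

The returned cost under the regularized optimal plan is the Sinkhorn distance in the sense of Cuturi (2013); smaller eps brings it closer to the Wasserstein cost but slows the convergence of the iteration.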

==Past Semesters==
*[[Spring 2022 SIAM|Spring 2022]]