Directed Reading Program Spring 2023


Revision as of 21:52, 18 April 2023

What is it? The Directed Reading Program (DRP) in the UW Madison Department of Mathematics pairs undergraduate students with graduate mentors for semester-long independent studies. During the semester, the student works through a mathematical text and meets weekly to discuss it with their mentor. The original DRP was started by graduate students at the University of Chicago over a decade ago and has had immense success. It has since spread to many other math departments that are members of the DRP Network.

Why be a student?

  • Learn about exciting math from outside the mainstream curriculum!
  • Prepare for future reading and research, including REUs!
  • Meet other students interested in math!

Why be a mentor?

  • Practice your mentorship skills!
  • It strengthens our math community!
  • Solidify your knowledge in a subject!

Current Organizers: Ivan Aidun, Allison Byars, John Cobb, John Spoerl, Karan Srivastava

Requirements

At least one hour per week spent in a mentor/mentee setting. Students spend about two hours a week on individual study outside of mentor/mentee meetings. At the end of the semester, students give a 10-12 minute presentation introducing their topic. This semester, presentations are scheduled for Wednesday, April 26th.

Applications

Check out our main page for examples of past projects.

Students: Applications are closed.

Mentors: Applications are closed.

Questions?

Contact us at drp-organizers@g-groups.wisc.edu

Projects

Spring 2023 Projects
Title Abstract Required Background
Probability and statistics: We will study the principles of Bayesian statistics. Topics may include conjugate priors, model selection, identifiability, and mixture models. Emphasis will be placed on understanding Bayesian statistics as a method for unsupervised machine learning. Students with some background in computer programming will be able to work on problems related to Markov chain Monte Carlo sampling. Well-prepared students will have the opportunity to contribute to a peer-reviewed academic journal article related to process control in semiconductor manufacturing. Required background: Some background in probability and/or statistics. Knowledge of some computer programming is preferred. This project is one where a student can work independently, but the student can also expect regular contact and assistance from the mentor.
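As a taste of the conjugate-prior idea mentioned above, here is a minimal sketch (not part of the project materials; the numbers are illustrative) of Bayesian updating with the Beta-Binomial pair, where the posterior stays in the same family as the prior.

```python
# Conjugate updating for the Beta-Binomial model: a Beta(a, b) prior on a
# coin's heads-probability, updated after observing binomial data.

def beta_binomial_update(a, b, successes, trials):
    """Return the posterior Beta parameters after binomial observations.

    Conjugacy means the posterior is again Beta:
    Beta(a + successes, b + failures).
    """
    return a + successes, b + (trials - successes)

# Start from a uniform prior Beta(1, 1) and observe 7 heads in 10 flips.
a_post, b_post = beta_binomial_update(1, 1, successes=7, trials=10)
posterior_mean = a_post / (a_post + b_post)  # (1 + 7) / (2 + 10) = 8/12
print(a_post, b_post, posterior_mean)
```

Because no integration is needed, this is the standard warm-up before Markov chain Monte Carlo, which handles models without conjugate structure.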
Stochastic Processes, Graph Theory, and Algebraic Topology: This DRP is a continuation of last semester's overview of graph neural networks. We will explore how incorporating homotopy-theoretic invariants can help improve conventional graph neural networks, and explore various applications in the medical sciences and social network analysis. Required background: Students should have some background in probability and/or statistics. Knowledge of some computer programming is preferred. This project is one where a student can work independently, but the student can also expect regular contact and assistance from the mentor.
Fractal geometry: In recent years, a surprising connection between two seemingly unrelated branches of math has emerged. Fractal dimension is a notion of size for sets which, unlike how we normally think about dimension, is not necessarily an integer (e.g. the rough shape of a coastline, the branching of a bolt of lightning, or the chaotic movement of a stock's price have non-integer dimension). Kolmogorov complexity, which measures how much information a string contains, can be applied to characterize some types of fractal dimension. This connection has already yielded impressive new results about fractal sets.

In this DRP, we will start by understanding notions of fractal dimension and the definition/properties of Kolmogorov complexity. Then, we'll bridge the gap between the two via "effective" dimension. Effective dimension gives rise to elegant characterizations of both Hausdorff and packing dimension via the recently proven point-to-set principle, which we will study. Finally, we'll see some applications of this alternative definition, which makes it one of the most powerful new tools in fractal geometry.

Required background: Students should have completed at minimum Math 521 or a comparable course, but the more experience with proof-based math you have, the better. This material is very self-contained, but there are a number of definitions and challenging concepts. We'll go through everything at a reasonable pace, but some level of mathematical maturity will be helpful.
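To make "non-integer dimension" concrete, here is a small stdlib-only sketch (not from the project's text) that numerically estimates the box-counting dimension of the middle-thirds Cantor set, whose Hausdorff and box-counting dimension are both log 2 / log 3 ≈ 0.6309.

```python
import math

def cantor_starts(depth):
    """Left endpoints of the depth-th Cantor set approximation,
    as integers in units of 3**-depth (exact arithmetic, no rounding)."""
    starts = [0]
    for _ in range(depth):
        # Each interval splits into its left and right thirds.
        starts = [3 * s for s in starts] + [3 * s + 2 for s in starts]
    return starts

def box_count_estimate(depth):
    """Estimate the dimension via N(eps) ~ eps^(-dim), eps = 3**-depth."""
    # Each surviving length-eps interval occupies exactly one eps-box,
    # so the box count is the number of distinct start positions.
    n_boxes = len(set(cantor_starts(depth)))
    return math.log(n_boxes) / math.log(3 ** depth)

print(box_count_estimate(10))      # ≈ 0.6309
print(math.log(2) / math.log(3))   # the true value, log 2 / log 3
```

The Kolmogorov-complexity characterization studied in this project recovers the same number by an entirely different route: measuring how hard points of the set are to describe.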
Number Theory: The primary source I plan to work through is P-adic Numbers by Fernando Q. Gouvêa. One goal of this project is to understand the local-global principle, and to use the Hasse-Minkowski theorem to classify rational solutions to various classical equations. Required background: A semester of algebra. A semester of either analysis or topology will also be helpful, but is not strictly necessary, in order to understand metric spaces.
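The p-adic absolute value that Gouvêa's book builds on can be computed in a few lines. This is a hedged illustration (the examples are my own, not from the book): |x|_p = p^(-v_p(x)), where v_p counts the net factors of p in a rational number.

```python
from fractions import Fraction

def v_p(x, p):
    """p-adic valuation of a nonzero rational x: net power of p dividing x."""
    x = Fraction(x)
    v = 0
    num, den = x.numerator, x.denominator
    while num % p == 0:
        num //= p
        v += 1
    while den % p == 0:
        den //= p
        v -= 1
    return v

def abs_p(x, p):
    """p-adic absolute value: large powers of p are p-adically SMALL."""
    if Fraction(x) == 0:
        return 0.0
    return float(p) ** (-v_p(x, p))

print(v_p(Fraction(9, 4), 3))    # 2, since 9 = 3^2 and 4 has no factor of 3
print(abs_p(250, 5))             # 5**-3, since 250 = 2 * 5^3
print(abs_p(Fraction(1, 5), 5))  # 5.0: 1/5 is 5-adically large
```

Completing the rationals with respect to |·|_p instead of the usual absolute value is what produces the field of p-adic numbers.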
An introduction to matroids: A matroid is a mathematical object that generalizes the notion of linear independence from linear algebra. However, its applications are much broader, including graph theory and many problems in combinatorics. In this DRP, we will read Welsh's book Matroid Theory. The specific direction we take from there will be up to the students' interests. Required background: No background is truly required, but some familiarity with graphs, linear algebra, basic set theory, etc. would be useful.
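The motivating example of a matroid can be sketched directly: take a finite set of vectors and declare a subset "independent" when it is linearly independent. The matrix and labels below are made up for illustration and are not from Welsh's book.

```python
from fractions import Fraction
from itertools import combinations

def rank(vecs):
    """Rank of a list of vectors over the rationals, by Gaussian elimination."""
    rows = [list(map(Fraction, v)) for v in vecs]
    r = 0
    for col in range(len(rows[0]) if rows else 0):
        pivot = next((i for i in range(r, len(rows)) if rows[i][col] != 0), None)
        if pivot is None:
            continue
        rows[r], rows[pivot] = rows[pivot], rows[r]
        for i in range(len(rows)):
            if i != r and rows[i][col] != 0:
                f = rows[i][col] / rows[r][col]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

# Ground set {a, b, c, d} with vectors in Q^2; note c = a + b and d = 2a.
vectors = {"a": (1, 0), "b": (0, 1), "c": (1, 1), "d": (2, 0)}

def independent(subset):
    vecs = [vectors[e] for e in subset]
    return rank(vecs) == len(vecs)

bases = [set(s) for s in combinations(vectors, 2) if independent(s)]
print(bases)  # every 2-element subset except {a, d} is a basis
```

The matroid axioms abstract exactly the properties this independence notion satisfies, which is why the same theory also covers spanning trees of graphs.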
Graph Theory: The goal of this DRP is to provide an introduction to hypergraph theory. There are four main topics that we hope to cover:

1. Basic examples of hypergraphs and graph coloring
2. Extremal graph theory (e.g., Ramsey's theorem, Hales-Jewett)
3. The probabilistic method
4. Linear algebraic methods

We will only cover each topic at a surface level, but that is still sufficient to see some very strong and interesting results.

Required background: Some prior exposure to proofs is the only hard requirement. Some very basic notions in discrete probability and linear algebra will be used, but we can cover them if needed. A prior course in graph theory or discrete math might help you better appreciate some of the generalizations to hypergraphs.
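The probabilistic method in topic 3 can be illustrated by its classic first application, Erdős's lower bound for Ramsey numbers. The sketch below (my own illustration, not a project requirement) computes which n the expectation argument certifies for k = 5.

```python
import math

def ramsey_lower_bound_holds(n, k):
    """True if the random-coloring argument proves R(k, k) > n.

    Color each edge of K_n red or blue independently. The expected number
    of monochromatic k-cliques is C(n, k) * 2^(1 - C(k, 2)); if that
    expectation is below 1, some coloring has no monochromatic k-clique,
    so R(k, k) > n.
    """
    expected = math.comb(n, k) * 2.0 ** (1 - math.comb(k, 2))
    return expected < 1

# Largest n this argument certifies for k = 5:
best = max(n for n in range(5, 200) if ramsey_lower_bound_holds(n, 5))
print(best)  # 11
```

The striking feature, typical of the method, is that it proves a good coloring exists without ever constructing one.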
Set Theory / Logic: Let’s take an excursion into set theory! Set theory was originally developed as a way to formalize mathematical reasoning about infinite objects, but today it’s a full mathematical field with its own fascinating questions and results. We’ll certainly learn about ordinals and cardinals, and we’ll explore more topics depending on your interests and background! Required background: A completed 500-level math course. Preference for those who have taken Math 570 (Set Theory).
Algebra: In this project, we will use the methods of abstract algebra, following Cox, Little, and O'Shea's book Ideals, Varieties, and Algorithms, to generalize many familiar facts about polynomials to several variables. We will also go beyond the text of the book and consider some unexpected approaches, as we learn about monomial orders, various types of ideals, and perhaps (optionally) some coding. Many types of mathematics will converge here. Required background: Students should have completed a semester of algebra (541 or equivalent). Taking a second semester concurrently is a good idea, but not required.
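Monomial orders, the first tool the project mentions, are easy to experiment with. In this small stdlib-only sketch (the example monomials are made up, not from the book), a monomial x^a y^b z^c is represented by its exponent tuple (a, b, c).

```python
def lex_key(exp):
    """Lexicographic order: compare exponents left to right."""
    return exp

def grlex_key(exp):
    """Graded lex order: total degree first, then lex to break ties."""
    return (sum(exp), exp)

# x*y^2, x^3, z^4, x*y*z
monomials = [(1, 2, 0), (3, 0, 0), (0, 0, 4), (1, 1, 1)]

lex_sorted = sorted(monomials, key=lex_key, reverse=True)
grlex_sorted = sorted(monomials, key=grlex_key, reverse=True)

print(lex_sorted)    # leading monomial is x^3 under lex
print(grlex_sorted)  # leading monomial is z^4 under grlex (degree 4)
```

The choice of order changes which term of a polynomial is "leading," which in turn drives the multivariable division algorithm and Gröbner basis computations in the book.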
Graph Neural Networks: Graph neural networks (GNNs) are neural networks designed to extract information from data in the form of a graph. The basic idea is that these are trainable networks that can learn from the various attributes of a graph (such as node degree, number of edges, etc.) and perform tasks at the node level (local) or graph level (global). It is an increasingly interesting area in machine learning, and the aim of this DRP is to learn the basic theory of GNNs, implement some fundamental GNNs, and, time permitting, come up with our own project implementing a GNN based on your interests. Required background: Comfort with linear algebra, some optimization experience, and some coding experience in Python are important. You don't need much (or any) graph theory: we can cover the basics, which will be more than enough. Bonus if you've done some ML in the past. You'll be very independent on this project and will do a lot of exploring with your group members.
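The core message-passing idea behind GNNs can be sketched without any ML framework: each node updates its feature by combining its own feature with an aggregate of its neighbors'. The weights and toy graph below are made up for illustration; real GNNs learn the weights by gradient descent.

```python
def message_passing_step(features, adjacency, w_self=0.5, w_neigh=0.5):
    """One round of mean-aggregation message passing on scalar features."""
    updated = {}
    for node, h in features.items():
        neighbors = adjacency[node]
        agg = sum(features[n] for n in neighbors) / len(neighbors) if neighbors else 0.0
        updated[node] = max(0.0, w_self * h + w_neigh * agg)  # ReLU nonlinearity
    return updated

# A path graph 0 - 1 - 2, with all the "signal" starting at node 0.
adjacency = {0: [1], 1: [0, 2], 2: [1]}
features = {0: 1.0, 1: 0.0, 2: 0.0}

features = message_passing_step(features, adjacency)
print(features)  # node 1 has picked up signal from its neighbor, node 0
```

Stacking several such rounds lets information travel further across the graph, which is what allows node-level (local) and graph-level (global) predictions.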

Presentation Schedule

Time Speakers Title
3:30-3:45 Tianze Huang P-adic Numbers and (Maybe) Hensel's Lemma
3:45-4:15 Jack Westbrook & Yixuan Hu Constructing Gödel’s Constructible Universe
4:15-4:30 S. Sudhir Inferring Tester Error Characteristics
4:30-4:45 Jack Maloney A Characterization of Hausdorff Dimension via Kolmogorov Complexity
4:45-5:00 Jimmy Vineyard Community Detection with Line Graph Neural Networks
5:00-5:30 Matt & PJ Matroid Presentation
5:30-5:45 Ruixuan Tu Representation power of GNN