**Day 1**

| Time | Event |
| --- | --- |
| 8:30 | Registration |
| 9:00 | Connelly |
| 10:00 | Coffee break |
| 10:30 | Glotzer |
| 11:30 | Vallentin |
| 12:30 | Lunch break |
| 2:30 | Mœglin |
| 3:30 | Coffee break |
| 4:00 | Kim |

**Day 2**

| Time | Event |
| --- | --- |
| 9:00 | Streinu |
| 10:00 | Coffee break |
| 10:30 | Bezdek |
| 11:30 | Cohn/Miller |
| 12:30 | Lunch break |
| 2:30 | Loeser |
| 3:30 | Coffee break |
| 4:00 | Yin |

**Day 3**

| Time | Event |
| --- | --- |
| 9:00 | Arthur |
| 10:00 | Coffee break |
| 10:30 | Casselman |
| 11:30 | Goresky |

**Day 4**

| Time | Event |
| --- | --- |
| 9:00 | Harrison |
| 10:00 | Coffee break |
| 10:30 | Zhan |
| 11:30 | Appel |
| 12:30 | Lunch break |
| 2:30 | Mahboubi |
| 3:30 | Coffee break |
| 4:00 | Urban |
| 6:00 | Banquet |

**Day 5**

| Time | Event |
| --- | --- |
| 9:00 | Nipkow |
| 10:00 | Coffee break |
| 10:30 | Lewis |
| 11:30 | Adams |
| 12:30 | Lunch break |
| 2:30 | Trường |
| 3:30 | Coffee break |
| 4:00 | Gonthier |
| 5:00 | Closing |

**Mark Adams: Flyspecking Flyspeck**

To remove any doubt about the correctness of his proof of the Kepler Conjecture, Hales instigated the Flyspeck project in 2003. This used human-directed software to "formalise" the proof, i.e. break it down into tiny primitive inference steps in formal logic and automatically check that these steps are correct. Twenty person-years and many billions of primitive inference steps later, in 2014, the project was complete. However, sceptics might argue that the formalisation itself may be flawed. In this talk, I shall cover the conceivable ways in which flaws may have crept in. I shall also talk about my own project to flyspeck Flyspeck by subjecting it to a complete and highly rigorous audit.

**Andrew Appel: Graph coloring and machine proofs in computer science, 1977-2017**

The Four Color Theorem of Kenneth Appel and Wolfgang Haken (1976) was proved and checked with the assistance of computer programs, though much of the proof was written (and refereed) only by humans. Contemporaneously, Edinburgh LCF (Logic for Computable Functions) was developed by Robin Milner -- a system for proofs written by humans (with computer assistance) but completely checked by computer; with particular application to proofs about computer programs. These two developments, and their convergence, have had significant impact on computer science, and my own research career: graph-coloring algorithms for register allocation in compilers, functional programming languages, fully machine-checked proofs of mathematical theorems, fully machine-checked proofs of software systems. One result at the intersection of all these is a machine-checked proof of correctness of a program that does register allocation by graph-coloring, using an algorithm related to one used in every four-color proof (and attempted proof) since 1879.
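The "algorithm related to one used in every four-color proof (and attempted proof) since 1879" refers to Kempe's observation: a vertex with fewer than k neighbors can be set aside, the rest of the graph colored first, and the vertex colored last. A minimal sketch of the resulting simplify/select heuristic (my own toy illustration, not Appel's verified allocator) might look like:

```python
# Toy sketch of Kempe-style "simplify/select" graph coloring, the heuristic
# behind register allocation by graph coloring. Illustrative only.

def kempe_color(adj, k):
    """Try to k-color a graph given as {node: set_of_neighbors}.

    Returns a {node: color} dict on success, or None when the simplify
    phase stalls (where a real register allocator would spill).
    """
    adj = {v: set(ns) for v, ns in adj.items()}
    stack = []
    work = dict(adj)
    # Simplify: repeatedly remove some node with fewer than k remaining neighbors.
    while work:
        candidates = [v for v in work if len(adj[v] & work.keys()) < k]
        if not candidates:
            return None  # no low-degree node left; would need to spill
        v = candidates[0]
        stack.append(v)
        del work[v]
    # Select: pop nodes and give each the lowest color unused by its
    # already-colored neighbors (fewer than k of them, by construction).
    colors = {}
    while stack:
        v = stack.pop()
        used = {colors[n] for n in adj[v] if n in colors}
        colors[v] = min(c for c in range(k) if c not in used)
    return colors
```

On a 4-cycle with k = 3 every node has degree 2 < 3, so the simplify phase succeeds and a proper coloring comes back; on the complete graph K4 with k = 3 simplify stalls immediately and the sketch returns None.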

**James Arthur: Orbital integrals and zeta functions**

We shall review the work of Z. Yun on a relation between p-adic orbital integrals and certain zeta functions. We shall also recall the analysis by A. Altug of elliptic terms in the trace formula for GL(2). We shall then study the problem of Poisson summation for general linear groups. A suitable solution would be an important step in Langlands' proposed reconfiguration of the trace formula, known as Beyond Endoscopy, which is aimed ultimately at the Principle of Functoriality.

**Károly Bezdek: On totally separable translative packings**

This is a survey talk on volumetric inequalities and contact graphs of totally separable translative packings of convex bodies in Euclidean spaces.

**William Casselman: Computing in the exceptional groups**

The classical groups are convenient to work with because they have simple matrix realizations. There are several programs (such as MAGMA) that also allow computation in the exceptional groups, but the underlying machinery (which reduces to computations in the Lie algebra) is somewhat clumsy. I'll explain how to combine an elegant idea of Kottwitz and some old results of Tits to make things much less clumsy. With luck, I'll also demonstrate an application to the reduction theory of arithmetic groups.

**Robert Connelly: Circle Packings, Triangulations, and Rigidity**

Suppose you have a finite collection of circular disks of various sizes, and you want to fit them in a container without overlap so they occupy the largest fraction of the area possible. Think of the container as a torus, which is just the plane wrapped up by two independent translations. An example is the following, where there are three sizes of disks in ratio 1:2:3. Notice that the *graph of the packing* (obtained by joining centers of touching disks) is a triangulation in the plane. When this property holds, often the packing has a high density.

The packing fraction or density of this periodic packing is $7\pi/24$. Are there any other triangulated periodic packings whose density is a rational multiple of $\pi$?

Problems of this sort have connections to rigidity theory (my day job), granular materials, approximate conformal (angle preserving) mappings given by packings, number theory, and salt, all explained. We extend a program of László Fejes Tóth, the great Hungarian geometer, with a conjecture about the most dense packings for some classes of circle packings in a torus.
Another aspect of these ideas has to do with jammed packings of disks in a fixed container, where the radii ratios of the disks are essentially chosen randomly, as you might expect with some granular materials. Given the number of disks, rigidity theory tells us that there has to be a minimum number of contacts among the disks. It has been a long-standing conjecture (the *isostatic conjecture*), taken as obvious by many scientists, that for generic radii you cannot have more than that minimum number of contacts. The analytic theory of packings, as used by Koebe, Andreev, Thurston, and Ren Guo, gives us a way of actually proving that the isostatic conjecture is correct in a reasonable circumstance.

This is work with Steven Gortler, Evan Solomonides, and Maria Yampolskaya.

**Sharon Glotzer: How shapes pack, and the bizarre story of the tetrahedron**

**Georges Gonthier: Writing, crafting, and engineering formalized mathematics**

The computer formalization of a mathematical proof is an odd hybrid. On the one hand it is program code instructing the machine to go through the myriad elementary logical steps comprising the proof, whether insightful or trivial. As such it is a technical artefact engineered for performance, robustness and maintainability. On the other hand we would also like a formal proof to be a written document that retains the clarity of its source material, and ideally provides additional insight into the workings of the demonstration. While these objectives are often at odds with the hackish nature of computer code, they can be reconciled by organizing the formalization so that important steps appear as they would on paper, and coding is used to interpret and implement these with basic logical constructs.

One may even have to craft an intermediate level of argument explaining how the latter is carried out in the "little math" of pencil-and-paper combinatorial objects, sometimes leading to a fresh mathematical presentation of the background theories, as I've learned from my experience with the Four Color Theorem and the Odd Order Theorem.

**Mark Goresky: Points mod p of hyperbolic three-manifolds**

Fifty years from now there will be a very good theory of the reduction modulo p of hyperbolic three-manifolds (among other things). The beginnings of this theory are emerging now. This is a general talk addressed to a wide mathematical audience.

**John Harrison: A HOL Light formalization of singular homology theory**

There are some mathematical theories that act as powerful "big guns" for getting interesting results quickly, but at the cost of lengthy and burdensome technical constructions to set up the machinery in the first place. Homology and cohomology theories are classic examples, and I will describe a formalization in the HOL Light proof assistant of singular homology theory with integer coefficients. This builds from the basic constructions defined on singular chains to derive the Eilenberg-Steenrod-Milnor "axioms", and the theory is then used for some classic applications including the generalized Jordan-Brouwer separation theorem and the Borsuk-Ulam theorem. Interactive theorem provers, not just HOL Light but also Mizar, Isabelle/HOL, Coq and others, have been used to formalize a significant amount of point set topology and homotopy theory. But as far as I know, this is the first foundational construction of a homology theory in a machine-checked setting. I will present some highlights of the formalization, including the auxiliary work on group theory that it uses, and reflect on what I have learned from it. I will not assume any prior knowledge of homology theory, and indeed didn't have very much myself before starting this formalization effort.
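For readers unfamiliar with the axioms mentioned, two representative Eilenberg-Steenrod axioms (stated here in their standard textbook form, not drawn from the formalization itself) are the dimension axiom and the long exact sequence of a pair:

```latex
% Dimension axiom, with integer coefficients:
\[
  H_n(\mathrm{pt}) \;\cong\;
  \begin{cases}
    \mathbb{Z} & n = 0,\\
    0          & n \neq 0.
  \end{cases}
\]
% Exactness: every pair (X, A) yields a long exact sequence
\[
  \cdots \longrightarrow H_n(A) \longrightarrow H_n(X) \longrightarrow
  H_n(X, A) \stackrel{\partial}{\longrightarrow} H_{n-1}(A)
  \longrightarrow \cdots
\]
```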

**Ju-Lee Kim: Jordan decompositions of cocenters of reductive $p$-adic groups**

Cocenters of Hecke algebras play an important role in studying mod $\ell$ or $\mathbb{C}$-harmonic analysis on connected $p$-adic reductive groups. On the other hand, the depth $r$ Hecke algebra is well suited to study depth $r$ smooth representations. In this talk, we study depth $r$ rigid cocenters of a connected reductive $p$-adic group over rings of characteristic zero or $\ell\neq p$. More precisely, under some mild hypotheses, we establish a Jordan decomposition of the depth $r$ rigid cocenter and hence find an explicit basis of depth $r$ rigid cocenters. This is joint work with Xuhua He.

**Robert Lewis: A heuristic method for formally verifying real inequalities**

We describe a general method for verifying inequalities between real-valued expressions, especially the kinds of straightforward inferences that arise in interactive theorem proving. In contrast to approaches that aim to be complete with respect to a particular language or class of formulas, our method is heuristic and establishes claims that require heterogeneous forms of reasoning. It is also able to create proof "sketches" or "explanations" that can be inspected by the user. Our method is implemented in the powerful metaprogramming framework of the Lean proof assistant; we will discuss how this framework is used to make our tool extensible.
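As a point of contrast with the heterogeneous heuristic reasoning described above, the simplest form of machine-verified bounding is plain interval arithmetic. The sketch below is a generic illustration of sound-but-incomplete inequality checking, unrelated to the Lean implementation from the talk:

```python
# Generic interval-arithmetic bound propagation: the most naive way to
# machine-check an inequality over ranges of real values. This is NOT the
# Lean tool described in the talk, just a self-contained toy.

class Interval:
    """A closed interval [lo, hi] with outward-propagating arithmetic."""

    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # Sum of two intervals: add endpoints componentwise.
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # Product: the extremes occur at endpoint products.
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(min(ps), max(ps))

    def __le__(self, bound):
        # Sound but incomplete: True means the inequality certainly holds
        # for every value in the interval; False means "could not verify".
        return self.hi <= bound

x, y = Interval(2, 3), Interval(1, 2)
print((x + y) <= 5)   # x + y <= 5 holds for all x in [2,3], y in [1,2]
print((x * y) <= 6)   # x * y <= 6 holds likewise
```

The `__le__` check illustrates the soundness/completeness trade-off the abstract alludes to: a `False` answer does not mean the inequality fails, only that this naive method cannot establish it.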

**François Loeser: The geometric nature of $p$-adic integrals**

This will be a gentle introduction to some of the progress that has been made during the last two decades in understanding the geometric nature of $p$-adic integrals, with a special eye for those appearing in the statement of the Fundamental Lemma.

**Assia Mahboubi: Verified computer-aided mathematics**

Proof assistants are pieces of software for representing mathematical definitions, statements, algorithms and proofs in a digital form, suitable for computer processing. The first proof assistants were written in the 60s, at about the same time as the first computer algebra systems. The latter have since become quite popular experimentation tools in mathematics, and have even entered the classroom. By contrast, the former have mostly been used by computer scientists, motivated by program verification. Despite recent landmark successes, both in the verification of sophisticated programs and in the formalization of complex and modern mathematical theories, the mere existence of proof assistants probably remains largely unknown to a majority of researchers in mathematics. This talk proposes to illustrate and discuss the assistance proof assistants may provide for mathematically inclined users.

**Henry Cohn/Stephen Miller: Sphere packing and the Universal Optimality Theorem in dimensions 8 and 24**

I'll describe recent joint work with Henry Cohn, Abhinav Kumar, Danylo Radchenko, and Maryna Viazovska concerning the problem of how to arrange points in Euclidean space so as to minimize repulsion between them. We show that the $E_8$ root lattice in $\mathbb{R}^8$ and the Leech lattice in $\mathbb{R}^{24}$ minimize energy (among point configurations of the same density) for any summable, completely monotonic potential function of the distance-squared. The sphere packing problem in those dimensions is then a consequence (by taking a potential concentrated more and more towards the origin). I'll describe other applications to lattices and automorphic forms.
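For concreteness, the standard definition of complete monotonicity (not spelled out in the abstract) and two familiar families of admissible potentials can be written as:

```latex
% f is completely monotonic on (0,\infty) when all derivatives alternate in sign:
\[
  (-1)^k f^{(k)}(x) \;\ge\; 0
  \qquad \text{for all } x > 0 \text{ and } k = 0, 1, 2, \dots
\]
% Potentials of distance-squared of this kind include inverse power laws
% and Gaussians:
\[
  p(r) = r^{-2s} = f(r^2) \ \text{with}\ f(x) = x^{-s} \ (s > 0),
  \qquad
  p(r) = e^{-c r^2} = f(r^2) \ \text{with}\ f(x) = e^{-cx} \ (c > 0).
\]
```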

**Colette Mœglin: Local components of square integrable automorphic forms**

In this talk, I would like to give a general picture of the local components of square integrable automorphic forms, focusing on Arthur's approach to them.

**Tobias Nipkow: Verified Analysis of Algorithms**

This talk will survey recent progress in the verification of basic algorithms from standard textbooks such as CLRS. Both functional correctness and computational complexity are considered and the formalization of the necessary mathematical basis for the latter is discussed.

**Ileana Streinu: Kinematics of periodic framework structures**

Periodic frameworks appear in various contexts, from contact graphs of lattice sphere packings to atom-and-bond connections in crystals. We survey a geometric deformation theory which can be used for understanding phenomena investigated in materials science, for example auxetic behavior or negative thermal expansion. Auxetic behavior refers to lateral widening upon stretching. The geometric approach leads to rigorous and comprehensive answers to questions regarding structure and design. This is joint work with Ciprian Borcea.

**Hoàng Lê Trường: Computer aided proofs in discrete geometry and algebraic geometry**

We illustrate the use of proof assistants for verifying complicated proofs, such as that of the Kepler conjecture in discrete geometry, and theorems in algebraic geometry. In particular, formal proof technology can help us understand existence statements in discrete geometry and algebraic geometry.

**Josef Urban: Kepler and Hales: conjectures and proofs, dreams and their realization**

This could in principle be a talk about our work on automation and AI for reasoning and formalization, and Tom's great role in these areas. But the motivations and allusions go back to the alchemistic Prague of the 1600s and the (un)scientific pursuits of the then versions of the "singularity", provoking comparisons with our funny attempts today at building AI and reasoning systems for solving the great questions of Math, the Universe and Everything. I wonder where this will all take us.

**Frank Vallentin: Coloring the Voronoi tessellation of lattices**

In this talk I will introduce the chromatic number of a lattice: It is the least number of colors one needs to color the interiors of the cells of the Voronoi tessellation of a lattice so that no two cells sharing a facet are of the same color. I will introduce two lower bounds for the chromatic number: the sphere packing lower bound and the spectral lower bound. Using them I will show how to compute the chromatic number of several important lattices.
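As a toy illustration of the definition (my own example, not one from the talk): for the integer lattice $\mathbb{Z}^2$ the Voronoi cells are unit squares, two cells share a facet exactly when their centers differ by a unit vector, and coloring by coordinate-sum parity is a proper 2-coloring, so the chromatic number of $\mathbb{Z}^2$ is 2.

```python
# Checkerboard coloring of the Voronoi tessellation of Z^2.
# Each Voronoi cell is the unit square centered at an integer point;
# facet-neighbors are cells whose centers differ by +/- a unit vector.

def color(cell):
    """Color a Voronoi cell of Z^2 by the parity of its center's coordinate sum."""
    x, y = cell
    return (x + y) % 2

# Finite patch of lattice points; each unordered facet-pair appears once.
patch = {(x, y) for x in range(-5, 6) for y in range(-5, 6)}
facet_pairs = [(c, (c[0] + dx, c[1] + dy))
               for c in patch
               for dx, dy in ((1, 0), (0, 1))
               if (c[0] + dx, c[1] + dy) in patch]

# Every facet-adjacent pair gets distinct colors: a proper 2-coloring.
assert all(color(a) != color(b) for a, b in facet_pairs)
print("parity gives a proper 2-coloring on an 11x11 patch of Z^2")
```

The interesting content of the talk is of course the opposite direction: for lattices like $E_8$ or the Leech lattice, two colors are far from enough, and the sphere packing and spectral bounds certify how many are needed.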

**Yimu Yin: Motivic integration and the nonarchimedean Milnor fiber**

Extending the work of Hrushovski and Loeser, I will describe how an object called "the nonarchimedean Milnor fiber" embodies a broader concept of the Milnor fiber. This object may be considered, by way of model theory, as an infinitesimal limit of the usual topological Milnor fiber, and it gives rise to the motivic Milnor fiber by way of the Hrushovski-Kazhdan integral. This uniform narrative affords new insights into how the various invariants and constructions associated with the Milnor fiber are related. This is joint work with Goulwen Fichou.

**Bohua Zhan: Proof automation in set theory**

In the formalization of mathematics, the choice of logical foundation is an important decision, which can affect how mathematical concepts are expressed throughout a formalization project. Set theory based on the ZFC axioms has the advantage that it is close to what mathematicians work with. However, set theory is also considered to be difficult to use in practice. In this talk, I will show how many of the difficulties can be addressed with an appropriate setup of proof automation. These ideas are implemented in the auto2 prover in Isabelle, and have been used in a formalization of the definition of the fundamental group based on set theory.