Title: Compact Quantum Groups and their Semidirect Products

by Sutanu Roy (National Institute of Science Education and Research) as part of Topological Groups

Lecture held in Elysium.

Abstract

Compact quantum groups are noncommutative analogs of compact groups in the realm of noncommutative geometry, introduced by S. L. Woronowicz back in the 80s. Roughly, they are unital C*-bialgebras in the monoidal category (given by the minimal tensor product) of unital C*-algebras with some additional properties. For real $0<|q|<1$, the q-deformations of the SU(2) group are the first and best-studied examples of compact quantum groups. These examples were constructed independently by Vaksman-Soibelman and Woronowicz, also back in the 80s. In fact, they are examples of a particular class of compact quantum groups, namely compact matrix pseudogroups. The primary goal of this talk is to motivate and discuss some of the interesting aspects of this theory from the perspective of compact groups. In the second part, I shall briefly discuss the semidirect product construction for compact quantum groups via an explicit example. The second part is based on joint work with Paweł Kasprzak, Ralf Meyer and Stanisław Lech Woronowicz.

Title: Effective Dimension and the Intersection of Random Closed Sets

by Christopher Porter (Drake University) as part of Computability theory and applications

Abstract

The connection between the effective dimension of sequences and membership in algorithmically random closed subsets of Cantor space was first identified by Diamondstone and Kjos-Hanssen. In this talk, I highlight joint work with Adam Case in which we extend Diamondstone and Kjos-Hanssen’s result by identifying a relationship between the effective dimension of a sequence and what we refer to as the degree of intersectability of certain families of random closed sets (also drawing on work by Cenzer and Weber on the intersections of random closed sets). As we show, (1) the number of relatively random closed sets that can have a non-empty intersection varies depending on the choice of underlying probability measure on the space of closed subsets of Cantor space—this number being the degree of intersectability of a given family of random closed sets—and (2) the effective dimension of a sequence X is inversely proportional to the minimum degree of intersectability of a family of random closed sets, at least one of which contains X as a member. Put more simply, a sequence of lower dimension can only belong to random closed sets with more branching, which are thus more intersectable, whereas sequences of higher dimension can belong to random closed sets with less branching, which are thus less intersectable. The relationship between these two quantities (that is, effective dimension and degree of intersectability) can be given explicitly.

Title: The computable strength of Milliken’s Tree Theorem and applications

by Paul-Elliot Angles d’Auriac (University of Lyon) as part of Computability theory and applications

Abstract

Devlin’s theorem and the Rado graph theorem are both variants of Ramsey’s theorem, where a structure is added but more colors are allowed: Devlin’s theorem (respectively, the Rado graph theorem) states that if S is ℚ (respectively G, the Rado graph), then for any tuple size n, there exists a number of colors l such that for any coloring of [S]^n into finitely many colors, there exists a subcopy of S on which the coloring takes at most l colors. Moreover, given n, the optimal l is specified.

The key combinatorial theorem used in both the proof of Devlin’s theorem and the Rado graph theorem is Milliken’s tree theorem. Milliken’s tree theorem is also a variant of Ramsey’s theorem, but this time for trees and strong subtrees: it states that given a coloring of the strong subtrees of height n of a tree T, there exists a strong subtree of height ω of T on which the coloring is constant.
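
In symbols, one standard way to state the finite-color version (the abstract fixes no notation, so the formulation below is one common convention) is the following, where $S_n(T)$ denotes the set of strong subtrees of $T$ of height $n$:

```latex
% Milliken's tree theorem (one standard formulation), for a suitable tree T
% (e.g., finitely branching, of height omega, without leaves):
\text{For every coloring } c : S_n(T) \to \{1,\dots,k\}
\text{ there exists } S \in S_\omega(T)
\text{ such that } c \text{ is constant on } S_n(S).
```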

In this talk, we review the links between those theorems, and present recent results on the computable strength of Milliken’s tree theorem and its applications to Devlin’s theorem and the Rado graph theorem, obtained with Cholak, Dzhafarov, Monin and Patey.

Title: Towards a unifying approach to algebraic and coarse entropy

by Nicolò Zava (University of Udine) as part of Topological Groups

Lecture held in Elysium.

Abstract

In each situation, entropy associates to a self-morphism a value that estimates the chaos created by iterating the map. In particular, the algebraic entropy $h_{alg}$ can be computed for (continuous) endomorphisms of (topological) groups, while the coarse entropy $h_c$ is associated to bornologous self-maps of locally finite coarse spaces. These two notions of entropy can be compared because of the following observation: if $f$ is a (continuous) homomorphism of a (topological) group $G$, then $f$ automatically becomes bornologous provided that $G$ is equipped with the compact-group coarse structure. For an endomorphism $f$ of a discrete group, $h_{alg}(f)=h_c(f)$ if $f$ is surjective, while, in general, $h_{alg}(f)\neq h_c(f)$. That difference occurs because in many cases, if $f$ is not surjective, then $h_c(f)=0$.

In the first part of the talk, after briefly recalling the large-scale geometry of topological groups, we define the coarse entropy and discuss its relationship with the algebraic entropy. The second part is dedicated to the introduction of the algebraic entropy of endomorphisms of $G$-sets (i.e., sets endowed with group actions). We show that it extends the usual algebraic entropy of group endomorphisms, and we provide evidence that it can represent a useful modification and generalisation of the coarse entropy that overcomes the non-surjectivity issue.
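
For orientation, one common formulation of the algebraic entropy (stated here for an endomorphism of an abelian group; the abstract itself does not fix conventions, and the talk's definitions may differ in detail) is:

```latex
% For a finite subset F of G, the n-th f-trajectory of F is
T_n(f,F) = F + f(F) + \cdots + f^{n-1}(F),
% and the algebraic entropy of f is
h_{alg}(f) = \sup_{F \subseteq G,\ F \text{ finite}}
             \lim_{n\to\infty} \frac{1}{n}\log \lvert T_n(f,F)\rvert .
```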

Title: Topological Groups Seminar Two-Week Hiatus

by Break (University of Hawaiʻi) as part of Topological Groups

Lecture held in Elysium.

Abstract: TBA

Title: Reverse mathematics of combinatorial principles over a weak base theory

by Leszek Kołodziejczyk (University of Warsaw) as part of Computability theory and applications

Abstract

Reverse mathematics studies the strength of axioms needed to prove various mathematical theorems. Often, the theorems have the form $\forall X\, \exists Y\, \psi(X,Y)$ with $X, Y$ denoting subsets of $\mathbb{N}$ and $\psi$ arithmetical, and the logical strength required to prove them is closely related to the difficulty of computing $Y$ given $X$. In the early decades of reverse mathematics, most of the theorems studied turned out to be equivalent, over a relatively weak base theory, to one of just a few typical axioms, which are themselves linearly ordered in terms of strength. More recently, however, many statements from combinatorics, especially Ramsey theory, have been shown to be pairwise inequivalent or even logically incomparable.

The usual base theory used in reverse mathematics is $\mathrm{RCA}_0$, which is intended to correspond roughly to the idea of “computable mathematics”. The main two axioms of $\mathrm{RCA}_0$ are: comprehension for computable properties of natural numbers and mathematical induction for c.e. properties. A weaker theory in which induction for c.e. properties is replaced by induction for computable properties has also been introduced, but it has received much less attention. In the reverse mathematics literature, this weaker theory is known as $\mathrm{RCA}^*_0$.

In this talk, I will discuss some results concerning the reverse mathematics of combinatorial principles over $\mathrm{RCA}^*_0$. We will focus mostly on Ramsey’s theorem and some of its well-known special cases: the chain-antichain principle CAC, the ascending-descending chain principle ADS, and the cohesiveness principle COH.

The results I will talk about are part of a larger project joint with Marta Fiori Carones, Katarzyna Kowalik, Tin Lok Wong, and Keita Yokoyama.

Title: Non-arithmetic algebraic constructions

by Chris Conidis (CUNY-College of Staten Island) as part of Computability theory and applications

Abstract

We examine two radical constructions, one from ring theory and another from module theory, and produce a computable ring for each construction where the corresponding radical is $\Pi^1_1$-complete.

Join the Hawai‘i Data Science Institute for another Data Science Friday seminar titled “Bayesian Topological Learning for Complex Data Analysis,” presented by Assistant Professor of Mathematics Dr. Farzana Nasrin, on October 16, 2020 at 2 pm on Zoom.

Please find more information below and on the attached flyer.

**Zoom registration:** http://go.hawaii.edu/39f

**Abstract:** Persistent homology is a tool in topological data analysis for learning about the geometric/topological structures in data by detecting holes of different dimensions and summarizing their appearance and disappearance scales in persistence diagrams. However, quantifying the uncertainty present in these summaries is challenging. In this talk, I will present a Bayesian framework for persistent homology by relying on the theory of point processes. This Bayesian model provides an effective, flexible, and noise-resilient scheme to analyze and classify complex datasets. A closed form of the posterior distribution of persistence diagrams based on a family of conjugate priors will be provided. The goal is to introduce a supervised machine learning algorithm using Bayes factors on the space of persistence diagrams. This framework is applicable to a wide variety of datasets. I will present an application to filament networks data classification of plant cells.

**Bio:** Farzana Nasrin graduated from Texas Tech University with a Ph.D. in Applied Mathematics in August 2018. Her research interests span algebraic topology, differential geometry, statistics, and machine learning. Currently, she holds an assistant professor position in the Department of Mathematics at UH Mānoa. Before coming to UHM, she worked as a postdoctoral research associate, funded by the ARO, in mathematical data science at UTK. She has been working on building novel learning tools that rely on the shape peculiarities of data, with applications to biology, materials science, neuroscience, and ophthalmology. Her dissertation involves the development of analytical tools for smooth shape reconstruction from noisy data and visualization tools for utilizing information from advanced imaging devices.
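
To make the idea of a persistence diagram concrete, here is a minimal, self-contained sketch (not from the talk, and far simpler than the tools it uses) that computes the 0-dimensional persistence pairs of a graph filtration with union-find: each vertex's component is born at the vertex's filtration value, and when an edge merges two components the younger one dies (the "elder rule"). Real analyses use dedicated libraries such as GUDHI or Ripser.

```python
# Minimal 0-dimensional persistent homology of a graph filtration
# (illustrative sketch only; not the Bayesian framework of the talk).

def h0_persistence(vertex_births, edges):
    """vertex_births: dict vertex -> birth time.
    edges: list of (time, u, v), processed in increasing time.
    Returns sorted (birth, death) pairs; components that never die
    are reported with death = float('inf')."""
    parent = {v: v for v in vertex_births}
    birth = dict(vertex_births)  # birth time of each component's root

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path compression
            v = parent[v]
        return v

    pairs = []
    for t, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru == rv:
            continue  # edge closes a cycle; no H0 event
        # Elder rule: the younger component (later birth) dies at time t.
        if birth[ru] > birth[rv]:
            ru, rv = rv, ru
        pairs.append((birth[rv], t))
        parent[rv] = ru  # merge younger component into older one
    # Components that survive the whole filtration live forever.
    roots = {find(v) for v in vertex_births}
    pairs.extend((birth[r], float('inf')) for r in roots)
    return sorted(pairs)

# Example: three points born at times 0, 1, 2, connected at times 3 and 4.
diagram = h0_persistence({'a': 0, 'b': 1, 'c': 2},
                         [(3, 'a', 'b'), (4, 'b', 'c')])
print(diagram)  # [(0, inf), (1, 3), (2, 4)]
```

Each finite pair (1, 3) and (2, 4) is one point of the persistence diagram; the uncertainty quantification discussed in the talk concerns distributions over such diagrams.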