Title: On the descriptive complexity of Fourier dimension and Salem sets

by Manlio Valenti (Università di Udine) as part of Computability theory and applications

Abstract

It is known that, for Borel sets, the Fourier dimension is less than or equal to the Hausdorff dimension. The sets for which the two notions agree are called Salem sets.

In this talk, we explore the descriptive complexity of the family of closed Salem subsets of Euclidean space. We also show how these results yield a characterization of the Weihrauch degree of the maps computing the Hausdorff or Fourier dimension.
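For context, the standard definitions behind the abstract (not stated there explicitly) can be summarized as follows. For a Borel set $A \subseteq \mathbb{R}^d$, the Fourier dimension is defined via the decay of Fourier transforms of probability measures supported on $A$:

```latex
\dim_F A \;=\; \sup\bigl\{ s \in [0,d] \;:\; \exists \mu \in \mathcal{P}(A)
    \text{ with } |\widehat{\mu}(\xi)| = O(|\xi|^{-s/2}) \text{ as } |\xi| \to \infty \bigr\},
```

and one always has $\dim_F A \le \dim_H A$, where $\dim_H$ is the Hausdorff dimension; $A$ is a Salem set when $\dim_F A = \dim_H A$.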

Title: Accounting with $\mathbb{QP}^\infty$

by Wayne Lewis (University of Hawaiʻi) as part of Topological Groups

Lecture held in Elysium.

Abstract

Rational projective space provides a useful accounting tool in engineering decompositions of $\mathbb{Q}[x]$ for desired effect. The device is useful for defining a correspondence between summands of such a decomposition and elements of a partition of $\mathbb{A}$. This mechanism is applied to a decomposition of $\mathbb{Q}[x]$ relative to which the correspondence gives the $Lenstra$ $ideal$ $E$, a closed maximal ideal yielding the $adelic$ $numbers$ $\mathbb{F}=\frac{\mathbb{A}}{E}$.

Title: The interplay between randomness and genericity

by Laurent Bienvenu (Université de Bordeaux) as part of Computability theory and applications

Abstract

In computability theory, one often thinks of (Cohen) genericity and algorithmic randomness as orthogonal notions: a truly random real will look very non-generic, and a truly generic real will look very non-random. This orthogonality is best exemplified by the result of Nies, Stephan and Terwijn that any 2-random real and any 2-generic real form a minimal pair for Turing reducibility. On the other hand, we know from the Kučera–Gács theorem that for any n there is a 1-random real which computes an n-generic one, but also (and more surprisingly), by a result of Kautz, that every 2-random real computes a 1-generic real. These last two results tell us that the interplay between randomness and genericity is rather complex when “randomness” lies between 1-randomness and 2-randomness, or “genericity” between 1-genericity and 2-genericity. It is this gray area that we will discuss in this talk (based on the paper of the same title, joint work with Chris Porter).

Title: Journey in Hawaii’s Challenges in the Fight Against COVID-19

by Monique Chyba (University of Hawaiʻi) as part of Topological Groups

Lecture held in Elysium.

Abstract

The COVID-19 pandemic is far from the first infectious disease that Hawaiʻi has had to deal with. During the 1918-1920 Influenza Pandemic, the Hawaiian islands were not spared as the disease ravaged the whole world. Hawaiʻi and similar island populations can follow a different course of pandemic spread than large cities, states, or nations, and are often neglected in major studies. It may be too early to compare the 1918-1920 Influenza Pandemic and the COVID-19 pandemic, but we do note some similarities and differences between the two.

Hawaiʻi and other US islands have recently been noted by the media as COVID-19 hotspots after a relatively calm period of low case rates. U.S. Surgeon General Jerome Adams came in person to Oahu on August 25 to address the alarming situation. We will discuss the peculiarities of the situation in Hawaiʻi and provide detailed modeling of current virus spread patterns aligned with the dates of lockdowns and similar measures. We will present a detailed epidemiological model of the spread of COVID-19 in Hawaiʻi and explore the effects of different intervention strategies in both a prospective and a retrospective fashion. Our simulations demonstrate that controlling the spread of COVID-19 requires both action by the State, in terms of testing, contact tracing, and quarantine facilities, and individual action by the population, in terms of compliance with mask wearing and limits on gatherings. They also explain the turn for the worse Oahu took after a very successful stay-at-home order back in March.
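The abstract does not specify the model's equations. As a minimal illustration of the kind of compartmental modeling described, here is a standard SEIR model with a contact rate that drops when a lockdown begins; all parameter values below are hypothetical placeholders, not the authors' calibrated values for Hawaiʻi.

```python
# Minimal SEIR sketch with a lockdown intervention (hypothetical parameters,
# not the model from the talk). S: susceptible, E: exposed, I: infectious,
# R: removed. Integrated with a simple Euler scheme.

def simulate_seir(days=180, n=1_400_000, beta=0.35, sigma=1/5, gamma=1/10,
                  lockdown_start=30, lockdown_factor=0.4, dt=0.1):
    """Return a list of (day, infectious) pairs; beta is multiplied by
    lockdown_factor once t reaches lockdown_start."""
    s, e, i, r = n - 10.0, 0.0, 10.0, 0.0
    history = []
    for step in range(int(days / dt)):
        t = step * dt
        b = beta * lockdown_factor if t >= lockdown_start else beta
        new_exposed = b * s * i / n * dt      # S -> E
        new_infectious = sigma * e * dt       # E -> I
        new_removed = gamma * i * dt          # I -> R
        s -= new_exposed
        e += new_exposed - new_infectious
        i += new_infectious - new_removed
        r += new_removed
        if step % int(1 / dt) == 0:           # record once per day
            history.append((round(t), i))
    return history

traj = simulate_seir()
peak_day, peak_i = max(traj, key=lambda p: p[1])
print(f"peak infectious ~ {peak_i:.0f} on day {peak_day}")
```

Shifting `lockdown_start` or `lockdown_factor` reproduces, in miniature, the prospective/retrospective intervention comparisons the abstract describes.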

Title: Randomness notions and reverse mathematics

by Paul Shafer (University of Leeds) as part of Computability theory and applications

Abstract

There are many notions of algorithmic randomness in addition to classic Martin-Löf randomness, such as 2-randomness, weak 2-randomness, computable randomness, and Schnorr randomness. For each notion of randomness, we consider the statement “For every set Z, there is a set X that is random relative to Z” as a set-existence principle in second-order arithmetic, and we compare the strengths of these principles. We also show that a well-known characterization of 2-randomness in terms of incompressibility can be proved in RCA_0, which is non-trivial because it requires avoiding the use of $\Sigma^0_2$ bounding.

Title: A family of metrics connecting Jaccard distance to normalized information distance

by Bjørn Kjos-Hanssen (University of Hawaii at Manoa) as part of Computability theory and applications

Abstract

Distance metrics are used in a wide variety of scientific contexts. In a 2001 paper in the journal Bioinformatics, M. Li, Badger, Chen, Kwong, and Kearney introduced an information-based sequence distance. It is analogous to the famous Jaccard distance on finite sets. Soon thereafter, M. Li, Chen, X. Li, Ma and Vitányi (2004) rejected that distance in favor of what they call the normalized information distance (NID). Raff and Nicholas (2017) proposed a return to the Bioinformatics distance, based on further application-informed criteria.

We attempt to shed some light on this “dispute” by showing that the Jaccard distance and the NID analogue form the extreme points of the set of metrics within a family of semimetrics studied by Jiménez, Becerra, and Gelbukh (2013).
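The two endpoints of this “dispute” can be illustrated concretely. Below is a hedged sketch: the Jaccard distance on finite sets, and the normalized compression distance (NCD) of Cilibrasi and Vitányi, which is the standard computable approximation of the NID with `zlib` compression standing in for Kolmogorov complexity.

```python
import zlib

def jaccard_distance(a: set, b: set) -> float:
    """Jaccard distance: 1 - |A ∩ B| / |A ∪ B| (0 for two empty sets)."""
    if not a and not b:
        return 0.0
    return 1 - len(a & b) / len(a | b)

def c(data: bytes) -> int:
    """Compressed length: a computable stand-in for Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance, approximating the NID:
    (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
    cx, cy, cxy = c(x), c(y), c(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

print(jaccard_distance({1, 2, 3}, {2, 3, 4}))  # 0.5
x = b"abracadabra" * 20
print(ncd(x, x) < ncd(x, b"xyzzy" * 44))       # similar strings score lower
```

This is only the two extreme points; the intermediate semimetrics of the Jiménez–Becerra–Gelbukh family interpolating between them are not reproduced here.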

The NID is based on Kolmogorov complexity, and Terwijn, Torenvliet and Vitányi (2011) showed that it is neither upper semicomputable nor lower semicomputable. Our result gives a 2-dimensional family including the NID as an extreme point. It would be interesting to know if any of these functions are semicomputable.