Papers in reverse chronological order.
Summary. Shows that the effective Hausdorff dimension of base-b expansions and the effective continued fraction dimension do not coincide for every real, contrary to expectations.
Summary. Answers, assuming one-way functions exist, an open question posed by J. Hitchcock and V. Vinodchandran (CCC 2003) in the negative: if one-way functions exist, then two notions of polynomial-time dimension, one defined using Kolmogorov complexity rates and the other using betting algorithms called gales, are inequivalent. This is the first setting in which effective dimension is known to be non-robust. We also show that the hypothesis is essentially necessary: if there are sufficiently many sequences on which these notions differ (i.e., if some P-samplable distribution assigns positive probability to this set), then infinitely-often (i.o.) one-way functions can be constructed. This adds to the recent literature on meta-complexity.
Summary. Generalizes classical and effective Hausdorff dimension to a larger family of coverings, which includes base-b expansions, Cantor series expansions, and continued fraction expansions of reals. A cover family is said to be faithful if, for every set, the dimension computed using these covers equals its Hausdorff dimension. We investigate whether classical dimension is preserved precisely when effective pointwise dimension is preserved, by proving a "point-to-set principle" for faithfulness.
Summary. Introduces a robust, relativized notion of finite-state dimension. Using this, we show that van Lambalgen's theorem holds for finite-state randoms (i.e., normal numbers). This is the first "resource-bounded" setting in which van Lambalgen's theorem is known to hold. We also obtain a stronger version of Wall's theorem on arithmetic progression (A.P.) subsequences of normal numbers, one which characterizes normality, unlike Wall's original theorem.
Summary. Generalizes the Weyl criterion for normality (1916), which applies to almost all reals (namely, the normal ones), so that it applies to all reals without exception. (This turns out to be highly non-trivial.) Using the theory of weak convergence of probability measures and the Rényi dimension of weak limits, we show how Weyl exponential averages, even divergent ones, relate to finite-state dimension (finite-state compressibility).
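For reference, the classical criterion being generalized can be stated as follows (a standard formulation of Weyl's criterion specialized to base-b normality, not a statement taken from the paper):

```latex
% Weyl's criterion, specialized to base-b normality:
% a real x is normal to base b if and only if, for every nonzero integer h,
\lim_{N \to \infty} \frac{1}{N} \sum_{n=0}^{N-1} e^{2\pi i \, h \, b^{n} x} = 0 .
```

The paper's contribution concerns what the behavior of these exponential averages reveals when the limit above fails to exist or is nonzero.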
Summary. Answers an open question posed by Lutz (2012) and by Lutz and Mayordomo (2020). Just as there are absolutely normal numbers, and numbers with finite-state dimension 0 in every base, they asked: are there numbers with the same intermediate finite-state dimension s, 0 < s < 1, in every base b ≥ 2? We give an explicit construction of such a number, showing more generally that any reasonable set of dimension requirements for different bases can be met by a single real.
(This paper got a mention in Richard Lipton's blog.)
Summary. Studies ergodic theorems in resource-bounded settings for the first time. We focus on ergodic averages of PSPACE functions on PSPACE and SUBEXP-space randoms, establishing sufficient conditions for ergodicity on the former and characterizations on the latter.
Summary. S. S. Pillai showed that base-b normality can be equivalently defined using "overlapping" occurrences of blocks or disjoint occurrences of blocks: the two notions agree for all base-b normals. We prove the analogous equivalence for continued fraction normality. We demonstrate its utility by giving a combinatorial proof of Heersink and Vandehey's surprising result that arithmetic progression subsequences of continued fraction normals are never normal.
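The two counting conventions that Pillai's theorem relates can be illustrated, in the base-b setting, with a short sketch (illustrative helper names, not code from the paper):

```python
def overlapping_count(s: str, block: str) -> int:
    """Count occurrences of `block` starting at every position (overlapping)."""
    k = len(block)
    return sum(1 for i in range(len(s) - k + 1) if s[i:i + k] == block)

def disjoint_count(s: str, block: str) -> int:
    """Count occurrences of `block` only at positions aligned to multiples of its length."""
    k = len(block)
    return sum(1 for i in range(0, len(s) - k + 1, k) if s[i:i + k] == block)

# In "1111", the block "11" occurs 3 times with overlaps but only 2 times disjointly.
```

Normality asks that the limiting frequency under one convention be b^(-k) for every length-k block; Pillai's result is that either convention yields the same class of normal numbers.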
Summary. We define the notion of a computably random continued fraction (continued fraction c.e. randomness was previously known). We show that these are precisely the computable randoms. Further, we define the effective Hausdorff dimension of individual continued fractions and give explicit constructions of continued fractions with dimension 0 and dimension 1.
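For concreteness, the continued fraction digits whose randomness and dimension are studied here can be extracted by iterating the Gauss map; a minimal sketch for rational inputs (a hypothetical helper, not the paper's machinery):

```python
from fractions import Fraction

def cf_digits(x: Fraction, max_terms: int = 30) -> list:
    """Continued fraction digits [a0; a1, a2, ...] of a rational x (terminates)."""
    digits = []
    for _ in range(max_terms):
        a = x.numerator // x.denominator   # integer part (floor)
        digits.append(a)
        frac = x - a
        if frac == 0:
            break
        x = 1 / frac                       # Gauss map step on the fractional part
    return digits

# For example, 355/113 has continued fraction expansion [3; 7, 16].
```

For irrational reals the digit sequence is infinite, and it is this sequence on which randomness and dimension notions are imposed.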
Summary. We answer an open question of Barmpalias and Lewis-Pye (2014), showing that there is a weakly 2-generic that bounds a minimal degree. This implies that the weakly 2-generics are not dense. Earlier, Jockusch had shown that 2-generics are dense, and Kumabe, and Chong and Downey, had shown that 1-generics are not dense.
Summary. This paper shows that the Furstenberg multiple recurrence theorem holds effectively: for systems that are suitably computable and satisfy the Furstenberg conditions, every Martin-Löf random is multiply recurrent in effectively closed sets.
Summary. This paper adapts the theory of effective dimension, developed by Lutz and others, to quantify the amount of randomness present in pseudorandom distributions.
Summary. We show that resource-bounded versions of van Lambalgen's theorem may fail, and that the mode of failure depends on the notion of resource-bounded randomness used (compressibility versus betting). The paper also contains two new proofs of the classical van Lambalgen theorem.
Summary. Gives a nearly linear-time construction of numbers in a certain class of transcendental numbers, the Liouville numbers, which are normal, i.e., look random to finite-state compressors. The result is surprising since the standard examples of Liouville numbers are all highly compressible, so it was conceivable that every Liouville number is compressible. The construction contains two key insights: first, the "padding" inside a Liouville number need not consist of zeroes; it can be any pattern, even a random-looking one. Second, by using rationals with repeating non-zero blocks as approximations (instead of rationals truncated with zeroes), we can control the Diophantine approximation.
We also construct Liouville numbers of intermediate compressibility, strictly between 0 and 1.
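The defining Diophantine property of a Liouville number can be sketched using the classical Liouville constant, which (in contrast to the normal examples constructed in the paper) is highly compressible; the helper names below are illustrative, not from the paper:

```python
from fractions import Fraction
from math import factorial

# The classical Liouville constant sum_{k>=1} 10^(-k!), truncated to 5 terms.
TERMS = 5
L = sum(Fraction(1, 10 ** factorial(k)) for k in range(1, TERMS + 1))

def witness(n: int):
    """The n-term partial sum p/q, with q = 10^(n!), approximates L to within 1/q**n."""
    q = 10 ** factorial(n)
    p = sum(q // 10 ** factorial(k) for k in range(1, n + 1))
    return p, q

# Liouville's defining property: for every n there is a rational p/q
# with |L - p/q| < 1/q**n, i.e., rational approximations of every order.
for n in range(1, TERMS):
    p, q = witness(n)
    assert abs(L - Fraction(p, q)) < Fraction(1, q ** n)
```

In the standard constant the digits between the 1s at positions k! are all zeroes; the paper's first insight is that this padding can instead carry an incompressible pattern while the approximation property above is preserved.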
(See: reading list on algorithmic information theory, by C. Shalizi.)