The Transcendentality of Normal Numbers

Here, we investigate a conjecture of mine: that all normal numbers are transcendental.

It seems that most people in fact believe the opposite, and indeed believe something stronger in the opposing direction: that every irrational algebraic number is normal. Thus, if few people have embarked on investigating the alternative possibility, my conjecture may be within reach.

This would extend a theme that shows up in the properties of various kinds of numbers. One part of this theme can be informally summarized as: exoticness is the norm. Numbers that are “well-behaved” in some way end up taking up less “space” among the real numbers than intuition would initially suggest (they are countable, or they have density zero, or something like that).

Elaborating more, we have:

1. The rational numbers are “well-behaved” in terms of the predictability of their digit expansions, yet they form only a countable subset of the reals.

2. The rational numbers are a subset of the algebraic numbers, which are “well-behaved” in terms of being solutions of polynomial equations, yet the algebraic numbers still form only a countable subset of the reals.
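The countability of the rationals can even be witnessed constructively. As a quick illustration (a side sketch, not part of the argument above), the Calkin-Wilf sequence enumerates every positive rational exactly once via a single simple recurrence:

```python
from fractions import Fraction
from math import floor

def calkin_wilf(n: int):
    """Yield the first n terms of the Calkin-Wilf sequence, which visits
    every positive rational exactly once: a direct witness that the
    positive rationals are countable."""
    q = Fraction(1)
    for _ in range(n):
        yield q
        q = 1 / (2 * floor(q) - q + 1)  # the Calkin-Wilf recurrence

print([str(q) for q in calkin_wilf(6)])  # ['1', '1/2', '2', '1/3', '3/2', '2/3']
```

Because every positive rational appears exactly once, the sequence itself is a bijection between the positive integers and the positive rationals.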

Now, my conjecture is equivalent to the statement that all algebraic numbers are non-normal. If that’s the case, then the algebraic numbers would be a subset of the non-normal numbers, which are well-behaved in terms of having more predictable digit expansions (for example, the non-normal numbers include the rationals). In fact, in the expansion of a normal number, every digit string of a given length must show up with the same limiting frequency, which in a sense means no predictability of which string comes next, or, if we look at a given subsection of the expansion, no predictability of the value of that subsection. It is known, however, that the non-normal numbers, while uncountable, have measure zero, implying that the “exotic” property of normality is in fact the most common.
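To make the definition concrete: in base b, normality means every length-k digit string appears with limiting frequency b^(-k). Here is a quick empirical sketch (an illustration, not a proof) counting block frequencies in a prefix of Champernowne’s constant 0.123456789101112…, a standard example known to be normal in base 10:

```python
from collections import Counter

def champernowne_prefix(n: int) -> str:
    """First n decimal digits of 0.123456789101112..., formed by
    concatenating the positive integers written in base 10."""
    out, total, i = [], 0, 0
    while total < n:
        i += 1
        s = str(i)
        out.append(s)
        total += len(s)
    return "".join(out)[:n]

def block_frequencies(digits: str, k: int) -> dict:
    """Relative frequency of each length-k block, counted over overlapping
    windows. For a number normal in base 10, each frequency tends to
    10**-k in the limit."""
    windows = len(digits) - k + 1
    counts = Counter(digits[i:i + k] for i in range(windows))
    return {block: c / windows for block, c in counts.items()}

freqs = block_frequencies(champernowne_prefix(200_000), 1)
for digit in sorted(freqs):
    print(digit, round(freqs[digit], 3))
```

Convergence is slow (finite prefixes are biased toward the leading digits of the integers being concatenated), but each single-digit frequency does tend to 1/10 in the limit.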

(The terminology sounds opposite to this, since here the property of normality is what is “exotic.” The existing terminology makes sense given the “cleanness” of the definition of a normal number, similar to the “cleanness” of the definition of an equidistributed sequence, but in this context it’s actually non-normality that signals good behavior.)

The idea that exoticness is the norm is already known for non-normal numbers, but an additional idea we can see here is that the “well-behaved” sets of numbers end up nested wholly within one another. If true, the conjecture would contribute further to this.

For these “well-behaved” sets, another theme we can notice is that fundamental “natural” constants like pi and e end up being exotic even as we loosen the standard for “well-behaved” and correspondingly tighten the standard for “exotic.” It was first proven that pi and e are irrational, and later shown that they are in fact transcendental, which signifies a greater level of exoticness.

Another example of a “well-behaved” set of numbers is the computable numbers, and in keeping with the theme that well-behaved sets are rarer and ordered by inclusion, it is known that the algebraic numbers are a subset of the computable numbers and that the computable numbers are countable. However, pi and e are clearly computable since approximation algorithms to arbitrary accuracy are known for them, so the theme that fundamental constants are exotic doesn’t extend this far.
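To make “computable” concrete, here is a minimal sketch (one of many known schemes; this one uses Machin’s classical formula) that approximates pi to any requested number of decimal places with exact integer arithmetic:

```python
def pi_digits(precision: int) -> str:
    """Approximate pi to `precision` decimal places using Machin's formula
    pi/4 = 4*arctan(1/5) - arctan(1/239), in fixed-point integer arithmetic.
    A terminating loop for any requested precision is exactly what
    "computable" asks for."""
    guard = 10                          # extra digits to absorb rounding error
    scale = 10 ** (precision + guard)

    def arctan_inv(x: int) -> int:
        # arctan(1/x) * scale via the alternating Taylor series
        power = scale // x              # scaled x**-(2n+1) term
        total, n = 0, 0
        while power:
            term = power // (2 * n + 1)
            total += term if n % 2 == 0 else -term
            power //= x * x
            n += 1
        return total

    pi_scaled = 4 * (4 * arctan_inv(5) - arctan_inv(239))
    s = str(pi_scaled // 10 ** guard)   # drop the guard digits
    return s[0] + "." + s[1:]

print(pi_digits(30))  # 3.141592653589793238462643383279
```

The same structure (a series plus a proven error bound) is what underlies computability proofs for e and the other familiar constants.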

I suspect that the computable numbers are a subset of the non-normal numbers; then all algebraic numbers would be non-normal, establishing my conjecture and further affirming the theme that well-behaved sets are rare and ordered by inclusion. Equivalently, all normal numbers are uncomputable, which is intuitively quite immediate: if we can’t predict a given subsection of the number’s digit expansion, how could we compute the expansion to arbitrary accuracy? As a possible proof sketch: if we had a computable normal number x, we could use the algorithm witnessing computability to predict subsections better than chance, which would hopefully imply non-uniform probabilities for what shows up, and in turn non-uniform densities of digit strings.

Since this intuition seems quite solid, I’m surprised that this hasn’t been proven yet. (And it definitely hasn’t been: otherwise people would know that normal numbers must be transcendental, and they wouldn’t wonder about the normality of irrational algebraic numbers. Even the normality of specific and very common irrational algebraic numbers like sqrt(2) is unknown.) For whatever reason, people may simply not have considered the possibility that normal numbers must be transcendental, and thus may not have investigated supporting or adjacent lines of research.

In fact, establishing that normal numbers are not computable would go beyond my conjecture, and would settle a lot of outstanding conjectures about normal numbers. For example, it would imply that pi and e are not normal, as well as related numbers like pi+e and pi*e. So let’s see if we can use our intuition to establish this.

On second thought, my intuition seems suspect if we look at common examples of normal numbers. Indeed, intuitively speaking, any number that we could “describe” would be computable, and the main known examples of uncomputable numbers come from “non-constructive” methods based on the halting problem. There are many numbers that we can artificially construct to be normal, like Champernowne’s constant (0.123…) or the various other examples listed in the Wikipedia article on normal numbers. Based on their construction, they certainly seem like they would be computable, so it’s unclear whether our intuition holds up here.

The main discrepancy seems to be this: the (informal) idea that “frequency corresponds to probability” is predicated on assumptions about what our prior known information is. If we’ve “described” a number in a way that forces the frequencies to make it normal, we can use this description to “predict,” or directly derive, the value of any subsection of the digit expansion. For example, we can clearly derive the value of any subsection of Champernowne’s constant. The lack of predictability only arises when we don’t know prior information about the number. If we were given a number x along with the limiting frequency of every string, and no other information about x, then x being normal would offer the least predictability of subsections of x. Given more specific information about x, however, we could eliminate much of this unpredictability while still keeping the string frequencies, and hence normality.
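To make the Champernowne example concrete, here is a sketch that derives any digit of the constant directly from its position, with no need to scan the expansion up to that point:

```python
def champernowne_digit(n: int) -> int:
    """Digit n (1-indexed) after the decimal point of 0.123456789101112...,
    derived directly from the block structure: the k-digit integers
    contribute 9 * 10**(k-1) * k digits in total."""
    k, count = 1, 9                 # count = how many k-digit integers exist
    while n > count * k:
        n -= count * k              # skip past the whole k-digit block
        k += 1
        count *= 10
    number = 10 ** (k - 1) + (n - 1) // k    # which integer we landed in
    return int(str(number)[(n - 1) % k])     # which of its digits

print("".join(str(champernowne_digit(i)) for i in range(1, 16)))  # 123456789101112
```

So every subsection of the expansion is fully predictable from the description, even though the string frequencies are exactly those of a base-10 normal number.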

It thus seems very likely that we could produce normal numbers that are computable. Let’s see if we can do this, to validate our latest intuition. And let’s see if we can nevertheless use a different method to show that normal numbers are transcendental.

Before we try this, let’s discuss some implications for our initial discussion. If we can produce computable normal numbers, then the computable numbers cannot be a subset of the non-normal numbers. Does this mean that the idea that “well-behaved sets are ordered by inclusion” is violated? Do we know that the non-normal numbers aren’t a subset of the computable numbers? Well, we know that the non-normal numbers are uncountable while the computable numbers are countable. This means that the non-normal numbers and the computable numbers can’t be ordered against each other. Consequently, the normal and non-computable numbers can’t be ordered against each other either. What about the normal and computable numbers (or, taking complements, the non-normal and non-computable numbers)? Well, there certainly exist non-normal computable numbers (like the rationals), and there certainly exist normal non-computable numbers (like Chaitin’s constants). Thus, this would violate well-behaved sets being ordered by inclusion.

In retrospect, given that this idea came from an observation and wasn’t really justified further, it seems like something that wasn’t too likely to hold up under scrutiny. In fact, there are so many different contexts and directions from which we could formalize particular incarnations of “well-behaved” that expecting them all to be related is a long shot, especially without more conditions on how these contexts and directions relate to each other. For example, the fact that the rational numbers are a subset of the algebraic numbers is clear from the very definition of algebraic: it’s based in the first place on “extending” the rational numbers, via polynomials that are explicitly required to have rational coefficients. Similarly, the fact that the algebraic numbers are contained within the computable numbers is clear from the very definition of computable and the fact that any polynomial can be evaluated to arbitrary precision. It would be a long shot to expect this theme of inclusion to continue for other “well-behaved” sets, especially when their particular definitions don’t hint at it.

So it’s not clear anymore whether we should intuitively expect algebraic numbers to be contained within the non-normal numbers, especially if we don’t have further intuition that could suggest this inclusion. Nevertheless, given that it seems to be a possibility that has been less explored, and given how little progress we’ve made in understanding normal numbers (for example, for no particular irrational algebraic number have we shown either normality or non-normality), it still may be worth thinking about and investigating as a research direction.

The main (informal) theme that is holding up regardless is that “exoticness is the norm”: if we consider normal numbers to be exotic based on their unpredictability given just their digit-string frequencies, then it is already known that almost all real numbers are normal. This is easier to motivate: since the reals “add so much more” by “filling in all the holes” of the continuum, any notion of “well-behaved”-ness (or at least any that doesn’t involve the concept of continuity) will “fall short” when compared with the “full” continuum. The continuum includes “all the points in the middle,” which outnumber anything that non-continuous notions of “well-behaved” could “possibly reach.”

OK, we’ve been discussing a lot of intuition, which is a great guide, but let’s see if we can prove some of our formal claims. Specifically, let’s show that there are normal numbers that are computable. (We mentioned some examples, but technically those are only known to be normal in a particular base, so we’ve so far established that there are numbers normal in a particular base that are computable. What about numbers that are normal in all bases?)

Actually, the Wikipedia article states that this is known: a 2002 paper of Becher and Figueira shows that there is a computable, absolutely normal number. (Given the relative lateness of that date, it seems this might in fact be hard to prove.) Thus, this point is settled.

It remains to further investigate the transcendentality of normal numbers.
