There is something awesome about incredibly large finite numbers. People gush about infinity, but I find it to be a pretty boring concept compared to finite numbers too large to even be written in this universe.
Infinity is aspirational. Infinity is a concept, simple and self-evident, yet paradoxical and oxymoronic.
I get kind of intrigued by these large-number things at first, but ultimately it feels like a less elegant version of the same thing. It's all this mucking about with multiples and powers of multiples and powers when... we've already taken this to the limit, we can just stand back and marvel at that. What are we looking for? We already know you can always add another digit, so why invent more and more complicated ways to do that?
This isn't meant to contradict what you're saying or anything; it's just interesting to explore these different perspectives on which mathematical concepts capture the imagination.
I'm wondering if there's a connection between large number hunters, unwritten rule proponents in sports and games, and modular synth collectors. There's a sort of satisfaction derived from finding and defining order according to aesthetic structures that are largely arbitrary but also emotionally resonant.
Meanwhile, infinity is for those that embrace chaos, minimalism, nothingness.
Like, there are perfectly finite numbers that are so large that there simply isn't enough information in the universe to encode them in any format. How cool is that to just think about for a while?
I think such a number is going to have strange properties. For example, some number bigger than that unencodable number may itself be encodable, because a special pattern allows a special non-surjective recursive function to reach it. I am just wondering if there really is a smallest number for which no number greater than it is encodable.
It is not obvious to me that encodable functions have bounded growth: is it true that f(1) - f(0) for encodable f always has a bound given by the amount of data used to encode f? What is that bound?
The parent suggested that the number couldn't be encoded due to its largeness rather than its value. So while any number n with Kolmogorov complexity K(n) > 10^100 cannot be recursively encoded in the known universe, that number n need only be 10^100 bits long. On the other hand a number that's too large to be recursively encoded in the known universe would have to exceed BBλ2(10^100),
where BBλ2 is an optimal busy beaver for prefix-free binary programs [1].
Yes, I understood what the parent suggested. I am pointing out that such a number may have strange properties, like the fact that a number larger than it can have a smaller Kolmogorov complexity; then I am questioning whether there is a number such that every number larger than it has a Kolmogorov complexity too large to be encoded. The question therefore becomes: is there a limit to the size of physically describable numbers? Or is there always going to be some larger number with trivial Kolmogorov complexity?
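The "larger number, smaller complexity" point is easy to make concrete. A toy sketch in Python, using character count of an expression as a crude stand-in for description length (real Kolmogorov complexity is defined relative to a fixed universal machine; the particular expression here is just an arbitrary example):

```python
# Toy illustration: a number's magnitude and its description length are
# unrelated.  The expression below is 14 characters, yet it names a number
# with almost 20,000 decimal digits -- while a typical 100-digit number has
# no description much shorter than its own digits.

description = "2**(2**(2**4))"
tower = 2 ** (2 ** (2 ** 4))   # 2**65536

print(len(description))   # 14 characters
print(len(str(tower)))    # 19729 decimal digits
```

So a number astronomically larger than some 100-digit "random" number can still be far cheaper to write down, which is exactly why "too large to encode" and "too complex to encode" come apart.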
Postulate: You cannot define a largest physically describable number.
My assumption is that due to the very nature of Kolmogorov complexity (and other Gödel-related / halting-problem-related / self-referential descriptions), this is not an answerable or sensible question.
It falls under the same language-enabled recursion problems as:
- The least number that cannot be described in less than twenty syllables.
- The least number that cannot be uniquely described by an expression of first-order set theory that contains no more than a googol (10^100) symbols.
If you could encode it that way, then it's incoherent. After all, that encoding exists within the universe. If it resolved to a value, that would disqualify that value from being correct because of the self reference.
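A toy version of this Berry-style self-reference can even be run. This is only a sketch: the "description language" is made up (strings over ALPHABET, handed to Python's eval), and ALPHABET and MAX_LEN are arbitrary parameters chosen to keep the search tiny:

```python
from itertools import product

# Toy Berry-style search: find the least natural number that is NOT named
# by any "description" (valid arithmetic expression over ALPHABET) of
# length <= MAX_LEN.
ALPHABET = "0123+*"
MAX_LEN = 4

describable = set()
for length in range(1, MAX_LEN + 1):
    for chars in product(ALPHABET, repeat=length):
        expr = "".join(chars)
        try:
            value = eval(expr)          # toy evaluator; most strings fail
        except SyntaxError:
            continue
        if isinstance(value, int) and value >= 0:
            describable.add(value)

n = 0
while n in describable:
    n += 1
print(n)   # 7: no expression of length <= 4 over "0123+*" equals 7
```

Note the search loop is itself a short description of its answer, but it is written outside the toy language. The paradox in the two bulleted examples only bites when the description language is expressive enough to describe the search over its own descriptions, which is exactly the self-reference being pointed out here.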
Not really, as that implies that you have a list of numbers that can be encoded within the universe, but the universe would run out of room keeping that list.
There is enough information if you assume reality is continuous. Pick a point A to be the origin. Then you can encode the number N by placing something 1/N meters away from the origin.
Relativistic speeds can contract the length of any object as measured by an outside observer. If an object the size of 1 Planck length travels fast enough, you won't be able to measure it, as from your position it would be smaller than the Planck length as it passes by.
It's not impossible (afaik) for things to be smaller than the Planck length. We just don't have the ability (maybe ever) to measure something smaller than this limit.
Now, good luck finding something the size of 1 Planck length, let alone accelerating it to relativistic speeds.
Frustratingly, attempts to discretize space invariably run into problems with relativity, since they effectively impose a preferred frame of reference. I.e. you can impose a minimum distance, but relativistic length contraction means that observers measure different minima and in different directions.
Apparently, under some of these models, the speed of light ends up depending on wavelength, which makes them empirically testable. My understanding is that these discrete-space models have failed to line up with experiment, at least within the limits of measurement.
That's currently unknown. For all current practical purposes, kind of. The Planck length sets a limit on the spatial resolution of any information, so a finite region with (universally) bounded entropy per conceivable bucket at that scale still has finite entropic capacity.
The current theories use continuous space and time. However, we can't encode information into space itself. We would have to use some configuration of matter, and then there are limits to how well-defined a particle's position can be coming from the uncertainty principle.
On the other hand, general relativity implies that if you put enough matter in a small enough space it becomes a black hole, and then we can't access the information in it.
IANA physicist but I think this line of thought is probably speculative at the moment because it involves both general relativity and quantum mechanics and it isn't known how they should work together.
For me it is far more interesting to consider the number of atoms in the solar system, as it implies a pretty obvious limit on the data storage that humanity can meaningfully use. Obviously only a tiny fraction of those atoms can be used to store information, and furthermore the energy required to actually perform that storage is enormous compared to the energy we have available at present.
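The atom count itself is easy to ballpark. A back-of-envelope sketch, assuming the Sun dominates the solar system's mass; both constants are rough order-of-magnitude values, and "one bit per atom" is wildly optimistic:

```python
# Rough arithmetic: atoms in the solar system, assuming the Sun dominates
# (~2e30 kg, mostly hydrogen at ~1.67e-27 kg per atom).
SUN_MASS_KG = 2e30            # ~ one solar mass
HYDROGEN_ATOM_KG = 1.67e-27   # ~ mass of one hydrogen atom

atoms = SUN_MASS_KG / HYDROGEN_ATOM_KG
print(f"{atoms:.1e} atoms")   # ~1.2e+57 atoms

# Even at one bit per atom, that caps storage at ~1.2e57 bits -- enough to
# write down numbers only up to about 2**(1.2e57), however clever the encoding.
```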
Would you like to also be able to communicate about this number? You might have to reserve some atoms to form a being that could actually enjoy it. Considering such a perspective, the decoder for observing it should probably be embedded in the number itself.
You are stretching the definition of "is" here. Almost all finite numbers are impossible to actually count up to, so they exist only in the same sense that infinite numbers exist.