Yet we know for a fact that irrational numbers exist, and some of them are even trivial to grasp (like pi, which they mention calculating all the digits of, which is weird on its own since pi has more than aleph-0 digits in any numbering system, because it's, well, an irrational number).
But writing a coherent story around non-countable infinity is so much harder because our brains struggle to grasp that concept altogether.
Basically, my point is that our brains have a few limitations which we work around by simply ignoring the stuff that does not compute. And there are much more approachable things that belong in that category, like the ratio of a circle's circumference to its radius.
Yet even in the quantum future, it's hard to imagine a "real" number of universes, because we are so bound to the countable numbers.
I'd like to see a story go that far, but it's likely not to be very readable because humans today don't think of irrational numbers as irrational.
Hmm, how so? If a number is represented by aleph-0 digits (in the decimal system), it is clearly countable, since aleph-0 is a countable infinity (equivalent to the infinity of the natural numbers).
I guess you may have confused "countable" with "rational" somewhere (or maybe "periodic" or "algebraic" or "computable" or ...).
Irrational means: cannot be expressed as "integer divided by integer".
As a consequence, the decimal digits of a rational number start periodically repeating at some point. That is because, as you keep dividing, the remainder can only be a number between zero and the denominator minus one, which is a finite number of options; so once you get the same remainder a second time, the loop restarts.
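The remainder argument above can be sketched directly: here's a small long-division routine (names are my own, just for illustration) that tracks remainders and splits the decimal expansion into its non-repeating prefix and its repeating cycle.

```python
def decimal_digits(numerator, denominator):
    """Long division of numerator/denominator, returning the digits after
    the decimal point as (non_repeating_prefix, repeating_cycle).
    A repeated remainder means the digits are about to loop, because the
    remainder fully determines all further digits."""
    digits = []
    seen = {}  # remainder -> index in `digits` where it first appeared
    remainder = numerator % denominator
    while remainder != 0 and remainder not in seen:
        seen[remainder] = len(digits)
        remainder *= 10
        digits.append(str(remainder // denominator))
        remainder %= denominator
    if remainder == 0:
        return "".join(digits), ""  # terminating decimal, e.g. 1/4 = 0.25
    start = seen[remainder]
    return "".join(digits[:start]), "".join(digits[start:])
```

For example, `decimal_digits(1, 7)` gives `("", "142857")` (0.142857142857...), and `decimal_digits(1, 6)` gives `("1", "6")` (0.1666...). Since there are at most `denominator` distinct remainders, the loop always terminates.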
Therefore, if the decimal digits do not repeat in loop after some point, the number is irrational. This is true regardless of whether the pattern of decimal digits is something complex, or something quite simple but not exactly a loop; for example "1.101001000100001000001..." would also be irrational (i.e. not a fraction of two integers).
Technically, individual real numbers cannot be "countable"; that adjective only applies to sets (and to ordinals or cardinals, but those are not real numbers). In standard math (i.e. not hyperreal numbers), every real number in decimal expansion has a finite, or countably infinite, number of digits. Countably infinite here means that you can, literally and straightforwardly, count the decimal digits: "this is the first decimal digit", "this is the second decimal digit", etc.
Then there is the question of whether we could write an algorithm that prints those digits. Obviously, for rational numbers, we could: print the (finite) part before the infinite loop, then keep printing the (finite) contents of the (infinite) loop. We could also do it for some irrational numbers, such as the "1.101001000100001000001...". Even for pi, e.g. using a Taylor series. However, for most real numbers, which are effectively just infinite sequences of random digits, we can't do that.
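For the "1.101001000100001..." example, such an algorithm is only a few lines: a generator (a sketch; the function name is mine) that yields a 1 followed by ever-longer runs of zeros. It prints digits forever, yet the digits never settle into a fixed repeating cycle, which is exactly why the number is irrational despite being computable.

```python
from itertools import islice

def pattern_digits():
    """Yields the digits after the decimal point of 1.101001000100001...:
    a 1 followed by n zeros, for n = 1, 2, 3, ...
    Computable, but never periodic, hence irrational."""
    n = 1
    while True:
        yield 1
        yield from [0] * n  # the zero-run grows each time, so no fixed loop
        n += 1

# First 12 digits after the decimal point:
print("".join(str(d) for d in islice(pattern_digits(), 12)))  # 101001000100
```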
tl;dr -- all real numbers have countable (or finite) number of decimal digits