It's always been pretty mind-blowing to me that you can store over 225 bits of information just in the ordering of a single deck of cards. A standard way to do this is a factoradic encoding like the Lehmer code:
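A minimal sketch of that factoradic mapping, assuming a deck represented as a permutation of 0..51 (function names are my own, not from any particular library): each position contributes "how many later cards are smaller", weighted by a factorial.

```python
from math import factorial

def lehmer_encode(perm):
    """Map a permutation (e.g. a deck order) to one integer in [0, n!)
    via its Lehmer code: for each position, count how many later
    elements are smaller, then weight by the factorial of the
    remaining length."""
    n = len(perm)
    code = 0
    for i, x in enumerate(perm):
        smaller = sum(1 for y in perm[i + 1:] if y < x)
        code += smaller * factorial(n - 1 - i)
    return code

def lehmer_decode(code, n):
    """Inverse: recover the permutation of range(n) from its integer."""
    items = list(range(n))
    perm = []
    for i in range(n):
        f = factorial(n - 1 - i)
        idx, code = divmod(code, f)
        perm.append(items.pop(idx))
    return perm

deck = list(range(52))
assert lehmer_encode(deck) == 0                        # sorted deck -> 0
assert lehmer_encode(deck[::-1]) == factorial(52) - 1  # reversed deck -> max
assert lehmer_decode(lehmer_encode(deck[::-1]), 52) == deck[::-1]
```

So any integer up to 52! - 1 (about 2^225.6) round-trips through a deck order, which is exactly the "225+ bits in a deck" claim.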
I found new joy in shuffling a deck of cards, after learning that every (proper) shuffle that every human's ever done has almost certainly produced a deck order that nobody's ever held before.
edit: I just remembered a guy who made a javascript demo that encodes small strings into a card deck order: https://jerde.net/peter/cards/cards.html (explanation page linked)
It's crazy that events involving such huge numbers have sensible probabilities at all, but surprisingly often the (enormous) numerator and denominator are of comparable size, so the ratio comes out ordinary despite the magnitudes involved.
I used this fact in an interview ages ago. The interviewer wanted a function, in Java, that shuffled a deck of cards such that every permutation was equally likely. I pointed out this was not possible using the standard library random functions, since the seed is a long (akshually... java.util.Random only uses 48 bits of it), which is far too few to reach all 52! permutations.
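The interview question was in Java, but the fix sketches the same way in Python: drive a Fisher-Yates shuffle from the OS CSPRNG instead of a 48-bit-seeded PRNG (in Java that would mean SecureRandom). The function name here is my own illustration.

```python
import secrets

def shuffle_uniform(deck):
    """Fisher-Yates shuffle driven by the OS CSPRNG (secrets),
    which isn't limited to 2**48 possible output orderings the way a
    48-bit-seeded PRNG like java.util.Random is."""
    deck = list(deck)
    for i in range(len(deck) - 1, 0, -1):
        j = secrets.randbelow(i + 1)  # uniform on [0, i]
        deck[i], deck[j] = deck[j], deck[i]
    return deck

print(shuffle_uniform(range(52)))
```

A PRNG with only 2^48 seeds can emit at most 2^48 distinct shuffles, a vanishing fraction of the 52! ≈ 2^225.6 possible orderings, so "every permutation equally likely" is unreachable no matter how the shuffle itself is written.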
Since log(xy) = log(x) + log(y), you can simply calculate sum(log2(i) for i in range(1, 53)) :) This might be a nicer formulation for when you don't have arbitrary-precision support.
https://en.wikipedia.org/wiki/Lehmer_code
The python library 'permutation' has some functionality around this that's fun to play with:
https://permutation.readthedocs.io/en/stable/#permutation.Pe...