That's a low-pass filter. And I suppose for the kind of highly resilient, slow-changing signals OP mentioned in a different comment, it makes sense to do that and even to refer to it as "debouncing". But for general consumer electronic devices with basic local buttons, the standard algorithm is something more like: act on the first close, then ignore the next 10 samples. When there is no other reason for a switch to read as closed, the first close means the button has definitely been pushed. The point of debouncing is to avoid treating the open-close-open-close oscillation afterwards (the contacts settling) as its own series of release-press combos.
Sure, but why does it feel like we're going around in circles here? Labeling that an "asymmetric algorithm" is technically correct, but it implies it's some rare special case, whereas it's literally just the common debouncing technique, at least as of when I was learning embedded development ~20 years ago.
And there's no reason for a keyboard to be using anything different. As I said, the real delay factor with a keyboard is matrix scanning. If a keyboard has 30ms of latency to register a keypress, I would guess that a ~400Hz (sqrt(104) -> 11 columns?) scanning frequency was as good as whatever early cheap USB microcontroller they used could handle, and its designers figured that was good enough for productivity use.