But now, a trick, to make browsers even angrier (which is starting to come up as a use-case in, e.g., forum signatures and avatars):
1. set the GIF to a non-native scale (e.g. width="100" height="100");
2. apply a CSS animation to it that loops a filter() value on the GIF. For example, cycle filter: hue-rotate() through a full 360-degree rotation every 2s. For maximum pain, add a filter: blur() as well. (Warning: this last one may actually crash your browser, or at least cause severe artifacting on random parts of the page. I've been intending to report it to Chrome's bug-tracker for a while...)
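Concretely, the setup looks something like this (a minimal sketch; the class name, GIF filename, and the specific hue-rotate/blur values are all just placeholder examples):

```html
<!-- GIF forced to a non-native size -->
<img class="angry" src="anim.gif" width="100" height="100">

<style>
/* Loop a filter through a full hue cycle every 2s.
   The blur() is the part that can crash/artifact in Chrome. */
@keyframes spin-hue {
  from { filter: hue-rotate(0deg)   blur(2px); }
  to   { filter: hue-rotate(360deg) blur(2px); }
}
.angry {
  animation: spin-hue 2s linear infinite;
}
</style>
```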
I believe this causes as much havoc as it does because GIFs are always decoded on the CPU, and the results of decoding GIF frames and scaling them to display size are never cached; instead, each time a GIF transitions to a new frame, the browser writes the frame delta directly on top of the previous frame's contents in video memory. So when you apply a GPU-bound filter to a CPU-bound, write-heavy texture, you hand the GPU's pipeline a pessimal case.