Yes I would have put it in the original (pre-edit), but better late than never :-)
Yeah, I've never done GPU driver hacking so I can't answer your question with optimal specificity, but there's always something new. It's usually (but not always) backwards compatible, but Apple is a special case since they are their only supported customer. That means they can iterate very fast with flagrant disregard for backwards compatibility (I don't necessarily mean that negatively; it can be great for development/progress), and it means that any changes they make have to be reverse engineered by setting debug breakpoints, examining state, signals, etc., to try to figure out what they changed and how it works now. It's a truly monumental undertaking.
> That means they can iterate very fast with flagrant disregard for backwards compatibility
In practice though, Apple is famous for not replacing everything left and right. IIRC, the UART part of their SoCs dates back to the first generation of iPhones, if not even earlier.
Good point. Either other people missed it like I did, or I'm wrong about why the downvotes are happening. (For completeness, there's a third possibility: it was edited in later, but I'm not alleging that in any way.)