The first conformant M1 GPU driver (rosenzweig.io)
1569 points by todsacerdoti on Aug 22, 2023 | 657 comments


I wonder if support for OpenGL, Vulkan, etc will improve now that Apple is partnering with nVidia, Adobe, Autodesk, Microsoft, etc around the OpenUSD rendering/animation/CAD/3D-scene format?

Considering the whole schtick of OpenUSD is "one file format that renders consistently everywhere" (paraphrasing), I would be surprised if Apple doesn't use it as a means to cement more 3D software vendors into macOS land. It's really hard to render consistently if the underlying drivers are all wonk and proprietary.

I am curious to see how this plays out. In my mind, there are two options:

1. Apple conforms to the existing standards of OpenGL and Vulkan we see gaining steam for many film and game production pipelines.

2. Apple tries to throw its weight around and force devs to support their Metal standards even more, ultimately hoping to force the world onto Metal + macOS.

My heart hopes for option 1, but my gut tells me Apple is going to push for option 2 with all the might it can muster. In my experience, Apple doesn't like any standards it doesn't control with an iron fist (not really saying much about Apple here though... nVidia, Autodesk, Adobe, and Microsoft are all the same).

The next couple of years are going to be interesting for sure!


> In my experience, Apple doesn't like any standards it doesn't control with an iron fist

I would add some nuance to this statement: "Apple likes open standards when it is weak."

The iMac and early OS X went big on open standards, and Jobs made a point of highlighting it: USB for the iMac; JPEG, MPEG, MP3, PostScript, etc. for OS X; TCP/IP built in. They even paid the danegeld for .rtf.

Then as they clawed their way back from the precipice, they started "adding value" again.

The iPhone was an HTML device, loudly repudiating the proprietary (and terrible) Flash, to say nothing of the crappy, mostly stillborn "mobile HTML" attempts.

You still get H.264, Matter/Thread, and other standards they don't control, where they don't have market power.


You might actually expand "Apple likes open standards when it is weak" to "companies like open standards when they are weak."

Generally speaking you get standards consortiums when there is a clear winner that is mopping up the space.

Here's an example that's happening right now: Nvidia-NvLink-Infiniband.

Nvidia owns the high-speed interconnect inside the chassis (HGX), the NICs (Mellanox), the inter-host interconnect (Infiniband), the high performance inter-host interconnect (switched NvLink), and the ethernet network (Mellanox has the same 51.2Tbps switch performance that everyone else has now). GPU training is RDMA heavy and this is a place where both NvLink and Infiniband shine, ethernet much less so. Retransmissions are very bad, in global-performance-terms, for ROCEv2 transfers. Right now Nvidia is just crushing it and there's zero chance anyone is going to catch up by introducing new Infiniband ASICs.

So what happens? You have a consortium spun up by all of the companies in the Ethernet space - Ultra Ethernet Consortium - to try and use "standards" to push back on customers who don't want to make big investments in "non-standard." UEC is pretty vague but seems to be promising Broadcom-style cellized fabrics, the whole point of which is to have an ethernet-like standard that avoids ECMP-induced flow collisions and retransmissions - that is, get Ethernet into the same territory as Infiniband.

If you look back in time in the tech industry, you see this over and over and over and over. Standards are great, they make possible certain kinds of multi-sided markets and markets that need broad participation to be viable - but they are also routinely about the losers joining together to compete.


I'd even bend that quote to say "Given a problem that maximizes their profit if solved, it's an acceptable compromise for companies to resort to solutions they don't control, if they don't have the resources to create competing solutions they would control."


> they are also routinely about the losers joining together to compete.

That is the greatness of it. It reminds me of democracy: The less powerful join together to give everyone an equal vote, rather than having one vote per dollar.


In this case though dollars work just fine - the incumbent who probably invented the market now gets smaller companies banding together against it after a while.


Same for adaptive sync. Gsync was first to market by a country mile, the adaptive sync standard wasn’t even approved until like 6 months after the first gsync stuff showed up and if nvidia hadn’t gone first it would have taken much longer if it happened at all.

Freesync wasn’t really market-ready until at least 2015 and the early products were mostly junk; it wasn’t until the G-Sync Compatible program that any vendors really cared about LFC or flickering issues.


> Right now Nvidia is just crushing it and there's zero chance anyone is going to catch up by introducing new Infiniband ASICs.

I mean, Cornelis Networks is trying with the resurrected OmniPath. I hope they pull it off, but I'm not holding my breath.

See: https://www.cornelisnetworks.com/products/


It's not going to work because they are too late.

HPC cluster builds are complex enough due to the presence of multiple networks (2x moderate-scale infiniband chassis and 2x ethernet chassis as a minimum) without introducing unknown vendors. At that point, if you're doing IB, why not just go Mellanox, since you will almost certainly buy 200GE ConnectX NICs and not the 100G NICs from these guys.

UEC will - like most standards - _eventually_ work. In the mean time, Nvidia pods are the obvious choice for anyone who really cares about performance, and other vendors (Cisco, Arista) if they don't.


Idk, I personally did it for a HPC cluster two years ago. 2x100GbE + OmniPath was a sensible way to reduce cost, especially as the cluster was very light on GPU power and mostly focussed on CPU-bound jobs. Last I heard everyone there is still very happy with what we built.


Was that before Intel spun it out? I can see people being willing to do that build if Intel was seemingly on board. Today things are pretty different.

Modern HPC mostly means GPU compute and tons of data shuffling, but point taken. CPU bound jobs aren't going to stress the i/o, so you probably could have done 100GE for less. I'm curious what you did for storage but I'm guessing with CPU-bound that is again much less of an issue.


While we’re over generalizing… What could be said about grand unification vs the march of progress? Can the laws of physics change? Must we reinvent arithmetic?

One of my greatest joys learning to program is reading old (well written) code, and piecing apart the timeless from the legacy.


> losers joining together to compete

You are saying it like it is something bad. Competition is good for consumers, and I cheer any means to spin up, sustain, and heat up competition.


Or wait for the incumbent to get big, fat, and lazy before beating them using a 10x faster, dumb, "unsuitable" interconnect loaded to only half its theoretical bandwidth.


>The iPhone was an HTML device, loudly repudiating the proprietary (and terrible) Flash, to say nothing of the crappy, mostly stillborn "mobile HTML" attempts.

Skipping Flash wasn't so much an ideological decision as a practical one.

At the time Steve Jobs listed a ton of reasons that they didn't implement Flash. Listed among them were concerns about it not being an open standard, inferiority to H.264, security and performance issues, etc. However, all of these things could've been ignored or overcome.

The principal problem was that a huge proportion of Flash applications, games, and websites used mouseovers as crucial methods of interactions, and Apple simply had no way to allow users to mouseover an element on a touchscreen.


That could have been overcome. The general crustiness of flash could not have been (from Apple's POV).

Apple used to ship a dev app called "Spin Control" that would log stack traces whenever an app failed to drain its event queue in a timely manner (i.e. beachball). One time I accidentally left this open for an entire week, went about a bunch of assorted business, and when I came back every single stack trace had to do with flash, and there were many. Either flash in a browser or flash in a browser embedded in something else (ads embedded in 404 pages for broken help pages that were never even displayed, lol). At first I thought it had to be a mistake, a filter I had forgotten about or something, so I triggered a spin in Mail.app by asking it to reindex and sure enough that showed up as the first non-Flash entry in Spin Control.

As hard as it was to believe: Flash had been responsible for every single beachball that week. Yikes.


I was mistaken, I found a copy of Jobs' letter re:Flash and he does cite the proprietary nature of Flash and Apple's lack of control over the content served on its platform as the most important reason Flash was kept off the iPhone.

https://www.cnet.com/culture/steve-jobs-letter-explaining-ap...

  As hard as it was to believe: Flash had been responsible for every single beachball that week. Yikes.
I used OSX and Flash in the late 2000s so I have no problem believing that.


> This becomes even worse if the third party is supplying a cross platform development tool. The third party may not adopt enhancements from one platform unless they are available on all of their supported platforms. Hence developers only have access to the lowest common denominator set of features. Again, we cannot accept an outcome where developers are blocked from using our innovations and enhancements because they are not available on our competitor's platforms

Funny how one could and maybe should point the same criticism towards electron.


Better than OS 9 and Flash when it would bring down the whole system!


Pre OSX Macs I found to be more unstable than Win95 osr2. Apple really needed OSX!


Mostly thanks to flash!

Even in OS X it was still a bad plugin, one of Snow Leopard's headline features was that plugins were moved to separate processes so that Safari could keep running instead of bringing down the whole browser when a plugin had a problem.


If memory serves me right, that was one of the big raisons d'etre for Chrome (released in 2008)... so Snow Leopard (2009) was catching up with it more than solving a Mac specific issue. But yes, back then the big big plugin was Flash, and its security and performance left a lot to be desired (in all platforms).


I wasn't even using flash. It was Photoshop. When this bogged down from switching between applications it would just shit the bed.


Back in the day there was even a Mac browser plugin called "Click to Flash" that would prevent Flash from loading in web pages and draining battery like crazy. It made the web livable and more secure by blocking the Flash garbage ads everywhere.

Ah, the dev page for these: https://hoyois.github.io/safariextensions/clicktoplugin/

People also don't remember the huge number of CVEs that Flash had: https://www.cvedetails.com/version-list/53/6761/1/Adobe-Flas...


There were also browser extensions that'd replace YouTube's terrible flash player with an HTML5 h.264 player which was a godsend on single-core PPC G4/G5 machines, where flash would happily keep the CPU pegged for no good reason.


Yeah Flash was slow, but the common alternatives for in-browser animation/games were (and still are) far slower. H.264/5 maybe did video better than Flash, even then idk, Flash YouTube was always way faster on my old iMac than HTML5 YT. Google Hangouts/Meet will lag a high-end Intel MBP thanks to the unaccelerated VP8/9 video, but AIM Flash-based video calls ran fine on an iMac G4. On top of that, there was never a real replacement to the Flash creation software. All those Flash games that people made without really understanding how to code, no longer doable.

I guess Flash had to die because of how outdated, insecure, and proprietary it was. It did seem like a nightmare to support well on mobile. Just wish someone made something strictly better.


Perhaps it was fast due to lack of tight security policies etc. These days, everything has to be containerized or be in a sandbox. That adds layers of overhead. Probably also why Google killed NaCl (Native Clients for Google Chrome). It was loosely in the same space as Flash and ActiveX - anyone remember that?


I've thought about that too. Can only guess about it myself. Old software tends to be faster just because it has to be, and there isn't always a compromise other than time-to-implement.

Not sure how ActiveX's sandboxing worked, but I'll bet it was even less than Flash, since it was running actual x86 code.


ActiveX required that the ActiveX code you wanted to run be already installed on your computer. The security mechanism was that users obviously wouldn't install insecure stuff...

Turns out nearly every extension had gaping security holes.


Not quite. ActiveX would download and run unsandboxed native code direct from the web, even if it wasn't installed. The security mechanism was a popup dialog that contained the publisher name as verified by a code signing certificate.


This is how I remember it. I was dealing with early 2000s smart home software that used ActiveX. I'd visit a website in IE, press ok, and run their "web app" that had raw access to my serial and ethernet ports. It was bizarre even for back then.


Not hard to believe. Reliability was one of the points in Steve Jobs' _Thoughts on Flash_:

> We also know first hand that Flash is the number one reason Macs crash. We have been working with Adobe to fix these problems, but they have persisted for several years now. We don’t want to reduce the reliability and security of our iPhones, iPods and iPads by adding Flash.


Yeesh, thank you for the reminder about how much the flash-based web sucked. It's easy to forget.


Part of it sucked, part of it was great. On Mac-side the experience was worse than on Windows, that's for sure.

But Flash also allowed many games to be easily developed and played in the browser. Lots of fun cartoons were made (the Metallica cartoons "fire bad" on Newgrounds come to my mind now).

It's a shame Flash sucked so much on Mac, since the developer behind Flash [0] did create some nice games on the Mac early in his career, namely Airborne! and Dark Castle.

---

[0]: https://en.wikipedia.org/wiki/Jonathan_Gay


I had no idea! Those games were amazing.


My fuzzy recollection was that the OS X version of Flash was much worse than the Windows version. Given "Thoughts on Flash" and the direction of the iPhone, this ended up being very expensive for Adobe :)


I always thought Flash was unusably slow, until I tried it on Windows. On Windows Flash was fast and fluent, on the Mac it was choppy and unbearably slow.

Since Flash was so ubiquitous on the web, this made Macs a lot slower for many tasks than Windows computers. No matter how much Steve Jobs touted the power of the G4, and boasted about the speed of Final Cut, nobody would believe it when their Mac couldn't even run a space invaders clone in IE4 fluently.

Adobe ignored performance on Apple devices for years. There was no way in hell Apple would allow Adobe to do the same to iOS.


> Flash had been responsible

Was it just Flash or low skilled web developers using Flash the wrong way, adding 10 Flash advertisements on a page and so on?


Spin Control lives on as part of Instruments, which can be configured to collect a tailspin whenever an app hangs.


I don't think that was the deal-breaker. Jobs didn't want Flash to become the de facto development environment for the phone.

I'm pretty sure he always knew they'd end up with apps on it, they just didn't pull that together for the first release, thus the HTML song and dance. But if they supported flash, that would reduce a lot of the demand for apps later, and worse - it would be cross platform.

So he used the other (still good) reasons - battery life, security, etc. to obscure the real reason - Apple was not yet ready to compete with it on their own terms, so they banned it.


Apple did find a solution in mobile Safari for touchscreen hover states on the Web. However the Web platform generally offers more affordances for accessibility than Flash ever did, which I'm sure helps.


If I'm not mistaken the solution was to make touching an element trigger the :hover state and the click action unless the :hover state changed the visibility of another element. If the :hover state changed the visibility of another element, then the click action was not triggered until a user tapped again.

This is possible in HTML because it's trivial to determine whether or not a :hover changes display or visibility properties of other elements. As you've supposed, Flash did not afford browsers with that sort of ability.
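
For anyone curious what that heuristic looks like in code, here is a rough TypeScript sketch. It is not Safari's actual implementation (WebKit flips :hover internally, which page script cannot do; a synthetic mouseover only reaches script-driven menus), just the "swallow the first tap's click if hovering revealed new content" idea, with pendingHoverTarget and visibleElements being names made up for the sketch:

  let pendingHoverTarget: Element | null = null;

  // Snapshot which elements are currently rendered visible.
  function visibleElements(root: Element): Set<Element> {
    const visible = new Set<Element>();
    for (const el of Array.from(root.querySelectorAll("*"))) {
      const style = getComputedStyle(el);
      if (style.display !== "none" && style.visibility !== "hidden") visible.add(el);
    }
    return visible;
  }

  document.addEventListener("touchstart", (event) => {
    const target = event.target as Element | null;
    if (!target) return;
    if (target === pendingHoverTarget) {
      pendingHoverTarget = null; // second tap on the same element: let the click through
      return;
    }
    const before = visibleElements(document.body);
    // A synthetic mouseover lets script-driven menus react; real CSS :hover rules
    // are applied by the engine and are not triggered by this event.
    target.dispatchEvent(new MouseEvent("mouseover", { bubbles: true }));
    const after = visibleElements(document.body);
    for (const el of after) {
      if (!before.has(el)) {
        event.preventDefault(); // "hovering" revealed new content: suppress this tap's click
        pendingHoverTarget = target;
        return;
      }
    }
  }, { passive: false }); // passive must be false so preventDefault works on touchstart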


> more affordances for accessibility than Flash ever did

TBF one thing Flash did manage to achieve was the proliferation of web sites and apps that were as hard to use for people without protected disabilities as for people who did have them.

Whether this increased empathy for people with disabilities is an open question


>The principal problem was that a huge proportion of Flash applications, games, and websites used mouseovers as crucial methods of interactions, and Apple simply had no way to allow users to mouseover an element on a touchscreen.

It's more about control and all the other reasons.

Mouseovers are a dead herring. They could still have allowed Flash, not to run existing stuff, but new apps that would take iOS into account.

And legacy flash movies and animations and games that just needed click and not mouseovers would still work. To be frank, mouseover interactions weren't really that big in many (most?) Flash games anyway.


Mouseovers were just the tip of the iceberg: fixed screen sizes and other desktop UI conventions, assumptions that you could just leave things constantly running rather than figure out how to do proper event-driven programming, etc. Yes, they could have tried to do a “Flash mobile” but most of the appeal was compatibility with the huge library of existing apps and users wouldn’t have been happy with that, while authors would have bristled at having to give up their favorite hacks. Flash was a fractal of bad programming and UI design, and there was no way to get that community to improve since Adobe was one of the worst offenders and didn’t care about their platform in any discernible way. People wrote Click-to-Flash plugins to keep it from crashing your browser, every browser had to change their architecture to handle Flash crashing, the plugin & authoring tools had tons of performance, stability, and security issues – and Adobe just kept snoozing, waking only enough to cash the royalty checks, and claim the next version would be totally better.

Another poster mentioned the hangs - which is very true - and I can say that the desktop Macs I supported had a high-90s percentage of crash logs showing Flash as the culprit, not to mention almost every CPU / battery life issue.

There were some great artists who produced neat work despite it but the only company to blame for Flash’s demise is Adobe. As a thought exercise, ask why Flash on Android consistently sucked – if they were trying to make the case that Apple should reconsider, they could have put at least one intern on making that seem appealing.


> Adobe was one of the worst offenders and didn’t care about their platform in any discernible way.

This is par for the course for Adobe. The other day I had occasion to try to fill out a PDF form using Acrobat Pro. I made it through about a page (painfully slowly) until I unwisely saved my work. Then I cursed for a bit, tried quitting and reloading, and eventually gave up and started over in PDF.js. Superior in every way.

I remember when a major selling point of Acrobat was that you could save a filled out form, whereas third party apps couldn’t. Apparently doing so still breaks the form, and third party apps have gotten it right for many years now.

Adobe seems to pretty much never care about their platform once they have market share.


I remember working on a project in 2010. We hit multiple crashing bugs in Flash (both runtime and IDE) doing fairly basic stuff. I figured we’d paid for support, might as well use it, and reported them to Adobe.

Literally never heard back until a year later when they closed the tickets saying it might be fixed in the next release and we should buy licenses to find out (spoiler: no).


FYI, the idiom is "red herring"


I remember my old Samsung Galaxy Note (or was it the Note 3?) could actually sense when your finger was near the screen but not touching, for mouseover events.

I always wondered why that feature didn't continue - I remember it working quite well, but IIRC it only worked in Chrome browser.


Also, from a practical perspective, even Adobe never had a fully working (feature parity to desktop) way to actually load Flash on the iPhone. Apple kept asking for one: Adobe could never produce something that wasn’t buggy crap.

There were some 3rd party things that sorta worked a bit, but they were not good either.

Flash was bad on touchscreens for sure, but we’d have seen content adapt eventually anyhow, if it had actually ever worked in the first place.


Very true. In case anyone is too young to remember, in the days of the iPhone 3G, it was somewhat popular to buy apps which would use a real browser on a server somewhere to render a browser session, complete with Flash, and stream it to you. It was very handy for those of us who played Flash games and needed to check in on our game on the go. (Think FarmVille, Cityville, etc.)


That, and even if you somehow fixed that, Flash on mobile sucked. I had Flash on my Windows Mobile handset. It was horrible!


> "Apple likes open standards when it is weak."

Everyone does. AMD is the same. The market leader focuses on features, the runners up try to take them down with openness. The competition is good for consumers, but the motivation is one of self-interest, not the common good.


    Then as they clawed their way back from the precipice, 
    they started "adding value" again.
I don't know if I can agree. On the software side, MacOS supports all those things. On the hardware side, it's still (edit: almost) nothing but industry-standard ports.


One thing that I've enjoyed is how for decades Mail.app has been first and foremost a generic IMAP mail client, with features specific to iCloud Mail being few and far between. It works exactly as well with e.g. Fastmail as it does with iCloud Mail. Conversely, iCloud Mail is just plain old IMAP and works fine with other generic IMAP clients.

Compare this with Gmail for example, which has a crummy IMAP implementation, and though the Gmail mobile app supports adding non-Google IMAP accounts, it clearly prioritizes being a Gmail client.


I didn't say they necessarily denigrated the open formats but added and preferred their own proprietary image, audio etc formats as they gained market strength.

On the hardware side I'm delighted by Apple's USB-C/TB push (and Apple contributed a lot to those standards, esp. based on what they had learned with Lightning) but note they revived the proprietary "MagSafe" connector on recent laptops (though you can still use USB-C PD). Apparently enough customers wanted it.

And as others have pointed out, Apple is hardly alone in this


> Apparently enough customers wanted it.

Magsafe is genuinely a nice innovation that users missed. It solves the cord yank problem. But I do like having the option to use USB-C if I don't have the MagSafe cable.


I'm the other way around: I just want to carry a USB-C cable or two. The cable was designed so wear and damage should accrue to the cable, not the connector in your device.


The current computers are the best of both worlds in that respect :)


Also USB-C absolutely sucks. The socket seems to attract dirt and dust in a way that prevents its proper working like no other.


I have lots of USB-C devices and never had that problem.

I'm careful with my stuff though.


Phones are very sensitive to it as they are in a pocket.

Pocket lint -> gets pushed in by the USB cable -> after some years the connector won't connect anymore


Had lots of phones, always in my pocket, never had that problem.

Actually I just realized I put it in my pocket top-end down. So the usb socket is facing the sky. I suppose that might keep the usb socket further from pocket lint.


The only port I’ve ever experienced this problem with is Lightning.

At least on many iPhone versions, if anything damages one of the delicate lead springs inside the port, Apple service will tell you to replace the entire phone.


> Apparently enough customers wanted it.

This magnetic thing saved my macbook from numerous falls when people tripped on the power cord.

It was ubiquitous, worked across the whole lineup and for several generations. It is hard to forget that until usb-c, it was commonplace for a manufacturer to have a wide range of power adapters, of varying voltage, power, connector, etc. Apple do their own shit, but they do it consistently.


That's true. I forgot they brought Magsafe back. (Side note: I don't know why they couldn't just make a standard USB-C connector, except magnetic...)


Amusingly the new MagSafe is based on USB-PD I believe, just in a different connector form factor. (The old was 1-wire I think?)


About your side note: That seems physically impossible. Magsafe only has five pins, and is larger than USB-C, which has twenty-four pins. To me, USB-C is a modern miracle. It is so small, reasonably strong, reversible, and has very high pin density.


Some of the new MacBooks require a charging port capable of greater than the current USB-C specification limit of 100W, that's likely one reason why they decided against coopting USB-C exclusively for charging.


USB-C now allows 240W


I wasn't aware, thanks!


iChat stored every conversation (if you enabled history) in HTML files the user could find in their Documents folder and read. Messages history is stored in an undocumented SQLite database that you’re not meant to touch. I’m not saying they’re completely proprietary in every way, but you can see the progression from wide open to “keep your hands off.”

For another example, Apple Notes was actually just mail messages stored in IMAP, until they decided to deprecate that in favor of another iCloud-backed black box in order to add more features.


Html as a database is unacceptable though.

Emails as notes sounds pretty clever though.


IMAP notes still works.


It works about as well as it always did, which was not very well.


Right, I quite often charge my Asus laptop with my MagSafe charger.


I think you are joking, but if you want to, you can buy an adapter for like €10 that will go from magsafe to USB-C, and then indeed charge your non-Apple laptop with it.


I charge my Mac with USB-C. Your Asus has that, right? But yeah, I forgot they brought MagSafe back. Thanks, corrected.


> I would add some nuance to this statement: "Apple likes open standards when it is weak."

> [...]

> Then as they clawed their way back from the precipice, they started "adding value" again.

Just four words: embrace, extend, and extinguish

> https://en.wikipedia.org/wiki/Embrace,_extend,_and_extinguis...


Or even “Apple doesn’t like open standards that are weak”. OpenGL was just no longer fit for purpose. They were competing against DirectX, needed to do ML acceleration, and at the time Vulkan didn’t exist. They had to do something, and especially given their chip strategy that must have been in full swing at the same time, Metal must have been a no-brainer.


NeXT had RTF support. It was used everywhere including NeXTmail


Big tangent, but can someone tell me why matter/threads doesn’t suck ass?

I have one of these devices and frankly I have no idea what I expect it to be doing.


> In my experience, Apple doesn't like any standards it doesn't control with an iron fist

Apple supports a number of open standards. I think I’d modify your statement to say Apple doesn’t want to depend on any standards it doesn’t control. And while that may appear nefarious, I get Apple’s implied position there. They have really tight coupling between hardware and software in order to deliver on the UX that they intend (whether or not you like all or some of it). If they’re designing their own software and hardware, I can see why they’d also want to implement standards that they can control to some degree - otherwise their UX is dependent on others. This is also why I think Apple sometimes implements new industry standards, USB-C being one example - if no one else has made an effort yet, they can influence the direction by being first movers.


OpenUSD is a way of bundling and specifying the data to be rendered. It doesn't relate to which API the app uses to accelerate rendering. Adobe, Autodesk, Blender, and most others support different backends per operating system including Metal on macOS already.


Yeah, I know it's just a file format for scene description.

I still believe rendering would work better for the industry as a whole if they could agree to all support, say, Vulkan.

I doubt it'll ever happen, but a consistent graphics driver/API standard across operating systems would be pretty rad.


While not at the kernel level I think WebGPU is probably better than something like Vulkan for that "I can target this low level access to the GPU resources but expect it to work everywhere" use case. Ignoring the name hint that it was designed for the web (it works fine outside the browser), it's a lot more portable than Vulkan across its various implementations precisely because it needed to work on various devices that could browse the web. It's also already backed and supported by all the big players, including Apple.


Out of idle curiosity I thought I'd give it a try. Maybe run a demo on the REPL in Clojure. I tried to Google "Java WebGPU" and got no real results.

I could be wrong, but it seems it's technically possible to use it outside the web; the ecosystem looks half-baked, though.


WebGPU, unsurprisingly, is much better supported by the web ecosystem right now (just like WASM in the beginning), so you'd have better luck trying to use ClojureScript (as then you can just use the JS interop) for playing around with it. https://codelabs.developers.google.com/your-first-webgpu-app would be easy to get started with and translated to ClojureScript cleanly without issues.

Also, keep in mind that WebGPU has only been around since 2021, and wasn't actually enabled in Chrome until this (2023) spring, so it's relatively new tech; it's not surprising that the ecosystem still feels half-baked.
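
For reference, the JS-side setup that codelab walks through boils down to something like this (a TypeScript sketch, assuming a browser that exposes navigator.gpu and the @webgpu/types definitions; the ClojureScript version is a direct interop translation):

  // Request an adapter/device, attach WebGPU to a canvas, and clear it once.
  async function initWebGPU(canvas: HTMLCanvasElement): Promise<void> {
    const adapter = await navigator.gpu.requestAdapter();
    if (!adapter) throw new Error("WebGPU not available");
    const device = await adapter.requestDevice();

    const context = canvas.getContext("webgpu") as GPUCanvasContext;
    const format = navigator.gpu.getPreferredCanvasFormat();
    context.configure({ device, format });

    // A single render pass that just clears the canvas to a colour.
    const encoder = device.createCommandEncoder();
    const pass = encoder.beginRenderPass({
      colorAttachments: [{
        view: context.getCurrentTexture().createView(),
        loadOp: "clear",
        clearValue: { r: 0.0, g: 0.0, b: 0.2, a: 1.0 },
        storeOp: "store",
      }],
    });
    pass.end();
    device.queue.submit([encoder.finish()]);
  }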


I'm a big fan of webGPU. I use it within my Three.js projects. Really happy to see it gaining support worldwide. I hope either it or similar projects continue to gain traction!


Apple is definitely not going back to OpenGL. I can't say I'm particularly sad, OpenGL is very long in the tooth and writing a good driver for it seems like a nightmare, considering how hard it is to even write good performant application OpenGL code. I wish they had thrown in behind Vulkan instead of creating Metal, but it seems like outside of linux vulkan is a second class citizen everywhere (although it's a pretty good second class citizen to target).

In terms of supporting games or steam and such, I think the reality is that a large segment of games now use engines that handle the API stuff for you, and if you do have the resources/time/inclination to write directly to the API, you're probably ok with MoltenVK as long as you're not doing anything too cutting edge.

Seriously though, while I've written OpenGL most of my life to be cross platform and support the most I can, it's an absolutely TERRIBLE api. Global state everywhere, tons of functions you can-but-shouldn't-use, all sorts of traps everywhere, monstrous header files for extensions, incredibly hard to debug. Vulkan is verbose but in a lot of ways it's actually easier, even if it's advertised as being the more hardcore way to do things.


    In my experience, Apple doesn't like any standards it 
    doesn't control with an iron fist
I mean, on one hand, the Metal/Vulkan/OpenGL situation is unfortunate and I don't understand Apple's motivation there.

On the other hand I'm sitting here typing on a Mac with nothing but USB-C ports that is connected to half a dozen peripherals and a big ol' monitor over those standard ports using standard protocols.

In general I feel that Apple prefers open standards when they actually suffice. USB2 couldn't do a lot of the things that Lightning did, so they invented that. Now that USB-C exists, they embraced it immediately on the Mac and iPad but are unfortunately dragging their feet w.r.t. the iPhone.


> I mean, on one hand, the Metal/Vulkan/OpenGL situation is unfortunate and I don't understand Apple's motivation there.

OpenGL is a dead end - while vendors (including Apple) still support their respective GL stacks, there is not really active investment anymore either in those stacks or in the standards.

Metal came out years before Vulkan, and Apple has tight integration between the graphics API and their underlying first-party graphics hardware designs. If Apple did have first party support for Vulkan, it would basically be MoltenVK anyway. Apple tends to push anything which isn't a first party framework to be third-party sourced and bundled as much as they can. They likely think MoltenVK as a third party library is the best scenario for Vulkan support.


"Years" is overstating it a bit; Vulkan's SDK came out a little more than a year and a half after Metal was available on iOS and 8 months after it was available on Macs.


Ahh you're right, I thought it was 2017 for the Vulkan 1.0 release.

My understanding is still that Metal was released on iOS months before AMD's Mantle API was accepted as the start of Vulkan work.


> the Metal/Vulkan/OpenGL situation is unfortunate and I don't understand Apple's motivation there.

I don't think it's hard to understand. Apple wanted an easy to use API that could be extended and updated easily. Vulkan is an extremely complex API that foregoes any and all developer ergonomics to facilitate quick driver development on as many hardware targets as possible. Consequently, Vulkan's design choices are driven by the least common denominator. The goals are just too different.

There is at least some evidence that Apple was interested in supporting and shaping Vulkan (they were a member of the initial working group), but I suspect that it very quickly became clear that the committee was going in a direction they were not interested in, so they opted out.

Still, I don't think it's correct that Apple is opposing any kind of standardisation in this domain, they just want something more aligned with their vision. They have been very active with WebGPU, which is shaping up to be a very nice API, and it has inherited a lot of good design decisions from Metal.


Indeed Apple has been very active in shaping WebGPU, and that is why we can't use a bytecode representation for shaders. Instead, we have to repeat OpenGL's mistakes and store shaders as strings.

WebGPU specifically has to be the lowest common denominator among the APIs it targets. And there are several features very useful for GPGPU which are supported in Vulkan and CUDA, but cannot be included in WebGPU due to the lack of Metal support. One such example is floating point atomics.
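
Concretely, the "shaders as strings" point looks like this: WGSL source is handed to the API as text and compiled at runtime, much as GLSL was (a small TypeScript sketch, assuming @webgpu/types and an already-created GPUDevice):

  // WGSL travels as a source string, not bytecode; the browser compiles it at runtime.
  const shaderSource = /* wgsl */ `
    @vertex
    fn vs_main(@builtin(vertex_index) i: u32) -> @builtin(position) vec4f {
      var positions = array<vec2f, 3>(
        vec2f(0.0, 0.5), vec2f(-0.5, -0.5), vec2f(0.5, -0.5));
      return vec4f(positions[i], 0.0, 1.0);
    }

    @fragment
    fn fs_main() -> @location(0) vec4f {
      return vec4f(1.0, 0.5, 0.0, 1.0);
    }
  `;

  function makeShaderModule(device: GPUDevice): GPUShaderModule {
    // The rough equivalent of glShaderSource + glCompileShader, minus the global state.
    return device.createShaderModule({ code: shaderSource });
  }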


I am sure that a bytecode representation of shaders will come in a future revision. It’s not a first priority. SPIR-V is a poor choice for portable shaders for reasons outlined by the WebGPU group on multiple occasions. And the WebGPU shading language finally does away with some poor legacy design choices that are still stuck in GLSL and HLSL (such as shader bindings via global variables).

Regarding floating point atomics, I think you got it confused? Metal fully supports floating point atomics on all recent devices, while in Vulkan it’s an optional extension. According to the gpuinfo database only 30% of implementations support float atomics, and only 10% support more advanced operations like min/max. If you are looking for cross-platform float atomics support, Apple is the least of your worries (what they suck at are 64-bit int atomics though).


>On the other hand I'm sitting here typing on a Mac with nothing but USB-C ports that is connected to half a dozen peripherals and a big ol' monitor over those standard ports using standard protocols.

cool, now try two monitors https://www.kensington.com/en-au/news-index---blogs--press-c...


I have two 4K 60Hz monitors plugged in my work MBP (M1 Pro, or is it Max?) via DP over USB-C (not TB) basically every day. The MagSafe and HDMI ports sit unused, and I wish these were more USB-C ports instead.

My personal Mini M1 can't handle two DP over USB-C displays but can handle one DP over USB-C + one USB-C to HDMI. I also wish the two USB-A ports were USB-C as well.


MST has been a dead letter since day 1 because nobody puts a more expensive monitor controller board in a monitor than it actually needs.

4K60 monitor? cool, it gets DP1.2. So MST means dropping to 1440p or 1080p resolution (splitting DP1.2 across two monitors).

crappy dell office monitor? it gets DP1.1, so MST means dropping to 720p or 540p.

absolutely no company in their right mind is going to haul off and put a DP1.4 in a bottom-shelf monitor or whatever, such that MST actually had extra resolution to play with. So the only places it matters are (a) stocktrader type people who want 8 monitors and don't care about visual quality/running non-native resolution, and (b) office situations and other places where the ability to run a single cable is more important than visual quality.
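
(Rough numbers behind the 4K60-on-DP1.2 point above, a back-of-the-envelope sketch assuming 8-bit colour; exact figures depend on blanking:)

  \text{DP 1.2 effective: } 4 \times 5.4\,\text{Gbit/s} \times 0.8 \approx 17.3\,\text{Gbit/s}
  \text{One 4K60 stream: } 3840 \times 2160 \times 60 \times 24\,\text{bit} \approx 12\,\text{Gbit/s (more with blanking)}
  \text{Two 4K60: } \approx 25\,\text{Gbit/s} > 17.3\,\text{Gbit/s}, \text{ while two 1440p60: } \approx 12\,\text{Gbit/s} < 17.3\,\text{Gbit/s}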

so de-facto nobody has ever cared about MST, and docks fill this use-case much better. Thunderbolt/USB4 doesn't care about what monitor controller board is behind it. It just cares that you have one DP1.4 stream and the dock can allocate that into as many physical ports as the dock physically allows. Have the bandwidth but need more ports? Cool, just daisychain more docks/adapters.

(and this does work on m1 pro/max btw - this guy for example did eventually find a dock that worked for him.)

https://www.derekseaman.com/2021/11/my-journey-for-dual-disp...

--

the big gotcha with M1 is really that the base M1 is actually a crossover chip between a high-performance tablet and an ultraportable laptop, so Apple doesn't want to waste the space for multiple PHYs that won't be used. So it gets 1x HDMI PHY (normally used for the internal screen, on tablets/laptops) and 1x DP PHY for an external monitor. The Pro/Max support 2 and 3 external displays respectively.

I do agree this is a major limitation on the "just buy macbook air" approach, although you can use displaylink (video compression+USB+active adapter) or use an ultrawide monitor (it's a fast connection, you just only get 1 of them). In particular the 15" MBA really needed a "Pro" CPU option like the Mac Mini family, because that's actually a very nice ultrabook other than the single display, and I absolutely think the chassis is big enough to handle it for normal "interactive" use-cases (not bulk video processing/etc).

And of course the 13" MBP doesn't get one either but lol fuck that thing anyway, let the touch bar/old-style chassis die already please


> 1. Apple conforms to the existing standards of OpenGL and Vulkan we see gaining steam for many film and game production pipelines.

Read what gamedevs have to say about this, Metal is more appreciated than Vulkan

> 2. Apple tries to throw its weight around and force devs to support their Metal standards even more, ultimately hoping to force the world onto Metal + macOS.

Apple was part of the Vulkan working group; knowing what gamedevs prefer, it now makes sense why they parted ways and created Metal instead

In retrospect I can only show compassion to Apple, they made the right choice


I think the conversation is mixed here. Baldur’s Gate 3 was just released with first class Vulkan support. Steam is pushing hard with MoltenVK on macOS and native Vulkan drivers on the Steam Deck.

I agree there are likely a lot of game devs who like Metal, but it would appear there are a lot of heavy hitters backing Vulkan.

As well, in film, many render engines prefer Vulkan due to the flexibility to write complicated compute shaders with complex command buffers. I experienced this first hand working with VFX studios in my day job.

I think the story is mixed, there is a big interest in Vulkan still.


And Vulkan drivers are like 10-20% slower compared to DX11 on NV.


Do you mean engine developers who actually use the Metal API, or game developers writing shader code? I know game developers prefer HLSL (Direct3D) over GLSL, but I dunno what people think about MSL.


> I know game developers prefer HLSL (Direct3D) over GLSL

That's my sentiment as well


What gamedevs? Most games don't even release on Mac.


Possibly referring to iOS games using Metal.


Vulkan is way better than Metal. Metal is simpler to learn. That is about it.


Nonsense. Feature-wise, they are mostly equivalent (Metal has better support for GPU-driven pipelines and shader authoring). Metal is much simpler and more flexible. The only way Vulkan is "better" is if you measure lines of code.


Vulkan is surely more flexible. One example off the top of my head is mailbox presentation mode in Vulkan for minimal input delay without tearing.


Metal allows you to present surfaces at exact times. You have access to the display refresh timing information and it's your job to synchronise your drawing rate to (potentially variable) display presentation interval. Vulkan presentation modes are workarounds over the fact that Vulkan provides no fine-grained control over presentation intervals.

There is the VK_GOOGLE_display_timing extension that provides functionality similar to Metal, but it doesn't seem like it's well supported on desktop. The equivalent official extension seems to be stuck in limbo somewhere.


Sounds really fine-grained, but does this mean I have to invent my own "mailbox" every time I want "unlimited refresh rate with minimal input lag, but without tearing"?


I think it should be as easy as not presenting a drawable if you detect that the previous frame is still rendering. Should be solvable by adding a single conditional guard to the command buffer completion handler. Never did that myself as I don't have a use case for it, so I might be underestimating the challenge.

Note that the mailbox approach does not really give you unlimited refresh rate, as you are bound by the number of drawables/swapchain images your driver can supply. If your drawing is very fast these resources become the bottleneck. If you truly need unlimited framerate (e.g. for benchmarking) the best approach is probably to render to a texture and then blit the last one to a drawable for presentation. And if your goal is "minimal input lag", then you might as well do it right and decouple your simulation and rendering threads.
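
Not Metal, but the same guard idea can be sketched in WebGPU/TypeScript terms (assuming @webgpu/types and an already-configured device and context): use the queue's completion promise the way a Metal command buffer completion handler would be used, and simply skip encoding a frame while the previous submission is still in flight:

  let gpuBusy = false; // set on submit, cleared by the "completion handler"

  function drawFrame(device: GPUDevice, context: GPUCanvasContext): void {
    if (!gpuBusy) {
      const encoder = device.createCommandEncoder();
      const pass = encoder.beginRenderPass({
        colorAttachments: [{
          view: context.getCurrentTexture().createView(),
          loadOp: "clear",
          clearValue: { r: 0, g: 0, b: 0, a: 1 },
          storeOp: "store",
        }],
      });
      // ... record the actual draw calls here ...
      pass.end();

      device.queue.submit([encoder.finish()]);
      gpuBusy = true;
      device.queue.onSubmittedWorkDone().then(() => { gpuBusy = false; });
    }
    // Keep ticking; frames that arrive while the GPU is still busy are simply skipped.
    requestAnimationFrame(() => drawFrame(device, context));
  }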


Can someone give concrete examples, why? Let’s exclude the learning.


Apple has fewer constraints, so their API is more straightforward (fewer abstractions) and less verbose. Vulkan gives you more control, but at the expense of a more convoluted, verbose, and complex API; people like to joke about the amount of code one needs to write in order to render a triangle with Vulkan


I am not convinced that Vulkan gives you more control. Metal is adaptive in the sense that it can manage some state and configuration for you, but that is strictly opt-out. You still get your manual memory management and low-level event synchronisation.

On the topic of control, Metal has precise control over display intervals and presentation timing (I think Vulkan recently introduced something similar, not sure).


Though arguably, going from first triangle to a semi-complex scene is relatively few additional lines, for what it’s worth.


These are still very abstract differences.


>I wonder if support for OpenGL, Vulkan, etc will improve now that Apple is partnering with nVidia, Adobe, Autodesk, Microsoft, etc around the OpenUSD rendering/animation/CAD/3D-scene format?

I'd say it's a totally orthogonal matter (having a standard 3D scene format vs. what graphics API will render it), and Apple's participation in that will be minimal anyway.

>I would be surprised if Apple doesn't use it as a means to cement more 3D software vendors into macOS land. It's really hard to render consistently if the underlying drivers are all wonk and proprietary.

There's an easy fix Apple can suggest, though: just use the official macOS engine.

Why would they even opt for (1)? To burden themselves with supporting different 3D engines? They already support and maintain their own.


I don’t think that’s the shtick behind OpenUSD. It’s not a transmission format like glTF, so the intent is not to get consistent rendering but rather to standardize intermediate graphics representations so that software that works on 3D scenes (unreal, Maya etc) can represent all of their workflow in USD and get consistent interop.


Someone should tell Jensen Huang then. That's the message he pushed heavily at Siggraph 2023 over and over and over again.

Source: https://www.youtube.com/watch?v=Z2VBKerS63A

They even did a big demo shot showing the same frame being rendered in multiple different editors all creating the same consistent result and matching. All of it was said to be due to OpenUSD standardizing how a scene is defined, animated, and rendered.

Probably just a bunch of marketing buzz though.


> Apple doesn't like any standards it doesn't control with an iron fist

You mean like SCSI, FireWire, USB-A, USB-C, USB 3.x, Bluetooth, Qi charging, H264, AAC?


Just because they used them doesn't mean they like using them, just that they don't have the sway to move people to something of their own.

FireWire was developed by Apple and some other companies, as a competitor to USB, and lost out to USB.

Apple had their own video container and codec formats in quicktime, and those also lost out to others.

They definitely prefer to roll their own, they just don't always succeed in gaining enough market adoption (in the past), or they're told to stop pushing it to the detriment of their users (as recently with USB-C).


> FireWire was developed by Apple and some other companies, as a competitor to USB, and lost out to USB.

Apple was part of the patent pool for FireWire and is also part of the patent pool for USB C and was early to be onboard with Thunderbolt along with Intel.

Apple went all in on USB with the iMac in 1998 well before PCs were completely onboard.

> Apple had their own video container and codec formats in quicktime, and those also lost out to others

Apple’s QuickTime container is part of the standard

https://wiki.videolan.org/QuickTime_container/#:~:text=The%2....

And Apple is in the patent pool for H.264


> Apple went all in on USB with the iMac in 1998 well before PCs were completely onboard.

"PCs" were using either parallel or serial ports, in addition to the PS/2 ports for mice and keyboards. None of them were proprietary or if they were, they were widely used so basically standard. USB ports were added easily as expansion cards on those PCs (TBH I don't recall if it was the case already in 1997, don't remember owning any USB peripheral back then)


What’s your point? Do you really think that Apple should have used PS/2?


My point is that PCs already had perfectly good standards for cheap peripheral communication, so there was less pressure to upgrade to USB. I remember the "PC2000" slogan that aimed at having USB-only PCs by 2000; it probably took 3-4 extra years.


> Apple was part of the patent pool for FireWire

Apple is listed first as the designer, with the IEEE 1394 working group second. Indeed, there's some indication that Apple's development started in the 80's and it wasn't until later that it was presented as a standard.[1] Funnily enough, they wanted it to replace SCSI, another technology you noted as a counter to Apple not liking standards they don't control.

> is also part of the patent pool for USB C

Being part of a patent pool doesn't really mean anything to me, given how companies use patents strategically and trade them. Do you have details on what patents may be shared? (I ask because I looked and it wasn't obvious from some light googling on my part).

> and was early to be onboard with Thunderbolt along with Intel.

They weren't early to onboard, they developed it with Intel (even if Intel held most of the patents and may have done the lion's share of the work, I'm not sure on that point).[2]

> Apple went all in on USB with the iMac in 1998 well before PCs were completely onboard.

Being able to control the hardware completely allows them to make shifts like that, because there was no one "PC" to be completely onboard. That said, they make moves away from it where they could for protocols they had some level of control and or steering of (FireWire, Thunderbolt, etc).

> Apple’s QuickTime container is part of the standard

Apple's QTFF was donated to be the container for MP4, but for a decade or more prior to that it was proprietary (but may have been open to implementation by third parties, I'm not sure). The main problem was that Apple licensed and defaulted to using Sorenson video codecs in their Quicktime framework and shipped it along with their video players, locking down the playing of the format to people willing to purchase the Sorenson codec individually or to those that used their player.

I admit this one is less about using a standard of their own and more just an early example of the platform control and lock-in they're known for now.

> And Apple is in the patent pool for H.264

Again, being in a patent pool for a large company doesn't by itself signal anything to me, given how strategically large orgs use patents. I would need some more info to view this one way or another.

1: https://en.wikipedia.org/wiki/IEEE_1394

2: https://en.wikipedia.org/wiki/Thunderbolt_(interface)


What exactly is your complaint? That Apple only uses standards that it contributes to? What other computer maker was going to move technology forward?

Should Apple have used the PS/2 connector instead?

> That said, they make moves away from it where they could for protocols they had some level of control and or steering of (FireWire, Thunderbolt, etc).

What were they going to use instead of FireWire? USB 1 was painfully slow. Again what other “standard” should they have used?

There was never a Mac that didn’t have USB after the iMac.

You can go back even further Nubus was licensed from Texas Instruments (used in the Mac II in 1987) and they moved to PCI with the second generation PowerMacs in 1996 (?)


> What exactly is your complaint?

My complaint? I'm just calling into question your counterpoint examples supplied when someone stated that Apple only likes standards they control. Whether it's warranted or they have good reason to in some cases is somewhat beside the point; they have a long history of developing their own standards, sometimes because they are addressing a problem that isn't solved by another technology, and sometimes just because they would rather have something they control, whether the market segmentation and user confusion it causes is best for the customer or not.


> Apple had their own video container and codec formats in quicktime

You know the MP4 standard is based on Quicktime, right?

> MPEG-4 Part 14 is an instance of the more general ISO/IEC 14496-12:2004 which is directly based upon the QuickTime File Format which was published in 2001.

https://en.wikipedia.org/wiki/MP4_file_format#History


> FireWire was developed by Apple and some other companies, as a competitor to USB, and lost out to USB.

Firewire existed for years before USB: it was designed in the late 80's, roughly 10 years before USB. Development was mostly Apple and Sony, but numerous others were involved in the IEEE-hosted process.

As USB became more capable (Firewire vs USB1 was no contest), it gradually began to replace it. But ultimately, Thunderbolt was its real replacement.


QuickTime lost? MP4 basically is QuickTime.


Apple deliberately broke DSC 1.4 to support the Pro Display XDR. Thousands of people happily using 4K HDR10 high refresh monitors under Catalina all of a sudden couldn’t under Big Sur with the release of the Pro Display, and people wondering how they were managing the resolution.

And demonstrably so - “downgrading” to DSC 1.2 actually improves those other users refresh rates and HDR support.

This is still “broken” as of Ventura.


Still no iPhone with USB-C :(


This year definitely because of the EU mandate. It will probably still be nerfed like the low end iPad that has USB C. But still transfers at USB 2 speeds


I honestly don’t get why people are so furious about this. I asked my two siblings, two friends from my uni days and two friends from work and none of them have used the Lightning (soon to be USB-C) port for anything but charging and music.

None of them even remember connecting it to a computer past the iPhone 5S.


There is basically no point plugging an iPhone in to a computer. It’s so much more convenient to use airdrop or cloud storage.

It’s much more useful on the iPad where you might actually use it as some kind of video editing device and plug an external monitor and ssd in.


> There is basically no point plugging an iPhone in to a computer

...if your computer runs MacOS.

I was surprised to see how unnecessarily annoying it is to transfer videos and photos from an iPhone to a Windows PC.


It’s pretty easy to move files on and off the iPhone using a usb/portable ssd and the files, hardest part would be you need a usb to lightning otg cable which is somewhat uncommon.

It’s the kind of thing you don’t really do if you follow the apple flow though. You’d either stream the video from whatever service, or you sync it with apple photos and it will be available on your phone.


When I plugged in the iPhone via cable to my Windows PC, I could only extract pictures which were taken recently, god knows why.

Apple officially recommends to install iCloud on the PC and download the files from there, but they didn't let me disable the upload of the files on my PC, so I uninstalled iCloud again.

Then the recommendation was to just download it from iCloud Web. Which I did. But for some reason iCloud downloads default to a lower resolution (720p video in my case) instead of the full resolution. To do this I had to click on a small button, which then gave me the option to download my own files in full res.

Of course I only noticed that I'd downloaded a lower res after editing a video for 5 hours. All in all, an extremely subpar experience. Every Android phone ever can just transfer files over cable to any PC, for some reason just iPhones have to be complicated...


As long as it has DP alt-mode for HDMI, I'm happy. Unlike Lightning that doesn't do that, so they package a full h264 decoder into that HDMI adapter.

The USB 2 thing is probably like the Raspberry Pi 4: the SoC only supports 2.0. Older iPads and the Pi4 have a full USB 3.0 controller external to the SoC. Apple likes to re-use the previous year's SoC and no point doing USB 3.0 before. I could see the pro models doing 3.0 since it'll probably be a new SoC


> As long as it has DP alt-mode for HDMI, I'm happy.

That would be interesting, but like you alluded to it would likely require out-of-SoC tech for the existing A-series chips.


> It will probably still be nerfed like the low end iPad that has USB C. But still transfers at USB 2 speeds

Other than one exception, the A-series SoCs have not shipped with USB 3.x or USB4 support. The 10th gen iPad uses an A-series chip, so it is pretty close to being a "lightning to USB dongle" inside the case.

So it isn't a software or manufacturing nerf - the part does not support USB 3.


Fine for me, I never transferred anything over a wire between my phone and another device. It’s way more annoying to have to deal with a different charger.


Apple's strategy for cross-platform GPU is WebGPU, which they are actively involved in. OpenGL has been obsolete for a while and Apple has no interest in supporting Vulkan (for multiple reasons). The core philosophy of Vulkan — which is designed as a least common denominator abstraction to facilitate fast driver development, with no regard for end-developer convenience — is at odds with what Apple wanted (a compact, easy to use API with progressive manual control knobs).

Current Metal is essentially a more streamlined and user-friendly DX12 Ultimate, plus a mix of some console-like functionality and Apple-specific features, plus Nvidia Optix, plus a subset of CUDA features (such as cooperative matrix multiplication). Plus they have a very nice shading language and some neat tools to generate and specialise shaders. I expect them to continue gaining feature parity with CUDA going forward (things like shared virtual memory might need new hardware). They simply can't offer such comprehensive feature set in a reasonable fashion if they went with Vulkan, even if we let the issue of usability aside for a moment (and we really shouldn't, as Vulkan is horrible to use).


I was unaware that Apple was helping implement WebGPU! I actually love WebGPU; it looks great and pairs very nicely with three.js, which is a favourite hobby tool of mine for pet projects.

I can tell you have strong opinions on Vulkan. I don't disagree with your general view that it's hard to work with, development-wise, as it's very tied to driver and hardware implementation specifics.

What I can say, though, is that I've met several pipeline rendering engineers (think folks who invent render engines for film and write low level game engine code) who seem to love Vulkan. They appreciate being able to really get down to the bare metal of the drivers and eke out the performance and conformity they need for the rest of the game or render engine.

A lot of the frustration with OpenGL/DirectX from these specialists was their inability to "get in there" and force the GPU to do what they really wanted. Vulkan apparently gives them a lot more control. As a result, they are able to accomplish things that were previously impossible.

All that being said, I think WebGPU will be far more popular for 99% of developers. Only a very few folks like getting down into the nitty-gritty of libraries like Vulkan. At the same time, there is huge money to be made knowing how to make a game eke out another 10 FPS or properly render a complex scene for a film group like Pixar that wants to save days on a scene render.


> I was unaware that Apple was helping implement WebGPU!

WebGPU is pretty much a combined Google/Apple effort (of course, with other contributors). If I remember correctly it was Apple engineers who proposed the name "WebGPU" in the first place.

> I can tell you have strong opinions on Vulkan

I really do, and I know that my rhetoric can appear somewhat volatile. It's just that I find this entire situation very frustrating. I was deeply invested in the OpenGL community back in the day, and decades of watching the committees failing at stuff made me a bit bitter when it comes to this topic. We had a great proposal to revitalise open-platform graphics back in 2007(!!!) with OpenGL Longs Peak, but the Khronos Group successfully botched it (we will probably never know why, but my suspicion, having conversed with multiple people involved in the process, is that Nvidia feared losing its competitive advantage if the API were simplified). Then we saw similar things happen to OpenCL (a standard Apple developed and donated to Khronos, btw).

I am not surprised that Apple engineers (who are very passionate about GPUs) don't want anything to do with Khronos anymore after all this.

> What I can say though, is that I've met several pipeline rendering engineers (think folks who invent render engines for film and write low level game engine code) who seem to love Vulkan. They appreciate being able to really get down to the bare metal of the drivers and eke out the performance and conformity they need for the rest of the game or render engine.

But of course they are. OpenGL was a disaster, and it's incredibly frustrating to program a system without having a way to know whether you will be hitting a fast path or a slow path. We bitterly needed a lower-level GPU API. It's just that one can design a low-level API in different ways. Metal gives you basically the same level of control as Vulkan, but you also have the option of uploading a texture with a single function call and having its lifetime managed by the driver, while in Vulkan you need to write three pages of code that create a dozen objects and manually move data from one heap to another. I mean, even C gives you malloc().
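
To make that concrete, here's a minimal sketch (Swift, with a made-up 256x256 RGBA8 texture) of the kind of single-call upload being described, with the driver managing the texture's lifetime:

    import Metal

    let device = MTLCreateSystemDefaultDevice()!

    // Describe a small RGBA8 texture; the driver owns its memory and lifetime.
    let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .rgba8Unorm,
                                                        width: 256, height: 256,
                                                        mipmapped: false)
    let texture = device.makeTexture(descriptor: desc)!

    // Upload the pixel data in one call; no staging buffers, heaps, or barriers needed.
    var pixels = [UInt8](repeating: 255, count: 256 * 256 * 4)
    texture.replaceRegion(MTLRegionMake2D(0, 0, 256, 256),
                          mipmapLevel: 0,
                          withBytes: &pixels,
                          bytesPerRow: 256 * 4)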

Vulkan gives me the impression that it was designed by a group of elite game engine hackers as an exercise in abstracting as much hardware as possible. Take, for example, the new VK_EXT_descriptor_buffer extension. This allows you to put resource descriptors into regular memory buffers, which makes the binding system much more flexible. But the size of descriptors can be different on different platforms, which means you have to do dynamic size and offset calculation to populate these buffers. This really discourages one from using more complex buffer layouts. They could have fixed the descriptor size to, say, 16 bytes and massively simplified the entire thing while still supporting 99% of hardware out there. Yes, it would waste some space (like a few MB for a buffer with one million resource attachment points), and it wouldn't be able to support some mobile GPUs where a data pointer seems to require 64 bytes (64 bytes for a pointer!!! Really? You make an API extremely complicated just because of some niche Qualcomm GPU?). And the best part: most hardware out there does not support standalone descriptors at all; these descriptors are just integer indices into some hidden resource table that is managed by the driver anyway (AMD is the only exception I am aware of).
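
As a toy illustration of that offset problem (plain Swift arithmetic, with made-up descriptor sizes standing in for values a real application must query from the driver at runtime): because the sizes are only known at runtime, every offset into a descriptor buffer becomes a runtime computation instead of a compile-time constant:

    // Hypothetical sizes; a real application queries these from the driver at runtime.
    let uniformBufferDescriptorSize = 16
    let sampledImageDescriptorSize  = 32

    // Offset of the i-th image descriptor in a layout of
    // [N uniform buffers][M sampled images] has to be computed dynamically:
    func sampledImageOffset(index i: Int, uniformBufferCount n: Int) -> Int {
        n * uniformBufferDescriptorSize + i * sampledImageDescriptorSize
    }

    // With a fixed 16-byte descriptor this would collapse to a constant stride,
    // and the buffer layout could just be a plain struct.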

In the meantime, structured memory buffers have been the primary way to do resource binding in Metal for years, and all resources are represented as 64-bit pointers. Setting up a complex binding graph is as simple as defining a C struct and setting its fields. Best part: the struct definition is shared between your CPU code and the GPU shader code, with GPU shaders fully supporting pointer arithmetic and all the goodies. Minimal boilerplate, maximal functionality; you can focus on developing the actual functionality of your application instead of playing cumbersome and error-prone data ping-pong. Why Vulkan couldn't pursue a similar approach is beyond me (ah right, I remember, because of Qualcomm GPUs that absolutely need their 64-byte pointers).
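
A hedged sketch of what that looks like from the CPU side (Swift, using Metal 3's gpuAddress; the struct name and sizes are made up, and in real code the definition would live in a header shared with the MSL source):

    import Metal

    // Mirrors an MSL struct along the lines of:
    //   struct SceneArgs {
    //       device const float *vertices;   // plain 64-bit GPU pointer
    //       device const float *normals;
    //       uint vertexCount;
    //   };
    struct SceneArgs {
        var vertices: UInt64
        var normals: UInt64
        var vertexCount: UInt32
    }

    let device = MTLCreateSystemDefaultDevice()!
    let vertexBuf = device.makeBuffer(length: 64 * 12, options: .storageModeShared)!
    let normalBuf = device.makeBuffer(length: 64 * 12, options: .storageModeShared)!

    // The "binding graph" is just a struct written into an ordinary buffer.
    var args = SceneArgs(vertices: vertexBuf.gpuAddress,
                         normals: normalBuf.gpuAddress,
                         vertexCount: 64)
    let argBuf = device.makeBuffer(bytes: &args,
                                   length: MemoryLayout<SceneArgs>.stride,
                                   options: .storageModeShared)!

    // (At encode time the referenced buffers still need useResource() calls
    //  on the encoder to be made resident.)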

The thing is, this all works for a middleware developer, because these are usually very skilled people who already have to deal with a lot of abstractions, so throwing some API weirdness in the mix can be ok. But it essentially removes access from the end developer (who is passionate but probably less skilled in low-level C), making large middlewares the only way to access the GPU for most. This is just a breeding ground for mediocrity.


> In my experience, Apple doesn't like any standards it doesn't control with an iron fist

what about MacOS being a Unix?

I'd suggest a deeper diagnosis is that Apple doesn't like standards incapable of showing off or leveraging custom hardware prowess, which is a key competitive advantage.


USD is a file format. It doesn't have anything to do with the underlying graphics APIs. In fact, tailoring your file format to the underlying graphics APIs is pretty dumb (like glTF specifying OpenGL constants such as 9729, 9987, etc.).


I didn't mean to say they should tailor the file format to the graphics API. I more meant, when a scene format becomes popular, usually you see multiple engines/pipelines/libraries support the underlying scene descriptors like material files, physically based rendering profiles, animation keying, etc.

I wouldn't want the file format dictated by the graphics API, but I would like consistent rendering output in multiple places for the same file. That'd be cool.

In case you're wondering where this "conformance" idea came from, check nVidia's 2023 Siggraph talk. Jensen will buzzword OpenUSD and conformity across products until your ears bleed.

Siggraph 2023 nVidia: https://www.youtube.com/watch?v=Z2VBKerS63A

(in case it wasn't obvious, I am skeptical this will ever really happen, I imagine this is all marketing speak)


Consistent rendering is only possible when the material models used by the various engines are similar enough. Physically-based renderers produce pretty similar results for basic diffuse-metallic-clearcoat, etc. materials. Where things become hairy are more advanced effects like refraction, subsurface scattering, ambient occlusion etc., where different engines use different techniques with different tradeoffs, because there's no easy one-size-fits-all implementation. The UsdPreviewSurface part of the USD spec doesn't even support many of these advanced effects. If your scene uses these effects a lot, then consistent rendering is less likely.


Asahi and Alyssa are the titans of reverse engineering. Their work is pretty unbelievable. I would bet Apple is planning to hire them, or has tried already and they refused.


Valve already contracts Alyssa. Arguably they have more business use case for the skillset than Apple.


I thought she was working for Collabora. But she’s still a student, so only part-time.


She's doing all this while still being a student, unreal. I can't get over just how skilled some people are, and how far I myself am from them.


Talent + drive (obsession) + lots and lots of free-time.

Most of these types cut their teeth in their younger days - doesn't really matter if it is sports, musical instruments, art, coding, or whatever. If you spend 6-12 hours a day doing something, seven days a week, from when you're 10 years old - you'll be pretty damn fantastic by the time you're college-aged.


Yup, and you have to be pretty lucky to hit the nail on the head with your choice of vocation and profession.

A lot of people diversify, which is good for the mind, but bad if you have to compete for success.

Other people choose very unforgiving fields: want to live a life playing the guitar? Tough luck.

Some other people realize they don't like a hobby or job past some point, but turning over decades of skill to start a new path places them at a huge disadvantage.

It's not a person's fault to not choose the right career path when they're naturally too young to foresee their future in the field.

I personally feel pretty lucky to have chosen computers and technology for my vocation. If, by any chance, I had chosen anything different when I was 12, my life would have been entirely different.


You and me and <waves hand airily at 7 billion> both.

Don't measure yourself against others, it's a recipe for stress. Instead, understand your core values and measure yourself by how you live up to them.

If we could put a number on kindness, we could make a better economy.


She did right up until earlier this year and she just got her bachelor's a couple of months ago (all according to her public resume). Crazy stuff, awesome to see.


I don't think Valve contracts Alyssa. I think she is working for RH.



Some people are just more capable than others.



I bet Apple already tried and they refused due to NDA and non-compete clauses; that, or HR didn't validate their profile.


I don't think being able to reverse engineer your project is actually a good qualification for hiring you, considering your employees don't need to do that.

Also, isn't she a high schooler/freshman? There's always internships but it's best to do other things before grinding those out.


If someone understands your codebase without your sources and docs - you can bet that they will also understand it with the help of those. Usually better, if the docs have any value.

And this is kind of a valuable skill, considering how many coders don't understand their codebase even with docs.


I did not follow parent’s reasoning either. They did something without any documentation at all.


It’s certainly true that such people are very capable, and would be good people to hire for many kinds of work. However corporate environments don’t typically like people who are willing to work around artificial restrictions. This particular skill set (which is what reverse engineering is) may even be (perceived as) a negative at a company like Apple.


Much of it (LLVM) is open source, and for the rest I don't think understanding the inputs and outputs is actually that closely related to being able to maintain the bits in the middle, or come up with new bits.

It's not a negative of course.


> I don't think being able to reverse engineer your project is actually a good qualification for hiring you,

Ask Mark Russinovich, now Azure CTO


I don't think any technical skills are a good way to pick someone for a leadership position.


So what is? The ability to bullshit everyone?


I would've said management experience.



Most employees would probably be well served being able to reverse engineer things. Sure, you might have docs on how your graphics pipeline works, but then you’re probably going to be looking at commercial software that runs on it.


Preventing competitors from reverse engineering something is sometimes valuable.


I hope she avoids working for someone else. Why does she need to have her inspiration crushed by the pettiness of corporate life?


The reason is usually the same: money. That is why people work for corporations.


Actually I think it’s more being unsure of what else to do in life and for want of a structure, and money is one of those things people are unsure of. However looking at her work to date, she can find money on her own terms. She seems to work well on her own terms in her own structure. The thing that could hurt her more than anything is working under someone else’s structure and terms.


I found that people more often start a PhD if they don’t know what to do. But of course it depends on quite a lot of things.

I agree, her kind of people will find the money and better work without corporate bureaucracy.


Too many people involved in technical hiring, and too many pointless interviews. It's likely anywhere they went to interview, the people there would have no idea how good they were in the first place.


What a waste of talent... why accomplish things in real life when you can grind leetcode to give the appearance of being able to accomplish things? /s


why bother appearing to accomplish anything when one could just opine anonymously on the interwebs?


If I were in their position then maybe applying to random jobs wouldn’t be the right strategy. Instead either ensure you have enough contact information available for leads to come in, or reach out to connections at companies to get warm introductions. It won’t work everywhere but it needn’t work everywhere.


You also need soft skills to work in a company. There are lots of people in the open source world who are technically brilliant arseholes.

Not saying that's the case here; just that "amazing programming" doesn't necessarily mean "Apple wants to employ them".


This. I am very often astounded by the straight-up rude behavior of many of the "respected" in the open source world.


Employment is a poor proposition. Typically employees make poor money in comparison to the profit they generate; that especially applies to IT.

Anyone with talent should instead seek to incorporate and sell their services at a fair price.

Unfortunately in some countries, big corporations managed to lobby governments to put a stop to that.


The problem with that option is it involves many skills other than the primary tech-focused skill. Sales, marketing, accounting, negotiation, support, etc. Many technical people don't really want to handle those job functions, even if they happen to be decently good at them.

Certainly you can hire people to do those functions, but you need to have a decently well-established reputation and enough work and income before you're able to hire those people.

Employment does give you a trade off there: less money in exchange for someone else handling the "overhead".


Yeah, it is just another example of specialisation/division of labour.


If you consider large employers as labor brokers, it makes more sense that someone accepts the reduction in potential earnings for easier access to labor demand.


Massive “it depends” on that


> Typically employees make poor money in comparison to what kind of profit they generate

I think it’s the opposite - developers are overvalued.


I understand where that sentiment comes from, especially looking at the slew of mid-sized companies that have never turned a profit despite paying multiple hundred K TCs, but if you look at the profits of the giants, it paints a different picture. The labor of thousands of very highly compensated engineers is still a drop in the bucket of the revenue generated by their labor. Alphabet seemingly employs somewhere between 20-40 thousand engineers, and it is their labor that produced all (?) of their revenue generating products. Spreading last year's revenue (~$282B) across their engineering staff would yield a revenue in the ballpark of $9M per engineer.

We all know that their engineers are paid well, but that's still potentially more than an order of magnitude off of the value they generate. Of course, there are many other expenses to running their business, but to claim that developers are broadly overvalued, when one of the most prolific employers of developers is generating 10x+ the revenue of what it spends on them, is likely only occasionally correct.


That raises an interesting question. If developers are toiling equally hard at company A and company B, on similar products, but company A currently had a dominant market position over company B thanks to network effects - are the developers at company A in any way responsible for its outsize gains and therefore due millions of dollars in TC?


Why are you leaving out all the other essential people that go towards that revenue figure?


Alphabet has at least twice those numbers.


The atomic trick was very satisfying, especially extrapolating the swizzle instruction from the PowerVR heritage. I don't know, but I wouldn't be surprised if the Apple engineers are learning from this, or at least they would appreciate the sheer cleverness. It reminds me of when, back in the day, the Sega Genesis video hardware was reverse engineered at Accolade and our engineers used the public Texas Instruments TMS9918 VDP documentation as a starting point to extrapolate the evolved 9918-derived VDP in the Genesis (Mega Drive).


I’m sure they appreciate the work but Apple has internal documentation on their chips. It’s not like they make a black box and forget how they designed them when it comes to writing software for them.


Sometimes an outsider's attempt to reverse engineer your product brings new insights.


That is true, but one does not simply implement a fully conformant bit shuffle instruction by accident.


Or sometimes an instruction isn't in the compiler because the hardware has some strange corner case failures so they aren't using it. I would be nervous about depending upon an unused instruction.


The article claims that Apple's compiler doesn't use the instruction, so it seems like the compiler author did indeed forget or never learn about it.


Or maybe it has some limitations they realized and blacklisted the instruction?


For clarity, this isn't just the first conformant Linux driver.

Apple themselves are not conformant to OpenGL® ES 3.1.

So this is literally the first conformant OpenGL ES 3.1 driver for the M-series, on any operating system (Apple or non-Apple).

Hence the call to action to donate to the team.

https://asahilinux.org/support/


I love what they're doing, but the last time I clicked a link for asahilinux on Hacker News I was taken to a pop-up that specifically said that readers from Hacker News were not welcome on their site and couldn't read their content.


From what I've seen people post here, I can totally get that they don't like HN people. It's sad, because HN is how I found out about these people and their amazing skills, but I would've done the same thing in their place.

Many people, especially in the LGBTQ+ community, hate being featured on here. Some have asked the admin not to be linked, but HN doesn't blacklist websites.

There's a flood of negativity from the tech bros every time something makes it to the HN front page, both here and on HN link aggregators. This is especially true for the people whose talents bring them to the HN front page every month or so, receiving the same flak and abuse every time.

Last time someone tried to block HN abuse by checking the referer header and redirecting visitors away, HN altered the link HTML to not send the referer header rather than listen to their wishes. First for a specific post, then for all posts. That's a pretty clear sign to me that the HN admins care more about their links and the fake internet points generated by discussions than about the people and projects being discussed.

If the admins are actively working against your wishes, why would you want to tolerate that platform?


Granted, I'm just massively struggling with the idea of excluding a whole radically diverse community (with some of the most civilized and self-moderating discourse on the internet) in the name of inclusiveness. To me, it's as ironic as it gets.

On that end, I can't find any fault with HN for enabling the 99.99% of their interested readers who did nothing wrong to be included in the discourse around these ideas, on the free and open web.


> the idea of excluding a whole radically diverse community (with some of the most civilized and self-moderating discourse on the internet) in the name of inclusiveness. To me, it's as ironic as it gets.

Not really more ironic than almost every non-trivial societal discussion, à la "if you want peace prepare for war". Anything non-trivial is multifaceted, and anybody who has an absolute opinion is probably missing most of the argument.

Very quickly for this specific case: one can easily argue that excluding people is hardly inclusive, but one can also easily argue that a niche community with its own subculture _needs_ some sheltering from the rest of the world, or the subculture gets lost very quickly (so the sheltering is good for diversity in that it preserves the existence of the subculture).

(I'm not pretending to present a fully-fledged argument here, just a taster to challenge absolute opinions, entire books could be written on this and probably have)


Re your specific argument, see also under “Cultural appropriation” (IV.6) in Scott Alexander’s essay “The ideology is not the movement”[1]. On the other hand, the argument that is actually, literally written down in Asahi discussions usually paints the whole HN readership as a group with specific coherent opinions that are specifically bad, so I’m not sure to which degree I ought to judge them by the strongest argument they could be making as opposed to the one they actually are.

(OK, there’s also the argument that HN will engage with people whose opinions the usual Asahi folks consider reprehensible—even if not necessarily often—and should thereby be treated as equally reprehensible unless they kick them out[2]. But, I don’t know. I’m a couple decades too young to have seen organized ostracism at the meetings of the Komsomol, but still, the only reaction I have when these kinds of social-isolation penalties are proposed is base animalistic terror. This includes people a decade younger in my own social circles proposing it, which is not even rare.)

[1] https://slatestarcodex.com/2016/04/04/the-ideology-is-not-th...

[2] “Parable of the Nazi bar“, etc.


And they are also free to (try to) restrict who has access to their site, so everyone is happy?


A few commenters on HN were giving Hector Martin shit and trying to dox Asahi pseudonymous contributors (who clearly did not want to be ID'd). Maybe the popup should have included a bit of detail on that so people such as yourself who aren't aware or involved aren't put off the project. But the animosity they have is understandable.


[flagged]


Harmful by what standard? The moderation here is among the best on the web, what on earth are you talking about?

Even if there are a tiny minority of toxic users, blocking access to all seems like the issue may not be with HN...

Edit: ok I missed some previous discussions, eg

https://news.ycombinator.com/item?id=36971867#37000629

for earlier similarly dubious generalisations and complaints providing context.


I think what you're missing is that a tiny minority of toxic users (as perceived from your viewpoint) can easily be perceived as a high concentration of harassment from those that are on the receiving end of abuse.

This asymmetry of harassment is a dynamic worth thinking about more broadly; for example, if only one in a thousand men cat-calls women, you might not even have any male friends that have ever cat-called, and yet that might mean a particularly attractive woman might pass a cat-caller many times per day. Small differences in the rate of cat-calling (1:1000 vs 1:2000) won't be detectable to you, but can represent a huge difference (once a day vs twice a day, say) to those sampling far more frequently from the distribution than you are.

More specifically to the thread here, the question is not whether HN is well moderated, or whether toxicity is quite low-frequency from your perspective. The relevant question is whether the base-rate of anti-trans harassment is higher than in the general population.

It's entirely possible (I think it's true) that people in this community are in general more civil to each other than on most of the internet, and I think the thoughtful moderation is a big part of that. But it's also possible (and I don't have any direct experience to claim whether it is or not, so this is just numbers to illustrate what's possible, not a claim of how things are) that at the tails of the distribution there's 10x more anti-trans harassment in the community too, and we'd simply not be in a good position to observe or measure it unless we were on the receiving end of it or auditing lots of comments very closely. Honestly, based on the tenor of responses when this specific topic comes up (quite different to the normal HN vibe IMO), that wouldn't surprise me either.


> I think what you're missing is that a tiny minority of toxic users (as perceived from your viewpoint) can easily be perceived as a high concentration of harassment from those that are on the receiving end of abuse

I didn't miss it - that was partly my point; the quantity of abuse is possibly a matter of perception, which will necessarily be very influenced by the psychology and experiences of the victim. I agree that without being in that position it's not possible to have the same viewpoint, and certainly not the same raw hurt and related experiences. But it doesn't follow from being on the receiving end that there is a clearer, more neutral view of the percentages.


In my opinion, HN comments/users skew heavily towards cynicism/negativity. It may be a tiny minority, but I don’t think it’s a tiny minority of the most vocal users.


That latter (more subtle) point is a possibility I suppose but even if it were the case it would likely reflect the unpleasant side of normal online psychology rather than anything particular to HN; and there is still a significant difference between cynicism/negativity and the bigotry and hostility claimed by the "harmful" comment and linked context.

If all such reflex internet-negativity is being wrongly interpreted as representative bigotry by an external party that's still not a HN issue; and the moderation complaint is still very misjudged.


Just because the moderation is good doesn’t mean that they’re always right, or resourced to do a good job in every case.


[flagged]


> Hacker news users are, on the whole, asshole sexist tech-bros who can't wait to post "but actually" and harass trans women.

Do you have any research to backup your claim?


[flagged]


Isn't that a weird thing to say when this thread is filled with people absolutely praising them and this achievement?

I doubt HN is more or less toxic than traffic from any other mainstream site, but maybe I'm wrong.


I imagine since they're on the receiving end, they'd know. Also if even one HNer tried to dox me I'd ban the lot of you, and I'm a relatively cantankerous straight white guy.


I get it. I guess I find it so exceedingly bizarre to dox someone, especially people known for being brilliant, that it probably doesn't register properly.


Doxing is an exaggeration of what it is. They have a very public online presence in the first place and made it pretty obvious, and they self-promote. It's just putting 2 and 2 together. The "block" on HN seems to have more to do with their gripes about inaccurate or negative comments than about attacks, unless I missed something.


Most of the harm of "doxing" isn't revealing information, which in many cases can be found online for people like software engineers who have public CVs, but rather targeting of those individuals in communities that are predisposed against them and willing to take action be it through emails, comments, etc.

There's a lot to be said about any number of people privately or publicly spamming you, intimidating you, or threatening you on a _personal_ level.


I agree, but I don't think there's any harm in being honest about in this case, since it's pretty on-brand for their public persona. It just seems like a weird hang-up, which ironically fuels the rumor mill. "Hey, isn't this Twitter account run by the same person as that Twitter account?" is pretty harmless as far as "doxing" goes, and when you've let it slip a number of times, what's the point? It's naive to expect total anonymity and zero negative attention with a flamboyant public persona (not meant to be derogatory), especially when you're making money from it, and if you add fuel on the fire on top of that...


100% It's about proving to the victim that they don't have control.


"Half" is hyperbole, but it's a sizable minority that's obnoxious and persistent enough that nearly every trans hacker I follow has a gripe with HN. (Which is a greater number than you'd think; not that long ago I found out it's a meme in the transfemme community to refer to striped thigh-highs as "programmer socks")


There's a reason a number of sites have taken similar positions about HN.

It’s more toxic on a few axis.


> toxic on a few axes.

(Sorry, you did tee it up)


Ugh, HN users with axes to grind...


[flagged]


> I have never seen anyone on this website [...] even transphobic at all.

These kinds of comments are here, but are usually quickly downvoted/flagged to death. If you really want to see them, turn on showdead and look at the bottom of threads somehow related to trans people. It's not nearly as common as ohgodplsno implied, though.


As I said, I haven't seen them. If they get downvoted immediately so they are hidden then just don't enable showdead... And you cut out the 'violently' part.

Frankly, I don't think someone's views on whether it is possible to change gender are relevant to every topic that's tangentially related to a transgender person. But if someone disagrees, and disagrees that it is possible, that doesn't make their comments "violent". This is reminiscent of the "trans genocide" conspiracy theory.


Doxing is nonconsensual unmasking of any kind.


No it is not. It is the revealing of personal information in such a way as to allow someone to be targeted in real life.


You're not in charge of what things mean. Here's Wikipedia:

> Doxing or doxxing is the act of publicly providing personally identifiable information about an individual or organization, usually via the Internet.

This would include publishing PII about Asahi Lina, like her "real" name, as it personally identifies her.

It's no wonder the Asahi Linux project doesn't want to talk to people with your attitude. Maybe take some time to do some introspection on why you feel so strongly about this issue, which seems to impact you not at all.


Publishing someone's real name online isn't doxing when his real name is already publicly available online. It just isn't personally identifiable information when it is already public. Is it doxing to say that "rms" means Richard Stallman? Grow up. He is one of many people online who sometimes use a pseudonym, and pointing out that he contributes to the project as himself and also contributes while creepily pretending to be a female child on Twitch (not at all disturbing, no way) is not doxing.

I don't "feel strongly" about it at all. Save the pop psychologising for someone else. Awful tactic to try to use.


> Publishing someone's real name online isn't doxing when his real name is already publicly available online.

You're being deliberately obtuse. It's the connecting (obviously) that's the issue. It's like saying, "publishing someone's address online isn't doxing when their address is already publicly available online." Yeah I only care if you say it's my address.

> I don't "feel strongly" about it at all.

> also contributes while creepily pretending to be a female child on Twitch

> Grow up.

Insults, condescending remarks, yeah I believe you don't feel strongly about this.

I usually try to be cooler on HN than this, but this is super toxic. You've obviously got some kind of beef--regardless of what you say--and I think it would probably behoove you to get to the bottom of it.


>I usually try to be cooler on HN than this, but this is super toxic.

I am responding in kind, based on the way you are talking to me. If you were polite, there would be no issue. Instead, you make nasty psychologising comments and then pretend that you're a victim when someone responds in kind.

>You're being deliberately obtuse. It's the connecting (obviously) that's the issue. It's like saying, "publishing someone's address online isn't doxing when their address is already publicly available online." Yeah I only care if you say it's my address.

Publishing someone's address online when it's already publicly known what their address is isn't doxing! Everyone knows that Marcan is a vtuber in his spare time, it's been public knowledge since about day 2 of him doing it.


> I am responding in kind, based on the way you are talking to me. If you were polite, there would be no issue.

I've yet to insult or condescend to you, whereas you've condescended to me ("Grow up"). I'm not "psychologising" you, I've said that you're pushing an issue that affects you not at all but is harmful to others, and that you should probably figure out why you're doing that (or, why you refuse to accept you're doing that). Maybe me being polite while pointing it out makes it seem like I'm "psychologising" you, so let me rephrase: you clearly can't keep yourself from harassing people who live life in a totally harmless way, that's real weird, and you might not know how weird it is.

> and then pretend that you're a victim when someone responds in kind

Again I've not insulted or condescended to you so you're not actually responding in kind, but that aside I don't at all feel like the victim here.

> Publishing someone's address online when it's already publicly known what their address is isn't doxing!

This analogy doesn't fit, because--again--the connection isn't publicly known. How do I know it's not publicly known? Because I didn't "know" and I looked. I went to marcan's various social media sites and his website, to Asahi Lina's various social media sites and her website, and came up bupkis. So to be honest, I can't corroborate your claim based on what I've seen, and it's not for lack of looking. Could I probably find something? Sure. Do I care to? Not even a little.

Basically your argument boils down to: "someone already doxxed this person, so it doesn't matter if someone does it again". I think any reasonable person would think that's wrong.


> Can you give us links to some examples?

Sure!

https://social.treehouse.systems/@marcan/110507731403617388

These receipts foot with my own observations. Transphobia does tend to eventually get flagged, but often it only happens after the comments section has dropped off the front page, and all it takes is browsing with "showdead" to reveal things that range from obvious "groomer" and "mentally ill" dog-whistles to being downright scary.

In the meantime, anti-transphobic comments or people complaining about the lack of moderation of transphobic comments are usually downvoted or flagged, and all you have to do is look at the root comment of this thread for evidence of that.


[flagged]


> There is nothing wrong with any of the comments linked to by him in that thread, in my opinion.

Then there is no meeting of the minds that can be had.

> In my experience, this is always arrived at by some sort of absurd logic that goes "he denies that I am really a woman, and when you do that you make transgender people more like to kill themselves, therefore it is genocide" which is obviously nonsense.

What is happening right now in America is much, much worse than your strawman.

https://www.erininthemorning.com/p/this-is-what-transgender-...

If you are trying to draw a distinction between murder and using rhetoric to encourage stochastic terrorism, criminalizing existing in public, destroying social support systems, blocking access to health care, and in some cases forcible detransition, IMHO you're drawing a distinction without a difference. Both the dog-whistle _and_ the fig leaf fallback positions are horrendous injustices.


[flagged]


[flagged]


I thought this was a forum for intelligent and well-informed people, not the posting of paedophilic propaganda. Sad.


Half of the average HN users spend their time either trying to dox Asahi Lina or being violently transphobic towards Rosenzweig. The other half are horribly pedantic assholes.

Also, if you donate before 6PM you'll receive a free NPR mug and a t-shirt.

(I predict this year's funding drive will be less successful than last year!)


[flagged]


That's because HN has since removed referrer headers on links, to work around this specific kind of blockage.


Hm, strange, HN seems to set referrer-policy to 'origin', which seems like it should send a referer header when going to external sites[0]? But clearly it does not. So I guess I'm just misunderstanding how that header works.

[0] https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Re...


Look at the link itself:

    <a href="https://asahilinux.org/support/" rel="nofollow noreferrer">https://asahilinux.org/support/</a>


Asahi and a few other websites blocked HN traffic (because of the aforementioned problems), and then HN added special nofollow/noreferrer attributes for only these domains to prevent this, and then the sites responded with more assertive measures.


> assertive measures

This is incorrect. These sites (or at least the Asahi one) are not being "assertive" - they are being hypocritical and toxic.

Additionally, nobody has the right to selectively block links to their website. You get two options: you either let your site be publicly-viewable, with all the benefits and drawbacks, or you make it members-only and require a login. You can't have your cake and eat it too.


Look at the 'a' HTML tag.

  <a href="https://rosenzweig.io/blog/first-conformant-m1-gpu-driver.html" rel="noreferrer">The first conformant M1 GPU driver<a>


And this is exactly the kind of attitude they don’t want to deal with.


Well, Asahilinux was rude to do what they did, and the person you are responding to is a troll.


They were rude to say a group disproportionately responsible for spreading falsehoods, personal attacks, doxxing, and anti-trans harassment wasn’t welcome?


They were wrong to narrowcast everyone here as part of that audience, yes. It's their site, but they invite the controversy when they take such ridiculous measures to drag readers into drama most are unaware of.


> a group

HN is not a group.

> disproportionately responsible

Please link to the objective, empirical evidence that you have on hand for this claim that HN is "disproportionately" responsible - you can't be disproportionately responsible if you don't know the proportions.


Yes. They chose to come to the threads and focus on the trollish, downvoted comments and then outright block every member of the mostly decent community from even reading the blog.

They may be fine people (Asahi) but I thought it was an overreaction, yes.


The Asahi devs taking the worst comments/individuals on HN and then using that to stereotype the entire site is...not exactly an indication of integrity. More like extreme hypocrisy.

Dang is a far better person than any of these people, given that despite the amount of vitriol he's had directed at Y Combinator, HN, and him personally, he's still orders of magnitude more patient than anyone affiliated with Asahi.


That they're in the right? What?


This is a profoundly rude comment.


Well if I'm going to be rude I'd best be profoundly rude.


feel better soon.


Right - the Asahi Linux project specifically does not want attention from HN. I flag every Asahi-related submission here, and I encourage you to do so, too.


Makes me wonder... is it possible to use this from macOS?


Depends on what you precisely mean by "this". The user space part can easily be used from macOS. That's how the author first developed it without a working kernel driver.

From https://asahilinux.org/2022/11/tales-of-the-m1-gpu/

> But wait, how can she work on the user space driver without a kernel driver to go with it? Easy, she did it on macOS! Alyssa reverse engineered the macOS GPU driver UAPI enough to allocate memory and submit her own commands to the GPU, and this way she could work on the user space part without having to worry about the kernel bit. That’s super cool! She started writing an M1 GPU OpenGL driver for Mesa, the Linux userspace graphics stack, and just a few months later she was already passing 75% of the OpenGL ES 2 conformance tests, all on macOS!


I don't think the newer Mesa drivers can interact with the macOS UAPI. That was just a stopgap until the Linux kernel driver was far enough along.


Can you use Windows drivers for your GPU on Linux?


Intel did. Obviously there is more to it than just setting a different target and hitting compile, but it's also not a ridiculous question by that measure alone. It'd require enabling reduced-security mode to load kexts or a custom kernel, as well as a good amount of additional code for interfacing with the macOS kernel interfaces, but the majority of the code would be reused. Not trivial by any measure, but also not an unreasonable approach in terms of total effort to get a working driver.

I'd be very surprised if anyone was interested in doing all that work given the security limitations and ability to just use Linux.

Edit: I forgot Metal is actually a userspace driver, you don't need to mess with the kernel side... though I can't remember if you still need to lower security to poke the right areas.


You're right, I was being overly dismissive in my original reply.


Intel's Linux driver is completely different from the Windows driver. It has also been open since the start (well, since the start of the rewrite in 2010 or something), actually before AMD did theirs.


If you look the other way around (which is more relevant anyways), you can compile Mesa for Windows. However I'm pretty sure only the software renderer is available because Mesa is not capable of interfacing with the Windows kernel driver, as all the other drivers are coupled with Linux driver interfaces, iirc. (At least I'm quite certain this is the case for AMD.)

I'm also quite certain the proprietary NVIDIA driver shares a ton of code between Windows and Linux.


To add on, just because it's cool: there is a third category in play with Mesa, layered drivers. Microsoft worked on and officially supports running OpenCL and OpenGL on devices which only have a DX12 driver via Mesa in this fashion. Still all user space of course, but not software rendering. More akin to MoltenVK from the blog post, except fully supported and maintained by the OS provider.


With a cursed ndiswrapper fork, perhaps :D I jest, but using drivers from another OS has happened


The Github link there goes to marcan, who leads Asahi, but I want to primarily support this driver development. Is that an option? If so, where do I do that? If not, I'll just use this later today.


In Asahi Lina’s guest post[1] on the Asahi Linux blog she mentions “If you want to support my work, you can donate to marcan’s Asahi Linux support fund on GitHub Sponsors or Patreon, which helps me out too!”

So yes, donating to Marcan helps driver development too.

[1] https://asahilinux.org/2023/03/road-to-vulkan/


Thanks!


In addition to the other post: I don't think Alyssa takes donations but Ella Stanforth has a GitHub sponsor page for the Vulkan driver.


Asahi Lina is marcan’s alter ego that works solely on the graphics driver development.


> Unlike ours, the manufacturer’s M1 drivers are unfortunately not conformant for any standard graphics API, whether Vulkan or OpenGL or OpenGL ES. That means that there is no guarantee that applications using the standards will work on your M1/M2 (if you’re not running Linux). This isn’t just a theoretical issue. Consider Vulkan. The third-party MoltenVK layers a subset of Vulkan on top of the proprietary drivers. However, those drivers lack key functionality, breaking valid Vulkan applications. That hinders developers and users alike, if they haven’t yet switched their M1/M2 computers to Linux.

This becomes very obvious to anyone who has to debug now-deprecated OpenGL apps on macOS, because the abstraction layers do not actually expose OpenGL state in ways that old Apple OpenGL debuggers can read. Which means that unless you still have old Macs, with old versions of macOS installed where OpenGL is not Metal under the hood (I can't even remember the last version where that was true), it's literally not possible to debug OpenGL on macOS anymore using stock debuggers.

Your debugger and or app will just crash.


My blood pressure just shot up reading this.


I assume that this primarily benefits games and not deep learning, right? The most attractive aspect of the Mac M1 is the huge memory boost. It might not be great for training due to the inability to distribute across multiple cards, but it makes for a great inference engine for Stable Diffusion, LLaMA, and other large models.


There are two modern cross-platform GPGPU standards that Apple Silicon can theoretically use or implement - SYCL and Vulkan Compute.

SYCL is the Khronos Group's vendor-neutral, high-level programming framework. Application support is limited, but hopefully, with Intel's backing, the situation will gradually improve. Meanwhile, Vulkan Compute sidesteps the entire headache with compute shaders. But I'm not familiar with it in terms of application support.

SYCL can be implemented on top of OpenCL and OpenCL's SPIR-V extension. It soon turned out that this route is unfeasible due to prevalent vendor lock-in that's not going to change anytime soon, so it has largely been abandoned by everyone but Intel and Mesa. Right now SYCL is usually implemented by backends to GPU vendors' respective APIs, like ROCm, HIP or CUDA. Doing the same for Metal would be very challenging.

Mesa already has experimental support for OpenCL w/ SPIR-V on Intel and AMDGPU, so theoretically it could be extended to Apple Silicon. The difficulty of implementing OpenCL's SPIR-V extension should be comparable to that of Vulkan compute shaders (which also use SPIR-V). However, OpenCL on Apple Silicon is currently entirely unsupported. The last time I checked, it was on the roadmap.


The only problem with cross-platform standards is that they are never performance-portable unless they're so high level that their primitives have already implemented the algorithm X different ways for you.

For any low level performance programming you need to code to the specific microarchitecture, so the pros of a single programming language/library are limited (you're not getting any code reuse that isn't available in the top level non-hardware C code anyway) and often outweighed by the ability to take advantage of the vendor's dedicated extensions provided by their preferred programming mechanism.

This issue was well modeled by OpenCL, which never really caught on for programming Nvidia GPUs for this reason.


Correct. You need CUDA, ROCm, or MPS (native to macOS) backends for running deep learning. I found it relatively easy to train a PyTorch model on a beefy server with CUDA and run inference on my MacBook Air.


MPS is a Metal shader library rather than a programming language, which would be MSL (like GLSL/HLSL).
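
For example, a minimal Swift sketch (texture sizes and the choice of kernel are arbitrary) of using one of the prebuilt MPS kernels rather than writing any MSL yourself:

    import Metal
    import MetalPerformanceShaders

    let device = MTLCreateSystemDefaultDevice()!
    let queue = device.makeCommandQueue()!

    // Two RGBA8 textures; MPS kernels read from one and write to the other.
    let desc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .rgba8Unorm,
                                                        width: 256, height: 256,
                                                        mipmapped: false)
    desc.usage = [.shaderRead, .shaderWrite]
    let src = device.makeTexture(descriptor: desc)!
    let dst = device.makeTexture(descriptor: desc)!

    // MPS hands you a tuned Metal compute kernel; no shader source to write or compile.
    let blur = MPSImageGaussianBlur(device: device, sigma: 2.0)
    let cmd = queue.makeCommandBuffer()!
    blur.encode(commandBuffer: cmd, sourceTexture: src, destinationTexture: dst)
    cmd.commit()
    cmd.waitUntilCompleted()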



The compute shader portion is a good step but it's still not going to provide the interfaces most of these deep learning tools expect.

That said eiln wrote an ANE (Apple Neural Engine) driver which enables using the dedicated hardware for this instead of the GPU. It is set to be merged into linux-asahi in the future.


TensorFlow Lite does indeed support OpenGL ES.


"Of course, Asahi Lina and I are two individuals with minimal funding. It’s a little awkward that we beat the big corporation…"

While I'm glad they did this, and it's a crazy fucking accomplishment, it's not really a beating. It's just Apple doesn't care. They were never in the race to begin with.


"Of course, Asahi Lina and I are two individuals with minimal funding. It’s a little awkward that we beat the big corporation…"

Love the euphemism. This puts Apple to shame, plain and simple. They obviously don't care about standards, or compliance, because they like people to be walled in their own little private garden (still waiting for the FaceTime standard, or any kind of cross-platform technology created in the past 10 years).

If I weren't an iOS dev, I would have run away from the Apple ecosystem a long time ago. I love their hardware, and loved the brand back in the 80s and 90s when Apple was about creativity, putting humans before machines, etc. But what this company has become is just a corrupted mess of greed behind a curtain of politically correct marketing videos.


I agree with you completely, and yet it's practically impossible to find a worthwhile alternative. I tried finding alternatives on my last upgrade cycle, and it's like having to live with endless compromises just to get away from Apple.

For the iPhone, I tried looking at the Pixel for the Vanilla Android experience and long support, yet it seems like people are fighting battery life issues all the time. Not to mention the polish of the software, app ecosystem and stability.

Also, I couldn't find anything coming near the value of a baseline MacBook Air M1 as far as build quality, battery life, stability, etc. are concerned.

I like to also read comics and magazines on the iPad and that's also a market where I have no idea what an alternative would be. And that's the state of the tablet market for years now.

Maybe I could get rid of my Apple Watch, but it just works and has endless amounts of third party accessories. I've been looking at Fossil Hybrid Smartwatches and it seems like they are a hot mess of instability and bad support.

At the end of the day, it's about stability and ease of use. I'm way past the time where I had the time and found it really cool to try every new ROM coming out ("daily driver", "What isn't working? You tell me") and it seems like Apple still can't be beat at this front. Sadly.


Probably the closest option to the M1 on build quality and battery will be the AMD version of the Framework laptop... if you want, it can run Windows or Linux without issue. Will probably go that direction on my next purchase. Later this year and next year AMD are releasing new laptop CPUs/APUs that will absolutely kill on performance per watt; while they're a bit ahead of Intel at the moment, they're going to leap a bit further ahead if GPU perf matters to you. Not that Intel is asleep, just behind.

On the phone, I started with Android, so I'm just kind of used to it... I've bought a new one about every 2-3 generations for a while; currently on a Pixel 4a. I tend to avoid the high end, and find that if you wait 3 months or so after a new release, the kinks are usually worked out by then.

As to the iPad, there really isn't a good alternative that I'm aware of... there are still a few Android options, but none are great... the MS Surface tablet and other convertible laptops are okay, but still not as nice a UX. It's not my thing so it doesn't bother me, but I can understand why, if it works, it really works for you.

Watch is about on par, from what I understand.. again, not something I'm into personally.

I tend to take the Apple option for work (software dev, mostly web/svc oriented) only because corp Apple experience is generally better than corp windows.


I vote with my wallet.

The Framework laptop is my choice. It is not the best in every category, but it is mine.

I didn't like the weak hinges, I replaced them. I can replace the battery when I need. I can upgrade the memory or hard disk, or motherboard or screen when I need.

The thing with Apple is that their way of vertical tech integration results in highly polished, non-standard, unserviceable machines. They are nice, but not worth the trade-off in ownership for me.


I feel exactly the same about Apple's software. It's great as long as your preferred workflow matches the Apple-blessed workflow. But if it doesn't, you're pretty much hosed.

Desktop Linux is not as nice or facile or polished, but it feels like "my" desktop because I can modify it. When I use MacOS, it feels like I'm just renting somebody else's computer. It's a very nice computer, but it can never be mine.


I mean, that’s your definition of “yours” though.

It’s like saying that buying a condo isn’t home ownership due to an HOA having some oversight. Some people are totally fine with that, it doesn’t mean it’s less “theirs”.


I would in fact argue that exact point.

How much you control something directly relates to how much you own it mentally and practically.


I regret to inform you that your point would not be a winning argument with a significant portion of the population.


In terms of "premium finish" feeling, I think that the Starbook from Star Labs might be in the convo - they actually make their own chassis, unlike so many other vendors.

https://us.starlabs.systems/pages/starbook

(I know Framework isn't rebranding stuff - just throwing another in the mix)

Good luck getting one though, wait times seem bad every time I look.


My Starbook arrived about a year after I purchased it, and it was DOA.


>As to the iPad, there really isn't a good alternative that I'm aware of...

As someone who refuses to use Samsung phones because of the bloatware (and was using a Pixel 4a until the sim slot crapped out last week, of all things) I've actually been pretty happy with Samsung's tablets.


I still don't understand why people think in 2023 that Android cannot be a viable alternative to an iPhone. I would never use an iPhone because of certain software and UX issues (the biggest is the missing back gesture/button). My Pixel takes better photos than ANY iPhone on the market.

MacBook Air M1: just look at the Dell XPS series.

Samsung S9 tablets have an OLED display. OLED... you don't get that on any iPad.

The Apple Watch really has no competitor that matches it 100%. In certain areas like fitness (Fitbit) or hiking (Garmin) there are also some good or even better alternatives, but they don't match 100% of the features.

I think there is always a choice except when looking at the Apple Watch.


I had a top of the line XPS. Battery life is less than half that of the Air. The speakers in the XPS sound like they are from the 90s. It gets painfully hot on the bottom case. The keyboard wrist rest area is cheap plastic. It was thicker and heavier than the MBA. Intel Iris GPU (ie slow af) versus M1/M2 GPU.

One point to the XPS: the pixel density of the 4k 13" was absolutely LOVELY. I have never seen a screen so nice.

There is really no comparison overall, though: the Apple laptops blow them (and everything else in that category) out of the water.

The top end iPhones are similarly 2-3 years ahead of the flagship Pixel devices in build quality, too. I tried, really I did.


Same here. The XPS’s hard drive died within two years and I had to replace it.

The keyboard and touchpad also don’t hold a candle to MacBooks.


I bought a MacBook Pro in 2013 and my mom bought a Dell XPS at around the same time. Her laptop died, she got a Lenovo Yoga. That one died within a year and was replaced for free. The new one died after two years. When the M1 Air came out I gave her the old MacBook Pro, which she still uses every day, going strong (aside from battery life, but it's mostly plugged in).

It has essentially outlived 4 Windows laptops. I expect the M1 Air to still be relevant in a decade, as well.


I bought a mid-range Acer laptop in December 2012. I used that thing as my daily driver until 2018, when I finally got a desktop. In 2020 I repurposed that laptop as a server and it's been running ever since. For tens of thousands of hours.

My point is that Apple doesn't have a monopoly on quality. Perhaps I'm pretty lucky in this sense, but I have had great longevity out of all of my hardware, and I have owned very few Apple devices.


The M1 won't; the SSD will wear out before then, and with no way to swap it, the machine will be landfill as well.


In 2013, TechReport did an SSD endurance test: for months, it ran non-stop write operations, a much more strenuous use case than what a regular laptop would experience, which will be primarily read operations. 4 months in, after writing 300TB, all the tested devices were still working without issue.

If the characteristics of the SSD in an M1 are sufficiently similar to the SSDs that were used back then (I have no clue if that's the case), wear out will be a non-issue.
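
Back-of-the-envelope, that endurance figure translates into decades of typical laptop use. A minimal sketch of the arithmetic, assuming a fairly heavy 30 GB/day of writes (both numbers are illustrative, not measurements of any particular M1 SSD):

    # Rough lifetime estimate: how long until a typical laptop workload
    # reaches the 300 TB that the 2013 endurance test wrote without failures.
    endurance_tb = 300        # total writes survived in the TechReport test
    daily_writes_gb = 30      # assumed heavy daily write volume
    years = endurance_tb * 1000 / daily_writes_gb / 365
    print(f"~{years:.0f} years to reach {endurance_tb} TB written")  # -> ~27 years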


Since 2013 there have been multiple iterations of SSD technology that increased storage density but sacrificed storage durability: https://www.howtogeek.com/444787/multi-layer-ssds-what-are-s...

This is why people are concerned about SSD endurance. An old SLC SSD's NAND would be good for many, many writes, tho the controller would often fail. Nowadays the NAND fails, but the controller is fine.


Do the M1s have especially short write lifetimes? I have a Toshiba from 2010 or so and the SSD is still fine even if the trackpad and speaker ports died years ago.


After 2 years of use, my M1 MBP has 5% write life used. Extrapolate that out and you get 40 years of lifespan.
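
For anyone who wants to check their own drive, a rough sketch of the same extrapolation, assuming smartmontools is installed and can read the drive's NVMe "Percentage Used" attribute (support for Apple's internal SSDs varies by smartmontools version, and /dev/disk0 is just an example identifier):

    import re
    import subprocess

    # Query SMART data and pull out the NVMe wear indicator.
    out = subprocess.run(["smartctl", "-a", "/dev/disk0"],
                         capture_output=True, text=True).stdout
    m = re.search(r"Percentage Used:\s+(\d+)%", out)
    if m:
        used_pct = int(m.group(1))   # e.g. 5 means 5% of rated endurance consumed
        age_years = 2.0              # how long the machine has been in use
        if used_pct:
            print(f"Extrapolated SSD lifespan: ~{age_years * 100 / used_pct:.0f} years")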


I'd note that the 2013 MBP also has an SSD that hasn't experienced any problems so far.


Been running an XPS 9560 since 2017 and it's rock solid. Windows 10 with WSLv2. Basically the key to any Windows machine is finding the combination of drivers that is stable. That is the trade-off with Windows: they support a ton of hardware, so naturally driver quality varies. Apple's problem space is easy in comparison, one set of hardware, one set of drivers. A lot of the complaints about XPS hardware really boil down to the bad set of drivers Dell ships them with. As I get older it has become pretty clear the sweet spot is to be two generations behind the latest and greatest. You get the ideal balance of price, cheap replacement parts, and stability.

So not turn key, but also not a rip off prison like Apple :)


Isn't a good high density screen typically a large power drain? Perhaps there's a trade-off that was made there…


It's not included because it's really hard to make a high quality screen, not because of the battery tradeoff.


> Macbook Air M1, just look at Dell XPS series.

If you want an unusable trackpad, a middling keyboard, a fan that spins up and down at random when the laptop is just sitting there, a space heater for your backpack when you close the laptop and stow it away, and a pathetic battery life, then, yes, a Dell laptop is just what you need.

I'll admit, I have a more expensive Dell Precision laptop, so maybe the XPS is actually usable, but I'm not going to hold my breath. The one that I have is the worst POS laptop I've ever had the pleasure of being forced to use.


I had a personal XPS 13 back in 2015 or 2016 that I loved. Right now for work I have a Precision 5560 from 2021 which has all the drawbacks you listed and that I hate. I don't know if it's a year thing or a model thing, but certainly there is no brand consistency when it comes to Dell.


> I still don't understand why people think in 2023 that Android can not be a viable alternative to an iPhone.

I don't think most people -- even iPhone users -- hold that opinion. In the US, at least, the iPhone is still a status symbol. People have iPhones because they don't want their iMessage bubbles on others' phones to be the wrong color. They're locked into that ecosystem with various purchases and don't want to throw that away. They use a Mac and like the integration.

On top of that, I (as an Android user) am constantly uncomfortable running a mobile OS built by a company that exists mainly to track people's behavior and invade their privacy, with the goal of selling ads (and I am more vehemently anti-advertising than most people). As much as I don't fully buy "Apple's commitment to privacy", they are in a much better place in that regard than Android is. I lock my phone down and give nearly every app (including Google's) zero permissions, and only enable (and then immediately disable[0]) as necessary, but I'm still convinced my privacy posture would probably be better with an iPhone. But I don't want to live in that walled-garden nanny-state, so that's that.

[0] https://play.google.com/store/apps/details?id=com.samruston....


> I still don't understand why people think in 2023 that Android can not be a viable alternative to an iPhone.

Because Google... I could live with not having iMessage or AirPlay, that's annoying but something I could live with. So it's either a de-googled Android phone or iPhone, and I do need a few apps which are only available in the App Store or Play Store, so I figure I'm limited to phones that can run something like CalyxOS, which basically limits me to Pixel or FairPhone.

The FairPhone isn't a terrible choice, but I'm not going to replace a functional iPhone with it... maybe if it breaks, or I can get a used iPhone.

It's not that I trust Apple all that much, I just trust them way more than Google at this point. I don't think Google is evil or bad, but their interests and mine don't really align.


>I could live with not having iMessage

So, literally any messaging app, something that non-Apple users have to do anyways, and have to deal with your bullshit about only going through iMessage when they have to send you SMS.

>AirPlay

Chromecast is infinitely more ubiquitous. Also, if Apple didn't patent AirPlay and refuse to share it with anyone, you wouldn't be in this situation.

I will absolutely agree with Google being an absolutely dreadful steward of Android, but make no mistake: you gave Apple full support in locking themselves down in their own little playground, and now you're complaining you can't get out.


> Chromecast is infinitely more ubiquitous.

Chromecast is a device.

> Also, if Apple didn't patent AirPlay and refuse to share it with anyone, you wouldn't be in this situation.

Google Cast is just as proprietary as AirPlay. Both require licensing to be included in devices. I have an LG TV that supports both, an ancient Roku device that does the same, as well as supporting Miracast. I suspect you're confusing Chromecast and the Google Cast protocol with Miracast, an open standard; one dropped by Google in favour of their proprietary stack.


> Chromecast is a device

It's branded as "Chromecast built-in" when supported by a TV, not "Google Cast".

e.g. https://www.sony.com/image/89821bf64399cd4c34680e0988903e4b?...


Is that a recent rebranding? The SDK is still called Google Cast.

https://developers.google.com/cast.


It's probably a difference between developer facing and consumer facing, or software vs hardware branding.

Or maybe they just couldn't get a trademark for "Cast"?


My son bought a Dell XPS for exactly that reason. After 1.5 years, the battery life was at 50%. He called Dell support and they said it was normal. He's now in the market for a MacBook Air M2.


I can’t believe my M1 Mac is almost 3 years old, performs just like the day I bought it (blazingly fast).

My work Thinkpad from the same period feels half way dead.


That's likely more a reflection of the software you run (and update over time) than of the hardware itself, no?


It helps that the Apple Silicon CPUs run so cool. In other laptops your CPU cooler and fans will fill up with dust causing the CPU to thermal throttle.


This is a limit of battery technology. Your Apple laptop will have shit battery life in a few years as well.


I have an Apple laptop that is a few years old. It does not have shit battery life. Probably 90% of new.

Did you research this statement before making it?


I've owned several Apple laptops over the last 20 years for both personal and work use and they've all had reduced battery life over time. I've suffered one recall, another one which failed 1 month past the warranty and which the Apple Store said was quite normal(!), and several had degraded to the point I had to ensure they didn't drop down to below 20% (or some other magic number). After the first few I took into the Apple Store only for them to say "oh that's to be expected" I stopped going in. Not sure why people think Apple laptops are magically immune, they're not. They suffer battery issues just the same as other brands do.


Apple introduced a battery charge limiting feature on MacBooks in 2019 (or thereabouts), while Windows laptops supported it in the 2000s. That could explain the earlier experiences. https://support.apple.com/en-us/HT211094


Yep. Apple battery health in system preferences will show you how much the battery has degraded since the device was new. My ~2yo M1 MacBook Pro still has 92% capacity compared to when it was brand new.


I don't know if it's in System Preferences, but for me, System Information shows my mid-2015 MBP (running macOS 10.14.6 Mojave) still has a battery capacity of 8266mAh and a remaining charge of 8079mAh at 100%. Compared to the advertised capacity at launch of 8755mAh, that's a charge capacity of ~92% after ~five years (AppleCare replaced my battery during their recall). MacBooks are just built different.


Modern Macbooks are very strategic about when and how much they charge. You can override it if you know you need a full charge Right Now, but otherwise they will decide how much to charge based on your usage patterns, which keeps the battery alive much longer than on many other brands of laptop.


Apple's batteries are covered under warranty up to 1000 cycles. I had them replace my battery that hit 80% capacity at 3 years right before AppleCare ran out.


Device manufacturers can engineer for longer useful lifespan by oversizing the battery, can't they? Do they all do that to the same extent?


To an extent; there's a limit for air travel, generally speaking, since big batteries can be dangerous. The M1/M2 are just killer in terms of lifetime and usage for general reading/browsing/email, and still very long even for content viewing. Most people aren't rapidly draining their batteries, so the longevity gets to be a bit better overall.

AMD is getting pretty close, and the perf:watt of the coming generation(s) for laptops (including the integrated GPU) looks to be really impressive, to say the least...


They can also implement better battery management technology: cooling, charge-rate curves, keeping the charge between 10% and 90% instead of 0-100% while reporting 0-100% to the user via scaling, and so on.
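
A minimal sketch of that "report 0-100% via scaling" idea (purely illustrative; the 10-90% window and the linear mapping are assumptions, not any vendor's actual firmware logic):

    def reported_charge(actual_pct, low=10, high=90):
        """Map the real state of charge (kept inside [low, high] to reduce
        battery wear) onto the 0-100% scale shown to the user."""
        clamped = min(max(actual_pct, low), high)
        return round((clamped - low) / (high - low) * 100)

    # The pack never leaves 10-90%, but the user still sees 0-100%.
    print(reported_charge(10))   # -> 0
    print(reported_charge(50))   # -> 50
    print(reported_charge(90))   # -> 100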

For a good counter-example, look at the early Nissan Leafs. They burned out their batteries in a matter of a few years, but battery replacements for other brands from that time are basically unheard of. (The inherent information asymmetry for new car purchasers is one reason Biden's IRA dictated minimum car battery warranties.)


Funny, I heard the total opposite about Nissan Leafs. The industry was guesstimating that batteries would last 8-10 years. The first Nissan Leafs (roughly the first mass-market EV) had batteries where something like 90% were still going strong, with 80% of original capacity left, after 13 years.

Rather than the Leaf being problematic, it was the car that showed the market that worrying about the lifespan of EV batteries wasn't really necessary.


I had one of those early Leafs, the battery degradation was real. Newer vehicles seem to have much better curves.


The key to battery life/health on the XPS is to use the BIOS functions to limit charging. My XPS-13 9370 has been plugged in most of its life (about 4 years now) and battery health has dropped from 96% to 93%.

I can't speak to the rest of the comparison to the Macs - they're probably better overall - but the battery life is a solved problem if you know to limit charging.
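
On Linux the same cap can often be set in software rather than the BIOS, on laptops whose driver exposes the kernel's standard charge-control attributes (many ThinkPads do; Dell generally still wants the BIOS or its own tooling). A hedged sketch; BAT0 and the 75/80 thresholds are examples, and it needs root:

    from pathlib import Path

    # Cap charging via the power_supply sysfs interface, where available.
    bat = Path("/sys/class/power_supply/BAT0")
    (bat / "charge_control_start_threshold").write_text("75\n")  # resume charging below 75%
    (bat / "charge_control_end_threshold").write_text("80\n")    # stop charging at 80%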


That's clever, but probably too clever for the 99% of people who don't even know what a BIOS is. You've got to wonder why Dell wouldn't do that kind of front end work themselves.


When most people you know use iMessage then Android is a bad experience.

Also, as someone who switched from Pixels and other Android devices to the Apple ecosystem . It’s nice that everything “just works.”

It’s kind of like running BSD or Debian stable after having been on Fedora/Arch/etc.


> When most people you know use iMessage then Android is a bad experience.

Isn't this because Android uses open standards for its SMS and iOS refuses to do so?


RCS as a baseline standard is proprietary, but google then slapped a bunch of proprietary extensions onto it that it refuses to license, so no, it's not.

https://arstechnica.com/gadgets/2022/08/new-google-site-begs...

> Google's version of RCS—the one promoted on the website with Google-exclusive features like optional encryption—is definitely proprietary, by the way. If this is supposed to be a standard, there's no way for a third-party to use Google's RCS APIs right now. Some messaging apps, like Beeper, have asked Google about integrating RCS and were told there's no public RCS API and no plans to build one. Google has an RCS API already, but only Samsung is allowed to use it because Samsung signed some kind of partnership deal.

> If you want to implement RCS, you'll need to run the messages through some kind of service, and who provides that server? It will probably be Google. Google bought Jibe, the leading RCS server provider, in 2015. Today it has a whole sales pitch about how Google Jibe can "help carriers quickly scale RCS services, iterate in short cycles, and benefit from improvements immediately." So the pitch for Apple to adopt RCS isn't just this public-good nonsense about making texts with Android users better; it's also about running Apple's messages through Google servers. Google profits in both server fees and data acquisition.

Like c'mon google doesn't care about open-standards except insofar as that allows them to embrace-extend-extinguish. google's end goal is imessage but with google servers in the middle instead of apple ones.


> there's no way for a third-party to use Google's RCS APIs right now

To be fair, this criticism is fundamentally true of iMessage, too. Implementing all of iMessage's features in an open, trustless manner is impossible.


Thanks, glad I asked - I genuinely was not sure.


Sorry, it sounded snarky/pointed, that was more adversarial than was ideal. It really is hard to ask a question these days, if it's political...


This is correct. Google put a lot of effort into making carriers adopt RCS (which has most of the functionality of iMessage), but Apple will not adopt it to keep their "competitive advantage."

https://www.android.com/get-the-message/

I use an iPhone now but these kinds of business tactics and the others mentioned here really make me wish there were more competitive products on the other end.


RCS is pretty garbage without Google’s extensions. Of course this ends up being very similar to iMessage.


Does it matter for me as a user?


No, and I wouldn't want to imply otherwise. I was genuinely asking because my recollection is that the answer is "yes" but I don't recall.


I loved Android back when I had the time to hack around with ROMs and crazy customization. It's fun. But these days in my busy working life, I don't have time for that kind of stuff, I just want something that works well and gets regular security updates. My Android phones weren't really cutting it.


> I would never use an iPhone for certain software and UX related issues (biggest is the missing back gesture/button).

During my time as an Android user, that back button struck me as the single biggest anti-feature in the Android UX. Every application implemented it differently, sometimes I'd find it bumped me right out of an app if I tapped it once too often, others didn't do that. I hated the thing; it required learning a different set of mysterious tendencies for every app and situation. So happy that it's not stinking up the screen of my first-ever iPhone.


Stuff like the 911 calling bug is a great example: https://news.ycombinator.com/item?id=32713375


> I still don't understand why people think in 2023 that Android can not be a viable alternative to an iPhone.

Because some people, like yours truly, enjoy having a patched, up-to-date mobile OS but also don't need to change their phone every other year. My iPhone 7, which I bought refurbished in February 2017, still works perfectly after a battery change. It's on the previous iOS version, but it keeps receiving security updates. All the apps I need work on it (games may be too much for it, but luckily I have a PC with a big-ass GPU for that). My dad's Galaxy S7 hasn't had an update in a while. He tried to install 1Password, a freaking password manager which is basically a glorified notepad, and it says it doesn't support the phone and/or the Android version. His GS7 is working fine otherwise, though.

> Macbook Air M1, just look at Dell XPS series.

This has to be a joke. I can wholly understand people not valuing build quality and preferring to save money over that or invest it someplace else. But that doesn't make it "comparable".

Have they finally fixed the touchpad moving by itself or ignoring your finger? The fan spinning like a jet engine for no reason? I hear nowadays everybody's on the "modern standby" bandwagon. How do you like your battery draining 50% while on your commute home while the PC supposedly sleeps? Or waiting around for it to wake up from hibernation? That's if you're lucky enough it doesn't burn down your bag because it figured it's as good a time as any to wake up and do who knows what, which absolutely couldn't wait.

I won't comment on the ipad nor the watch, since I've never owned any of those.


FWIW, Apple does release _some_ patches for previous iOS versions, but there absolutely are mitigations that are not back ported to older versions.

If you care deeply about security, I'd recommend having a phone that is actually running an up-to-date OS.


I do care, that's why I'm thinking of getting a new one. But still, iOS 16, the first version that didn't support the iPhone 7, came out in September 2022. The iPhone 7 came out in September 2016. I'm not aware of any Android phone with such a long support cycle.


There is not even one that would have half that.


My android phone is from 2011. All the apps I want to run on it run on it. I know of some that won't, but I don't want to use them.

So what does this prove? Anything? Probably not.


I'd argue that games are a specific kind of app which require performance. It's likely that a brand new low-end phone supports the latest Android OS but not the latest power-hungry games. Though you can probably install them on it, just like I can install power-hungry games on my old iPhone; it's just that it won't be an enjoyable experience playing them.

A password manager doesn't have any such requirements.


Agreed Android is a suitable iOS replacement. Maybe a bit wonky at times but I’ve used both and they’re the same for my needs.

XPS is not comparable to M1. Not even close.


> I would never use an iPhone for certain software and UX related issues (biggest is the missing back gesture/button)

Um, swiping from the left edge takes you back on iPhone and iPad. Not sure how long ago you last tested the "missing back gesture" theory. I've been using an iPhone for 3 years and it has always been this way.


Not in several google apps — they deliberately break the convention.


just google ruining the experience as usual. same with youtube on iOS, they get away with it so why not try breaking the rules everywhere.


Android phones come loaded with ad tech, malware, depending on a matrix of carrier _and_ manufacturer. Even if they don't start with it, if they actually get updates, any update could bring it along later. It's a metaphoric minefield, where you have to do significant research before finding the current mine-free path. Samsung in particular has a reputation for lacking scruples.

It has nothing to do with hardware.


”…missing back gesture/button”

Swipe from the left of the screen goes back a page in the current app. Swiping right on the bar at the bottom goes back to the previous app used. Two different locations but the same gesture. What else is missing for a back gesture/button?


Back gesture on iOS only works within an app. Back button on Android works system-wide (cross-app).


Swiping along the bottom of the screen goes to the last used app.


It works but I prefer Android's intent based transition system.


What does intent based transition system mean? The advantage of Apple’s system is that it is unambiguous. I’ve heard from plenty of people that the “back” button or gesture surprises them from time to time.


Macbook Air M1 > Dell XPS or whatever alternative

Can't speak to other tablets, but iPads get the job done. I have one of the oldest generation iPad Airs that still gets updates. Got it secondhand. It does the basics well enough. I'm not sure an Android tablet would even still be getting that support.


If you care about privacy, then the only choice within the Android-sphere is GrapheneOS, which limits you to Pixels, which arguably are not on the same level in hardware as iPhones.


>I would never use an iPhone... missing back gesture/button

One finger swipe right.


I have a Pixel 3a, and longevity of a phone is important to me. But Android is cutting its support cycles down to 3 years. Unless something changes, in fall next year I will get an iPhone 14/15 (possibly refurbished) and keep it for 6-7 years. iOS 16 still runs on the iPhone 8: a solid 6+ years of official updates!

Walled garden or not, their products are solid and supported. That is what most consumers look at. No hassle ownership for most.


> I tried looking at the Pixel for the Vanilla Android experience and long support, yet it seems like people are fighting battery life issues all the time.

I have the opposite experience. A few months ago a friend of mine had to buy a new iphone because their phone couldn't hold a charge. Shortly after they received their new phone, towards the evening, they remarked how happy they were that their new iphone (literally a few days old) still had 36% charge after most of a day's usage. I looked at my 2.5 year old (at the time) pixel 4a and I still had 84% charge.

> having to live with endless amounts of compromises just to get away from Apple.

Having basic system functionality, such as GPU accelerated OpenGL, Vulkan, or OpenGL ES, seems like a catastrophic compromise. Like I can compromise about how the widgets of application foo and the widgets of application bar don't match each other, I couldn't care less. But no Vulkan support? Forgetaboutit.


> Also, couldn't find anything coming near the value of a baseline Macbook Air M1 as far as build quality, battery life, stability etc. is concerned.

I have a Surface Laptop 4 with an AMD processor and 16GB of RAM. I'm extremely happy with it, though I'll concede that the build quality is a click lower than a M1 MacBook Air (two of the rubber feet have fallen off of mine, being my main ding against it) and nobody can touch Apple Silicon's battery life. But aside from that, it's definitely in the same arena for build quality as an M1 MacBook Air (my wife's daily driver, so I'm not just saying that out of ignorance) with, IMHO, a better keyboard and a touch screen if you're into that sort of thing. Oh, and the facial recognition unlock is the best. I've had zero issues with stability—Microsoft pays more attention to squashing Windows bugs on their own hardware, it seems like.

Surface Laptop 5 has been out for about a year, but it was a totally microscopic incremental refresh. That means the Laptop 4's have dropped in price a lot, even though it's still 95% the same laptop.

My exact model of Laptop 4 can be had on periodic sale, refurbished on Woot.com for ~$700 or less, which is a screaming good deal IMHO for what you get. You could almost get two of these things for the price of one new M1 MacBook Air once you factor in the (IMHO mandatory) RAM upgrade.


A refurbished Macbook Air M1 with 8GB of RAM (which works wonderfully for day to day stuff, even development with VS Code) is $849 directly from Apple. I never felt any slowdowns. The thing is chugging along nicely and I don't feel a need for an alternative any more. Apple Silicon has turned the industry upside down IMHO.


Does anyone have experience how IntelliJ runs on this machine?


Yes, none of the JetBrains IDEs seem to be usable on an 8GB Mac. They invariably start throwing errors about running out of memory after a while, and then everything starts to lag (including typing).

Also, you can't really run any other process alongside them if it uses more than roughly 500-1000 MB (so no Node, for instance). Having a browser with more than a few tabs open alongside is also an issue.

Maybe a 5+ year old version might work better.


Thank you very much.


I have an AMD 5700U linux notebook for work. I'll second that they're really quite great on battery and speed.


> Oh, and the facial recognition unlock is the best.

This is scary as fuck. Combined with all the telemetry and tracking M$ is now doing without care from within its operating system.

Windows and Android are subsidized data collection applications which run on subsidized hardware. Simple as.


Across every consumer-facing device where it has ever been deployed—which is probably in the billions at this point thanks to iPhone alone—consumer-grade biometric face unlock technology has a body count of precisely zero.

If I turn out to be the world's first facial recognition casualty, that would be SUCH a hilariously unlikely way to go that I almost couldn't even be mad about it.


> Windows [...] are subsidized data collection applications which run on subsidized hardware. Simple as.

How does Microsoft monetize (or even could)? Are you just saying random things?

Also you could just install Linux, of course battery life would likely be even worse then.


LOL, how would you unlock an iPhone?


There's nothing valuable about your Windows telemetry. It just means if you hit a crash it's possible it'll get fixed.


Apple has enjoyed, and seems set to enjoy for a while longer, a full process node advantage over everyone.

People whiff that (cough RDNA3), and people overcome it kinda (Raptor Lake), but ceteris paribus, their shit is just from the future.

Having a bone to pick with Apple is a very reasonable thing, I’ve got a few gripes myself, but that’s why you buy something 2-3 years behind the cutting edge. It’s not because it’s better.


Based on reading Taleb, I have a theory and would love to hear counterarguments if there are any:

Products are only as good as their worst part. Apple's complete integration allows them to more easily fix the worst part, since they have control over almost everything. Historically, "open" systems, as Bill Gates would call Windows together with suppliers such as Intel and Dell, could get away with some bad aspects as long as they just threw in new CPUs with smaller transistors. Now that Moore's law and Dennard scaling have slowed down, fixing the worst part is the only way to find good improvements.


> Also, couldn't find anything coming near the value of a baseline Macbook Air M1 as far as build quality, battery life, stability etc. is concerned.

The ThinkPads are pretty good laptops, especially if you put Linux on them.


I used to be a 100% Linux person. I've gone through many Thinkpads. The M1 Air is so far beyond them it's not even close.

Battery life is (not exaggerating) 5x or more in my experience.

It's honest to god, much faster.

Completely silent and cool. (one of my thinkpads almost burned me it got so hot, and the fans get so loud)

I've never had a single crash, random restart, failure to sleep, failure to charge, driver problem, touchpad randomly not working, wifi failing, all of which I've had with thinkpads.


A 5x battery life improvement seems like the baseline had some issues, unless your MacBook goes like a week without charging.


My M2 MBA routinely goes past 10-16 hours of battery life with simple coding stuff (PhpStorm and PHP in Docker), and it stays comfortably cold all the time. Most Windows laptops struggle to get more than 3-4 hours and will fry off your balls if used as actual laptops.


The poster was apparently a 100% Linux person previously, though. I’m pretty sure Windows was designed to heat up like that as a funny joke. For example on Reddit I see somebody complain that they only get 5 hours with my laptop model (zenbook flip 13)

https://www.reddit.com/r/ASUS/comments/ry0nwb/the_asus_flip_...

But it is pretty easy to get 14 hours in Linux I think, at least if you believe the battery indicator.


To be fair, Windows has better battery optimisations for new laptops. Many hardware-supported sleep modes were still missing from the Linux kernel, for example, when I last checked.

Huge differences probably come from lack of user skill more than from the OS. Or just broken drivers.


Isn't battery life generally much worse on Linux compared to Windows?


It is pretty configurable. I don’t see why it should be worse (assuming of course you don’t have driver problems).

And the configuration is pretty helpful, for example I have an OLED screen, so I can get some power savings from making things mostly black.

Plus the hard drive can mostly be idle; you don’t have Cortana or whatever they call it now poking around for interesting bits.


> I’m pretty sure Windows was designed to heat up like that as a funny joke

Nah, it's common across all x86 devices. Even Apple's old lineup... which is why they went for M in the first place, Intel couldn't be arsed to deliver something power efficient.


I guess I find this troubling because it would seem to indicate that I spend multiple hours a day typing at a dead laptop, hallucinating that it is still working.


My most recent Thinkpad was an X1 Carbon with Kubuntu; running IntelliJ / browsers / Docker, it would last around 3 hours or so. The M1 Air is 15+.

I also have an Anker 737 battery; with it I can double the macbook's battery if fully charged. The Thinkpad would only charge partially, so it wouldn't even double.


That’s weird, I wonder which program did it.

I typically get a day of work out of my Zenbook flip 13; I haven’t really measured the battery performance rigorously because it is easily long enough that I don’t think about it (the battery indicator will say 14 hours, but those are of course pretty flaky). I’m a vim/Firefox with ads blocked guy though so I guess I must not be making it work very hard.


>driver problem, touchpad randomly not working, wifi failing, all of which I've had with thinkpads.

Those are Linux driver issues, not HW issues.


Incompatibility is a two-sided issue, there are enough laptops out there that work perfectly fine in Linux. But a brand isn’t a technical promise, they’ll miss with some models and hit with others.

People always complain about Lenovo in these threads, I think because they are held up as the “good non-Apple laptop” brand for whatever reason. I suspect this reputation makes people assume they can just grab any random model and it will work perfectly. That’s just a roll of the dice, maybe weighted in favor of working, but still random.


>But a brand isn’t a technical promise, they’ll miss with some models and hit with others.

Sure, that's why there's now a dozen Laptop brands that ship with Linux compatibility in mind.


> Those are Linux issues, not HW issues.

The touchpads on my wife's T470 and my T480 have very similar intermittent issues, despite her running Windows and me running Linux.


That only matters in a technical sense -- as a consumer, I want a flawless out-of-the-box experience. I don't care if X isn't working because of Y or Z; I only care that it isn't working.


They are very likely hardware issues that the Linux driver is simply not working around rather than being actually wrong.


And when I can’t join a call because the Wi-Fi has stopped working I’m sure everyone will appreciate that distinction


Did Lenovo sell you the laptop with Linux compatibility explicitly stated?


If you're using Linux and there are no good, reliable drivers for your machine, it might as well be a hardware issue.


As a user, I don't really care about the root cause..


If you're too lazy to make a Linux driver for your hardware, it's a hardware issue.


Maybe some HW companies don't have the budget to write drivers for the 3% userbase that is PC Linux users, especially since most commodity HW is in a constant race to the bottom in terms of pricing so profits are slim as it is.

So better to check whether the HW you're buying is compatible with the SW you intend to use, especially now that there are almost a dozen laptop brands selling Linux-ready laptops. You don't buy an Xbox hoping it will run your Nintendo games collection and then blame Microsoft when you realize it doesn't work, do you?

And calling those driver devs "lazy" is a huge slap in the face, especially if you knew how overworked and underpaid people in that industry tend to be, as the profits are also very small. Not everyone is rolling in cash like Nvidia, AMD and Intel.

This sub can be quite pretentious at times.


> The ThinkPads are pretty good laptops, especially if you put Linux on them

I used ThinkPads for 7 years until getting new M2 Pro, mostly with Linux.

The touchpads on Thinkpads don't come even close to Macbooks'. You need an external mouse.

Also the basic screen quality on current Macbook Pros is beyond their top-quality products. Try to find a 1000-nit screen: not even gaming laptops have quality ones.

And battery life...

And performance...

When you have enough performance on your machine, the physical touch, screen, and overall stability matter more than everything else. Thinkpads have better keyboard durability though.

I also used to have one of those OLED Thinkpads and that was the biggest mess I ever had. They even cancelled OLED screens on all products for 4 years after that. The screen just broke every month.


The nipple on the ThinkPad is unsurpassed. It's basically the reason I can't migrate away from them.


I have tried to use the nipple many times, but maybe I was unfortunate with it. I was not able to get good enough drivers for it on Linux, and as a result it was never accurate, but very clunky instead.


This was on Linux. It takes a bit of getting used to, but I won't go back unless I'm absolutely forced to.


I agree that I liked it on Windows. But I was not able to get the same experience on Linux.


My brand new work thinkpad is total garbage compared to my two year old MacBook Pro. Only laptop where I need to use an external mouse.


Yes, and the fan blows directly onto your mouse hand, making mouse usage awkward as well.


Actually this is my favourite feature of my work thinkpad. Especially during winter. Really. Besides tons of disadvantages of the platform for the price.


Agreed. I got a gen 10 Thinkpad (with Linux). The power supply makes a crackling noise. After 6 months the fan started making grinding noises. The screen flickers when the CPU is loaded, and once a week the whole thing just locks up. Worst laptop I've owned.


Did work install any crapware on the laptop? Or is it bad even without?


The software is fine. The computer isn’t slow at all. It’s just not a well designed computer.


I've used Linux+Thinkpads for years.

W-series, P-series, top of the line

Switched to Mac M1 this year, and....longer battery life, better performance, higher resolution, brighter screen...it's not even close.


I have an M1 Pro 14 and a work-issued P14s, which is awful. Creaky plastic, the worst trackpad I've ever used, has a terrible display, spongey keys and runs unfathomably hot. All. The. Time. Every time I see someone recommending Lenovo, I cringe. It is night and day when compared to the MacBook Pro.


Weirdly, I have both of these as well and feel the total opposite. Give me the Thinkpad keyboard any day of the week. The Mac keyboard feels downright anemic.

I wish Mac would stop making the trackpad so damn big though; the amount of palm activations I have on that thing drives me bonkers.


I'm intrigued by this. I'm typing this one-handed on the MacBook with my other hand resting on the trackpad without interference of any kind. The hinged trackpad on the P14s takes between 5-10 minutes to be useable from a cold start (these happen often due to the combination of an anaemic battery and power-hungry Intel processor); it's as though it needs to warm up. It is a pile of overpriced junk not worth the value of the parts it's made with. I've had other ThinkPads and generally disliked them - even the X series, but that boiled down to personal taste - not a fan of the aesthetic - and a crap trackpad. This P14s, though, is unmitigated shite.

My experience of the P14s says that either I have a dud (other colleagues complain vociferously about them, too), your MacBook is defective, or both. None of which are ideal!


I have zero love for the trackpad+keyboard.

But 80% of my usage with external keyboard and mouse.


I've found Thinkpads to be trash for the past few years. Could be bad batches, but I sent back my X390 three times for warranty repairs and its replacement T14gen3 once so far.


Tachiyomi makes Android significantly better for comics/manga than iOS.

It's one of the reasons I sold my iPad for a galaxy tab s7+. Every alternative I tried on iOS was just shit in comparison, paperback especially.


> yet it seems like people are fighting battery life issues all the time

I miss the days when I charged my cell phone twice a week.

It didn't take 100 megapixel photos or display 4K cat GIFs, so I understand why.

But I miss that battery life nonetheless.


You can still buy basic phones that last all week, they're very cheap.

But nobody wants a phone, they want a tiny portable computer.


> But nobody wants a phone, they want a tiny portable computer.

Yes, including me. I want everything :/


A modern smartphone will last for literally weeks if you turn off all data and only use it for calls, the batteries are massive to support streaming video and playing games.


It's largely the background data-hungry apps and the obsessive screen time with foreground apps. As someone who almost uses their phone like a dumb phone, I charge my Pixel 5a about once every 7 days now, when it gets to around 20% remaining. It was more like 10+ days when I first got it, but that was also before I decided that avoiding the 0-20% range might be better for long-term battery health.

The most power-hungry apps I run are Slack, the built-in GMail and Messages clients, the Garmin app to periodically sync with my watch, and sometimes Firefox if I actually spend time browsing there instead of on my preferred laptop environment. The camera app also eats power when actually taking pictures or video, but I guess I rarely do.

When I first start using a phone, I go through a little effort to disable things I don't want via the system apps menu. For me, that includes the Google Assistant and their native launcher, because the last thing I actually want is to trigger search functions willy-nilly. If I want to search, I'll open Firefox and search...


I have found that my newest phone (Samsung S23) has significantly better battery life than my old Pixel 3a ever did. I really have to try hard to run the battery down and with my normal usage it easily lasts two days.


I only charge my iPhone 12 about every 2 days as well - at least with normal usage. But I can also easily run it down with an audiobook playing all afternoon over Bluetooth. No idea why that uses so much power.

When I’m at home I’m so close to charge points all day that I don’t worry about it. And when I’m travelling, I bring an external usb-c battery pack that can fully charge my phone about 6 more times or give my laptop another few hours of use.

Sure, more battery life would be strictly better. But I’m happy with the state of things right now.

Apparently the EU has legislated toolless user replaceable phone batteries by 2027. I’m curious if apple plays ball and if so, what iPhones will look like in a few years.


> Apparently the EU has legislated toolless user replaceable phone batteries by 2027.

The legislation calls for phone owners to be able to remove batteries “with the use of commercially available tools and without requiring the use of specialised tools, unless they are provided free of charge, or proprietary tools, thermal energy, or solvents to disassemble it.”

Essentially, no glueing the screen to the battery, which is sensible. I take umbrage with the "without requiring the use of specialised tools, unless they are provided free of charge". Manufacturers should make available specialised tools commercially and be allowed to request deposits for specialised tools by individuals looking to perform one-off repairs to ensure return. Otherwise, it's reasonable. I doubt we'll see a significant change in the form factor.


I charge my Pixel 5 pretty infrequently, couldn't give a number though. I leave it in battery saver mode automatically when it's below 70%, and often I will just plug it in to a quick charging cable (the one from my steam deck actually) for 20 minutes to "top up" (may not fully charge but who cares). I only notice my battery life about once every other week. Other than frequent photography, I guess I don't do very much intensive processing on my phone though.


I don't get these battery issues people have. I use a 3 year old Huawei P30 Pro, and I still get 2 days of battery life when charged to full. Apparently I average around 4 hours of screen time per day.


I recently bought a Sony Xperia 10 IV and the battery easily lasts 4 days. Performance is great too, at least for my usage (I don't play games on the phone though).


> I've tried finding alternatives on my last upgrade cycle and it's like having to live with endless amounts of compromises just to get away from Apple.

Everything is a compromise.

A while ago, I wanted a new notebook:

I looked very hard at a 16" M1 Pro with 64 GB of RAM, at approximately USD 4000, tiniest storage possible. I really really wanted to run this with Asahi Linux.

I purchased a Dell Inspiron 7610 with 16" display (and known touchpad design defects), 3K resolution, sort of light-weight, Tiger Lake 11800H CPU, Intel + Nvidia hybrid graphics, now running at 64 GB of DDR4 RAM, 2 TB of very fast PCIe 4 SSD, 1 TB of PCIe 3 SSD.

Professionally I run this with Windows 11 Professional + VMware Workstation -> Fedora 38 because of an Azure VPN; in my private life this is plain Fedora 38 dual-booted.

Why? USD 1700. Less than half the cost.

Native podman / docker. CUDA. The (still) dominant architecture (x86). I am typing on Proper Keyboards anyway.


>I tried looking at the Pixel for the Vanilla Android experience and long support, yet it seems like people are fighting battery life issues all the time.

My Pixel has good battery life... I know it's anecdotal, but it's been fine.

I've always been an android user on smartphones. Was samsung in the past but now I have a pixel. I like the pixel and vanilla android experience overall. Android allows some freedom to customize, I like their call screening, and the wide choice of apps. Pixels are also starting to be supported for longer; maybe not as long as iphones but it's always been an android issue and they're finally starting to address it. Stability is there imo. I have all the apps I need to do what I do everyday but obviously everyone is different.

I own a variety of different hardware including apple products, but I'm not 100% reliant on their ecosystem, so I'm keeping the pixel for now.

>Also, couldn't find anything coming near the value of a baseline Macbook Air M1 as far as build quality, battery life, stability etc. is concerned.

Agreed


My strategy is to go all-in on Apple as my daily drivers, and then to tinker with Linux in my spare time. For example, my main phone is an iPhone, but I recently bought a used OnePlus 6 with the intention of installing NixOS on it. My laptop is an M1 MacBook Air, but I code in Linux VMs, and have a Linux NAS at home.

As many of my services as possible are self-hosted, and as many of my apps as possible are FOSS, but I access them with Apple hardware. And as my M-series Macs age, they'll become Asahi machines.

This is the best compromise I've found. The truth is, Apple stuff still does Just Work, and having an actual Unix base is nice too. Open standards and protocols can get you a long way away from the walled garden aspect of Apple to the point where really they're just nice computers that don't really infringe on my life in any way.


On the app development side of mobile, I find Android so much more messy to develop for. It has its perks like the Jetbrains IDE and ability to use bleeding edge libraries (Jetpack), but those don’t counteract the downsides like needing a laundry list of third party libraries to do practically anything, there being no well-supported vendor-preferred “happy path” for various things, Java ecosystem baggage, fighting Proguard, etc.

And that doesn’t even get into the “fun” of the differences between the versions of Android shipped by different vendors being significant enough that maker- and model-specific bugs and behavior inconsistencies are a concern, which is only a thing because of manufacturer insistence on deep customization (compare to Windows, where if it runs fine on your PC, it probably does for 99%+ of other PCs too).


Apple is an ecosystem much more than Windows is. Devices work together and enhance each other. So jumping to Linux isn't easy: everything has to change.

It's more like emigration or a divorce; it will hurt and it's a big change. For most it's not worth the bother.


I've found that ChromeOS + Pixel 6 is fine. I have a Fitbit Sense 2 for a watch (I hate the idea of my watch telling me I have a meeting coming up lol).

ChromeOS is great. The host OS "just works" and I have a VM/container env to do development in. It feels secure, operationally straightforward, and low maintenance.

I don't really care a ton about my phone. It works, battery is fine. It makes phone calls, plays music. idk. I care way more about my laptop.


> Also, couldn't find anything coming near the value of a baseline Macbook Air M1 as far as build quality, battery life, stability etc. is concerned.

And the trackpad! I always evangelize the Macbook trackpad because that shit is bananas, as it were.


I used to think that, but my latest Asus laptop has a trackpad just as good as the macbook (which I use professionally). Actually, I like PC trackpads more, just because they tend to be reasonably sized rather than the monster trackpads on most macbooks (although, fortunately, they seemed to not be quite as big in the M1/M2 iterations!).


I'm sorry, but I just don't buy that it's as good as the MacBook. Can you detail what model Asus laptop you're using? I'm legitimately curious what the hardware is.

The only trackpads I've found that feel close to Apple's on a hardware level are the ones produced by Sensel (https://sensel.com), and those aren't in every laptop. Then you have the driver situation on top of it, and Apple's tight vertical integration just always seems to give it the edge: I literally never experience invalid taps or movement, etc.


https://rog.asus.com/us/laptops/rog-strix/2021-rog-strix-sca...

Not only does it have a great trackpad (with an ability to become a numberpad), it also has an optomechanical keyboard, with "blue key" like feel. Plus a 3080 in it for games. It also has great cooling, so the fan only is on if you're hitting the 3080 hard.

It is certainly the most fun laptop I've owned (it has LED backlights for the keyboard, too)! Granted, it runs Windows, but Win11 is pretty nice, honestly. No idea how amenable it is to putting Linux on.


Sometimes I think the biggest secret in this industry is that 2 people can outproduce even the largest companies. The only thing keeping everyone's job secure is that nobody can accurately identify which 2 people.


Isn't that often a case of "What one programmer can do in one month, two programmers can do in two months."?

Companies with all their bureaucracy are often slow, but that bureaucracy also isn't entirely superfluous, and frankly I don't know if any of this was on any given company's list of things to do. Not sure anyone beat anyone to the punch here.


Yes, it is. Such a case. Team size should depend on what needs to be implemented.

For “boring” and “well-trodden” stuff that people can just do on autopilot, sure, that scales.

For projects with zero obvious paths, an experienced and lean team is necessary.

I have wasted too much time on superficial CRs and on hand-holding people who didn’t provide value.


Yeah, that's definitely not true. You'll know when you interview them and they have extensive contributions to open source, an active GitHub or GitLab account, can show you things they're proud of having made in their spare time, can talk at length about technologies and implementation details none of your other candidates can, can show you their participation in mailing lists, etc.

There are a billion obvious signals compared to the people who clock in at 8 AM, sign out at 5 PM, and argue with your teammates about problems that entire swaths of engineers consider rudimentary: understanding how to write patches in readable ways, not screwing up git logs, not submitting PRs 10k SLOC long with no explanation as to how the automated output was created for replication and verification.

There are obvious clowns in the industry, and people who really love doing this stuff, and you can figure out who they are in a 5 minute conversation.

It really insults people when you tell them this, because yes, there are absolutely people that work harder and longer than you because they want to and enjoy it--they don't just go home and watch Netflix.


This is in my opinion very wrong; you will end up just selecting for a subpopulation which might or might not have the qualities you are looking for.

You seem to be confusing volume of work and passion for skill. Some of the best engineers I know are very diligent about work-life balance (I guess you'd call that clocking in/signing out), and do not have a publicly visible GitHub because in their spare time they are either with their families/communities or enjoying other non-tech hobbies.

A good programmer (from an employer perspective) is first and foremost a professional; passion and loving the work are good but can only get you so far. I know a lot of passionate programmers who basically suck.

> There are obvious clowns in the industry

Agree, and the worst seem to be the ones thinking they have this magical/predictive ability to discern talent based on their limited life experience.


People who do more will, by definition, have more experience than those who do less. The engineers who spend their time reading and working on problems because they want to, creating larger volumes of work than those who only work an 8-to-5 job will generally always have a larger breadth and depth of experience.

Saying the inverse is true is highly unlikely over large populations of individuals across any field, not just software development.

You're telling me that you think musicians who are "very diligent" and practice less can be "some of the best" you'll ever have exposure to compared to those who are working in music all the time, just because they like it?

It's a delusional concept. If you want to be good at anything in life, you will end up spending more time than other cohorts in any given discipline. But for some reason in tech, people like to believe that's not true because "work life balance."

Did it ever dawn on you that some people just like to write all the time? Or produce music all the time? Or paint? Or sing? Or act?

"Very diligent" people doing less than "very diligent" people doing more will generally always have less experience and skill.

There's nothing meaningful to argue here. Some people delude themselves into thinking it's sacrifice and that some people have to give up "work life balance":

The reality is that there are populations of people across all sorts of disciplines where giving up more time doing X, Y or Z isn't sacrifice; it's because people genuinely enjoy doing more than others, spending time becoming better than others, and producing more:

And yet, that scares people, and people like to deny that it's true instead of acknowledging their own mediocrity.


I think it's more that there are multiple spectrums here and a lot more than two buckets to put people in.

Sure there are very passionate people who enjoy programming so much that they wish to do so far outside the standard workday.

Some of those people are also very highly skilled, experienced and organized.

There are also people who work a standard day, push Jira tickets around, and try to blend into the organization and hope nobody really questions how much they personally get done.

There are also plenty who have good work/life balance and are also skilled, experienced, good problem solvers, good communicators, and very valuable to have on your team. They might have some measurable productivity loss compared to your ideal, but probably not by 2x or 10x.

I have also run into several of the ultra-focused passionate folks who will stay up all night hacking at a problem to make it work, who produce prolific line-counts of code, and fall very much into your camp of 10k line indecipherable PRs that are definitely not going to be maintainable long term in a team or organization.

You can try to correlate some of these factors together, but it's perhaps not as simple as your original comment presented.


> People who do more will, by definition, have more experience than those who do less.

You're assuming skill rises meaningfully with just volume of experience. Who's a more skilled driver? The plumber that drives his van around to jobs all day and does about 1K miles/month or the guy that mostly rides his bike except for weekends when he's taking a defensive drivers course or going to a track day?

I'd bet $$$ that the plumber spends a lot of time checked out / in zombie mode on the freeway between jobs and the motor-sports enthusiast is hyper diligent when driving.

> Saying the inverse is true is highly unlikely over large populations of individuals across any field, not just software development.

Perhaps, but this is - again - because you're missing the point; meaningful advancement in skill comes from experience gained while attempting something an individual is new to/uncertain/uncomfortable with and not the same thing that the individual has done a thousand times before.

> You're telling me that you think musicians who are "very diligent" and practice less can be "some of the best" you'll ever have exposure to compared to those who are working in music all the time, just because they like it?

Yes. A simple counterfactual: not all musicians that practice 18 hours a day become successful. There's a lot of work in being the best, absolutely. But some people have some fantastic genetics/general-upbringing/predisposition to leverage. Same thing with sports. There are comedians that you've never heard of that spend more time writing jokes than world-famous comedians do.

> Did it ever dawn on you that some people just like to write all the time? Or produce music all the time? Or paint? Or sing? Or act?

Absolutely, but the people that do $thing all the time _and get better at it_ are the people that are constantly looking to $difficulty++ on $thing. I love reading and I'm always getting better at it because I don't stick with the same language/length/difficulty level all the time :).


A video worth a thousand words: https://www.youtube.com/watch?v=nMEzr5uvrNM


>> People who do more will, by definition, have more experience than those who do less. The engineers who spend their time reading and working on problems because they want to, creating larger volumes of work than those who only work an 8-to-5 job will generally always have a larger breadth and depth of experience.

Nope. Some people spend a lot of time on something because they need more time to get it right. Hopefully they do improve over time.


You continue to equate volume and time with quality of work. But still...

> People who do more will, by definition, have more experience than those who do less.

Disagree. The quality and intensity of work are as important as its length. 100 hours of focused work beat 1,000 hours of distracted, unfocused work.

It also depends on the type of work: I would argue that someone who does 50 hours of Haskell and 50 hours of C++ has more experience than someone who does 150 hours of just C++.

It also depends on the type of experience. Sure, maybe (a strong maybe) your lone developer with 1,000 GitHub stars can produce more code, but being a professional programmer is more than just producing code. Communication, planning and general don't-be-a-douche-ness are also important. A parent who works 8-to-5 and spends the rest of the time managing a family would score higher on those metrics.

> The engineers who spend their time reading and working on problems because they want to, creating larger volumes of work than those who only work an 8-to-5 job will generally always have a larger breadth and depth of experience.

Same remark here: it depends on the focus and intensity of the work. It's well documented that after 40 hours a week, the quality of work and focus tend to degrade.

But even more important, you are assuming that the time the "8-5 people" spend not working on computer-related things somehow doesn't count as experience and cannot synergistically enhance their professional work.

At the base level, you have the foundational health-related things like a good diet and proper rest, which do take time. But there are also well-known effects like ideas popping up when one gets some distance from a given task.

More importantly, cultivating other interests stretches one's mind and has some interesting effects.

> Saying the inverse is true is highly unlikely over large populations of individuals across any field, not just software development.

Strawman, not really gonna touch this.

> You're telling me that you think musicians who are "very diligent" and practice less can be "some of the best" you'll ever have exposure to compared to those who are working in music all the time, just because they like it?

This is still a strawman. But at least it's more interesting.

To answer the question: Yes... it's called talent, training quality and genetic predispositions.

If long hours of work were the only thing required, the profession of coach wouldn't exist. In sports, most of the top players are very motivated people with ungodly work ethics and drive... They still invest in personal coaching because just "doing" is not enough; doing the right thing, the right way, is also very important.

> It's a delusional concept. If you want to be good at anything in life, you will end up spending more time than other cohorts in any given discipline. But for some reason in tech, people like to believe that's not true because "work life balance."

> Did it ever dawn on you that some people just like to write all the time? Or produce music all the time? Or paint? Or sing? Or act?

I see this as faulty logic on multiple levels. Even if I grant you that hard work, passion, etc. are strongly correlated with excellence, it doesn't follow that hard work and passion are good selection criteria when looking for excellence.

A good analogy is height and basketball skill. It's pretty clear that being tall helps; some might even say it's required. But within the NBA (or any other organization of professional basketball players), nobody drafts people based solely on height. One might even say that the relationship between height and skill in the NBA is fuzzy at best.


Supposedly there is good data to suggest that people who work 10% longer make 40% more money. Now, that's not necessarily causal or anything, but it doesn't have to be. From a hiring perspective, it just has to be true. Unless you suspect that a candidate has scammed his previous employers, it is rational to prefer candidates who, based on their hours worked, are more likely to be effective at making money. Making money is generally the business of business.

"The U.S. Bureau of Labor Statistics reports that the average person working 45 hours per week earns 44% more pay—that is, 44% more pay for 13% more work"


Yeah, that describes me! I’m still insulted by this, because I have worked with plenty of people who basically do not exist on the internet but are far smarter than I will ever be. You’re wrong, plain and simple.


And a lot of times companies won't hire these passionate people.


They don't even know who they are because B-players only hire other B-players.


They know who these A players are but are afraid to hire them.


A rational economic system would eliminate the distinction between education and production so that people are constantly learning/improving while occasionally contributing to massive breakthroughs. We systemically under-develop every single human being on a global basis right now.


Everyone deserves an experience like that, at the very least – everyone deserves to be in a position where they have the opportunity to really enquire, or engineer, or know what it's like to solve new problems. It really is a good purpose that education could meet.


I'm frustrated that "good management" involves breaking things down so small that there is no room for enquiry or engineering. Jira ticket #5229 "Port get_order_* API" doesn't leave much room for problem solving.


It’s almost like business needs come first


Is there a world where we put people's needs first?


Really? There are tons of people who learn while working, which is what I assume you are advocating for rather than literally "eliminat[ing] the distinction between education and production," which would include school children working. This is especially true at the top end of fields like medicine. Surgeons create and learn new surgeries, for example. And I don't think Sergey Brin took a course on how to create Google in university, and the Google engineers did not learn how to scale it there either. See also the massive research arms of Google et al. So "every single human being" is trivially false.

But even in the general case, college does not directly prepare you for your profession (nor is it meant to), and you are expected to learn on the job. And if, as you say, people were not learning while working, there would be no reason for companies to seek employees with experience.


> ... 2 people can outproduce even the largest companies.

Only if they're not including documentation. Or maybe only very, very, very minimal documentation.

If you want full documentation, translated to a bunch of languages, then (for now at least) you'll need more than those 2 people.


Fortunately, in the case of a GPU driver, you don't need a lot of documentation besides the standard it implements...


> This puts Apple to shame, plain and simple. They obviously don't care about standards, or compliance, because they like people to be walled in their own little private garden

Is everyone missing the part where Apple left the door open for other operating systems and development thereof, when it would have been relatively trivial for them to lock the laptops down?

They literally made their own silicon and built an entire platform. Do you think it's a mere mistake that they left it open to running other operating systems?

It's really disappointing to see everyone bashing Apple when it's clear to anyone paying attention that they made a conscious decision to leave the door open for 3rd party development.


> It's really disappointing to see everyone bashing Apple when it's clear to anyone paying attention that they made a conscious decision to leave the door open for 3rd party development.

Maybe I'm just a jaded grey-beard but I suspect that this is more of a "placate the anti-trust regulators" play and not a genuine olive branch offering.

Apple gets to say "see, look! Not only are we not locking people out - there's a whole micro-niche community that's taken root. If that isn't proof we're not abusing our position, I don't know what is..."

They left the door open, but just barely. The reverse engineering efforts will always be a step behind, ensuring that there's always going to be a "non-Apple" experience that will be objectively inferior in one way or another.


I can’t imagine any reasonable argument that would make Apple be a target for an anti-trust action for _Macs_.

I understand skepticism and not always giving corporations the benefit of the doubt, but they _clearly_ spent a lot of time and resources to make third-party OSes viable on Apple Silicon Macs.


They did. Which is why it’s so baffling that they didn’t document any of this stuff. 5 minutes of documentation by Apple engineers on the boot process or GPU would have saved 5 hours of reverse engineering work by the Asahi Linux team.

Seems to me like they can’t decide whether they want Linux on their hardware or not. I bet different people in the org are pulling in different directions.


It’s not baffling at all. Opening the boot chain is work, but making presentable documentation is a lot more. It’s not 5 minutes of work: it’s years of checking the licensing on everything, designing stable APIs that are fit to publish, supporting them, having engineers working on this. You can’t just throw your internal “G13G scheduling pipeline” docs over the wall.


> _clearly_ spent a lot of time and resources to make third-party OSes viable on Apple Silicon Macs.

This actually isn't clear to me -- can you explain? Besides keeping an open bootloader [0], I'm not aware of any affirmative actions Apple has taken.

[0]: https://github.com/AsahiLinux/docs/wiki/Open-OS-Ecosystem-on...


The open bootloader didn't magically appear one night in Apple's git repository.

It boots in a notably different way than iOS machines do, and has some (AFAICT) pretty unique capabilities, including a fully-verified signed-boot of macOS partitions, while allowing third-party kernels at the same time.

Asahi's "Introduction to Apple Silicon" [0], and specifically "Security modes, Boot Policies, and machine ownership" paragraph outlines some of that, Apple's "Platform Security" [1] whitepaper does too.

Asahi's docs also explicitly state the same thing [2].

If you still don't think that shows significant amount of work and care were put into deliberately allowing third-party OS's to work on those machines, I don't think I can convince you otherwise.

[0]: https://github.com/AsahiLinux/docs/wiki/Introduction-to-Appl...

[1]: https://support.apple.com/guide/security/welcome/web

[2]: https://github.com/AsahiLinux/docs/wiki/Apple-Platform-Secur...


There is also no precedent for Apple making any kind of proactive design choices around future regulation. They are clearly the kind of company that does what's best for them and, when asked to change, nudges in that direction and then moves on. This is in the DNA from the top down. It would certainly be weird for the decision about third-party OSes to be about that.


> I can’t imagine any reasonable argument that would make Apple be a target for an anti-trust action for _Macs_.

Why can't the same "there is no OS except iOS allowed on iPhones" argument be applied here? If the only os that boots on a macbook is macOS, that's starting to smell like anti-competitive behavior the same way that only app store approved apps can run on iOS is anti-competitive.


Because the market share is an order of magnitude smaller.


This is one moment where I really hate what happened to Twitter, since I feel like I recall a tweet from Marcan ages ago pointing out that Apple has fixed some things regarding 3rd party OS support.

That is to say, unless I am truly off my rocker and remembering a fever dream: it's not just a "placate the anti-trust regulators" play.

I'm pretty sure I've also seen it mentioned on HN itself that Linux is still used within Apple for certain aspects of hardware development, so Apple themselves need it to work to a certain degree.


I remember reading that too, but I think it was on mastodon — I don't feel like tracking down that particular thread now, but maybe that helps you on your search :)


> I remember reading that too, but I think it was on mastodon — I don't feel like tracking down that particular thread now, but maybe that helps you on your search :)

That would be strong evidence that there's at least _some_ support internally for them but doesn't explain why they bothered at all.

The lack of explicit endorsements and documentation certainly has me thinking that at least _some_ of apple doesn't want this happening at all so they're at least going to make it hard. It may not be a "what's the bare minimum support we have to do to avoid being a poster-child for anti-competitive behavior" that's completely driving it after all.


> but doesn't explain why they bothered at all

I mean, you're kind of glossing over my second point from my comment:

> I'm pretty sure I've also seen it mentioned on HN itself that Linux is still used within Apple for certain aspects of hardware development, so Apple themselves need it to work to a certain degree.

Anyway, I went and dug around and found the HN discussion of that Marcan tweet that's been deleted - you can browse it below if you missed it or are curious:

https://news.ycombinator.com/item?id=29591578

Notably, this comment from Saagarjha, whom I trust on Apple-related matters, is what I was referring to regarding Apple using Linux internally for some of their work:

https://news.ycombinator.com/item?id=29599889

All this to say, if the people who have some level of vested expert knowledge in this domain - like Marcan or Saagarjha - don't buy the conspiracy theory angle, then I'm inclined to side with them.


Here’s a Wayback Machine for that tweet: https://web.archive.org/web/20220102153759/https://twitter.c...

But I have to say I don’t understand how what Saagar is saying is supporting your (our?) point. Apple has ability to do a whole lot of things that will never make it to end-users — just because some flavor of Linux is being used in the CPU bringup process doesn’t mean anything for the final products.

As evidenced by the fact that M1 is far from the first chip they brought up in-house - and even then only on Macs, not on iPads which use the same chip.

I hear they even have some non-Apple hardware running macOS in data centers, the absolute horror ;P


If you really care about OpenGL drivers, it sounds like you want to be on Asahi, not macOS. Doesn’t seem like they’re always one step behind?


I just don't see how it's incumbent on Apple to implement a standard only because it exists.

It takes significant resources and impacts release schedules to, say, support an on-going vulkan implementation... I think there would need to be a business argument for it. Avoiding "shame" probably doesn't cut it.

It just strikes me as strange when people expect Apple to spend money and focus on their own personal priorities... especially for things that are inherently community-driven.

Perhaps there's a "community goodwill" argument to make, though I doubt they'll try to chase the goodwill of people who complain that Apple "has become is just a corrupted mess of greed behind a curtain of politically correct marketing videos."


>> I just don't see how it's incumbent on Apple to implement a standard only because it exists.

Well Apple does implement the standard. They DO have OpenGL, but their driver is not fully compliant apparently.

>> It takes significant resources and impacts release schedules

Well, it seems 2 people working for a couple of years managed to do it without much of the documentation that Apple clearly has (they built the F-ing thing).

Apple has considered OpenGL deprecated for a long time now, but they do support it (they brought it to M1 and M2) and it makes sense to keep doing so. They really should be more conformant. If a standard is worth supporting, it's worth supporting well - and Apple has unlimited funds in comparison to these two people.


They support it because they built some stuff on it at some point. They just want those things to continue working. They have no further need to support it.


> It takes significant resources and impacts release schedules to, say, support an on-going vulkan implementation...

Let's see... on one hand we have two devs who made it happen (for OpenGL; with Vulkan coming in the future), with essentially no funding, purely in their spare time and without any documentation, just by reverse engineering the hardware.

On the other hand we have a trillion dollar corporation, with 100 billion dollars in profit last year, with over 150k employees, with full documentation for the hardware.

Yep, it checks out. Apple's definitely resource-strapped and can't afford to do it. No way they can compete with two spare-time hobbyists. Would be too expensive.

I'm sometimes really amazed how Apple-biased a big portion of HN is. People here can always find some sort of excuse to justify whatever shitty new thing Apple is doing (or why it isn't doing something), no matter how much mental gymnastics it requires.


No one said they don’t have the resources. They just asked why it’s incumbent on Apple to spend them, which your response... doesn’t address at all. If we’re going to post with that much sarcasm, let’s try to have an equal amount of substance to go with it.


Developers don't want to create separate Apple-only backends for everything. Users want to use software; they don't care about what backend is used. When Apple has to personally implement a Metal backend for Blender, it's obvious that there is a problem. This is despite Blender being one of the most well-funded open source projects. Having to support multiple backends also increases development time and future tech-debt surface area. It's mostly not a moral argument, as you seem to imply. This is just a feature that developers and users want. Apple is free to not implement it. It just means that many apps will not support Apple, or they will support it at the cost of other features, or they will use MoltenVK.


What? I'd argue the graphics APIs are some of the most important interfaces right now. It's not that we'd all feel good if Apple adhered to standards for once; it's about supporting a legacy of applications that have been built over the last 20 years or so, and allowing the platform to be used the way the rest of the industry has been evolving, not cheap vendor lock-in.


I personally do not believe in those 'the market will fix it' arguments; there is not only community goodwill but also some corporate responsibility to support open standards. Since Apple dominates a large part of the market and strongly profits from lock-in effects, open standards are among the few things that make competition over the best OS (aka a market) work. I understand, however, that it is mostly commercial or regulatory incentives that make current big tech move. A bit of shame is still appropriate IMHO: they could easily also support an open source driver without releasing something official.


> I just don't see how it's incumbent on Apple to implement a standard only because it exists.

I mean it isn't. But it would be nice if they supported at least any graphics standard.


It's because OpenGL is a dying specification based on an outdated, horrible programming model, and Khronos's conformance tests are weird and ridiculous. The conformance tests are not open-source (there's a separate "conformance suite" available on GitHub which is based on Google's dEQP, not the Khronos internal test suite), and there can be some pretty major bugs and gaps in your implementation while still getting it "certified standards-compatible".

Apple has committed to supporting OpenGL up to 4.1, and that's it. They even rewrote their OpenGL driver for the Apple M1 as an emulation layer on top of Metal, so that existing applications keep working, but they're not going to implement any newer versions of OpenGL. Nor should they.

I have a lot of criticisms of Apple, and I think they could be doing a lot to make Metal a better API with better tooling, but not caring about OpenGL is a perfectly sane decision here.


For GLES all the relevant tests are open-source, under KhronosGroup/VK-GL-CTS on GitHub.

There's a small set of legacy "confidential" tests that you have to pass for GL conformance. They can't be open-sourced for legal reasons. The current CTS working group would like to get rid of them, but it's hard to justify spending time on GL these days...

You can definitely pass conformance with a driver that's horribly broken in practice. GL/ES/GLSL are huge and there are holes found in these specs all the time. And it's not like game developers read them anyway; whatever works on their test devices gets shipped.


Apple themselves tell you to use Metal instead of OpenGL on Apple platforms. It’s not beating the big corporation if the big corporation isn’t in the competition.


Agreed. It's a misrepresentation. Apple doesn't do it because it's too hard, they don't do it because they don't see value in doing it.


They probably see alternatives as providing negative value vs. total clarity from the platform owner re: how to use the GPU.


> This puts Apple to shame, plain and simple.

Not really? Like, we can look at this and point fingers, but Apple will not be ashamed, because they do not care. They deliberately don't bother with standards conformance unless doing so is a part of their strategy. Standards conformance for OpenGL or Vulkan doesn't make sense when they want everyone to use Metal.

It's dumb, and I marvel at how childish Apple always behaves with things like this, but that's just who they are.


> They obviously don't care about standards…

Apple has invented, contributed to, and adopted a long list of standards. You can easily Google or ChatGPT the list if you don't know your tech history.

Apple deprecated OpenGL and OpenCL support in 2018 for very reasonable technological and strategic reasons that I understand you disagree with. But that doesn't change the fact that OpenGL is a terrible fit for modern computing/GPU architectures.


Apple also invented and contributed OpenCL. Then later Khronos invented OpenGL compute shaders, which is completely different.


How did they beat the big corp who probably didn’t even have this on the agenda? Or any other big corp that wasn’t planning on doing this? I’m all for it but they clearly did something someone else wasn’t willing or even planning to do.


I've been watching the Asahi Linux project from the beginning from the sidelines. I find it both exciting and fascinating for two reasons:

1) It's a project that a lot of people want to see happen, and

2) It's a stellar example of a well-executed open source project at all levels.

Moreover, Asahi has been my daily driver since the alpha was released back in March 2022. I migrated to Fedora Asahi Remix (their new flagship distro) earlier this month, which is excellent (https://jasoneckert.github.io/myblog/fedora-asahi-remix/).


Yep. Just switched to PC for DaVinci Resolve renders/exports and built a 64-core Threadripper with 128 GB of RAM, 16 TB of SSD, dual RTX 4090 GPUs and dual Thunderbolt 4 ports with awesome cooling for $15K. Something worse from Apple would cost you three times that and be less serviceable.


I’m curious how that is possible, considering that a fully maxed-out configuration of a Mac Pro (including pre-installed FCP and Logic) costs less than $13k.


Genuine question: why would Apple make a driver for something they don't support? And if they wouldn't, is there really any shame?


Well, the article mentions WebGL.


These two are tremendously cool.

The story seems really complicated from a technical point of view.

Whom is OpenGL ES 3.1 compliance for?

Apple is shipping DirectX compatibility in Game Porting Toolkit.

I understand you are making a stylized comment. My stylized comment is, there's a lot of stuff going on everywhere, all the time, with all sorts of technologies. You're not illuminating for me, compared to all the other people toiling in obscurity, what about this has pressed so many buttons for the Hacker News audience? Because it's not OpenGL ES 3.1 compliance.


Apple ended up in a battle with a patent troll over FaceTime, which could be part of why it hasn’t been opened up.


Lol. It's intentionally closed, so network effects can drive usage. Common Apple strategy, straight from Jobs.


Jobs is the guy who claimed they were going to make it an open standard originally.


Jobs also repeatedly lied and backstabbed his business partners when he thought he gained an advantage. As we see here.


Jobs just added that to the keynote presentation after running it past nobody.


He was the CEO, if he wanted it done it would have happened.


Yes, Jobs, the great satan of tech. He died so that Tim Apple could sin with impunity. No FaceTime for Windows users — it will bring the rebels to their knees.



I can't believe it. I can't believe it. Wow. Alright, now we're getting somewhere.


I'm just glad we now have much better alternatives to FaceTime that Apple will soon have no choice but to open up.


> If i weren't an iOS dev, i would have ran away

I used to wonder why anybody who is not an iOS developer would buy a Mac. Now I just accept that some people make different choices than I would - just with most everything in life.


If somebody makes a laptop as energy efficient as an Apple Silicon MacBook, while being more serviceable, running Linux, etc., I will gladly buy that one. I am definitely not staying with Apple for macOS.


Go into any Bay Area coffee shop and you’ll see it’s filled to the brim with zoomer developers developing their apps and projects on a Mac. It doesn’t matter at this point why or how, but it’s more or less become the standard development platform for a lot of stuff, or at the very least the frontend for it.


> This puts Apple to shame, plain and simple

But Apple clearly has zero interest in OpenGL. That’s been obvious since before the M series chips.

What they’ve done so far and continue to do is incredible. I love following what the Asahi team has done.

But anyone can win a “race” against an opponent who refuses to play.

Shame Apple for not playing the game if you want, but they could have easily done this if they wanted to.


You say that, but the financial barrier to entry for using FaceTime and iMessage means I have not once received spam. That's a real feature in the modern day.

Apple are by no means perfect, but on pretty much every other service I have used (shout-out to Mastodon being too small to bother with, I presume) I have received spam.

Advocates of truly neutral carriers, open code, et cetera, et cetera, so on and so forth don't really have an answer to spam in my experience. I would say that almost no one has an answer to spam except seemingly Apple. And while their answer sucks it is an answer.

In the modern world telecommunications are a requirement to participate in society. I don't think it's unreasonable to have spam be your number one concern. It's a waste of my existence even though I ignore all of it and take appropriate pro social actions by using whatever kind of report I have available to me. There are people who fall for the scams (I have met and even know a few) and they have lost heaps of money.

Apple would be destroying a huge amount of value for their users were they to open up their service more easily to spammers. It's not our job to give up our lives so someone's theoretical version of a better society can exist.

I say this with every bit of code I have ever published being GPL3 and having advocated it in every single project I have ever worked on. But I have never built a telecommunications platform.


> Advocates of truly neutral carriers, open code, et cetera, et cetera, so on and so forth don't really have an answer to spam in my experience.

Seems trivial: just don't allow anyone to message anyone else unless they've exchanged contacts physically (by showing each other's QR code or something).


Right, aside from friction for users, spammers will attempt to find ways around any system you implement. They are malicious actors. I don't think protecting a system against malicious actors is trivial.


If Apple didn't care, they wouldn't release anything at all.

But they did. It means somebody did care.

But they failed, beaten by an almost one-man-army "team" that started the race miles behind, from another country, behind a chain of mountains. It's unreal.

It's unreal because Apple has everything - talent, hardware and software - and still got beaten by someone reverse engineering the whole thing.

Such a slap in the face.


We're kidding ourselves if we think Apple was putting an earnest effort into this.


They don’t remotely care about supporting a standards-compliant graphics API. It’s an anti-feature, the opposite of what they want (I worked there for a bit on GPUs). If I had to guess, the two things they care about would be 1) their own ecosystem (i.e. Metal) and 2) what’s popular for game devs (i.e. HLSL, DX and Windows). So far that seems to be what they’ve been working towards shipping.


> If i weren't an iOS dev, i would have ran away from the apple ecosystem a long time ago.

Yeah, instead of drivers for M1, I'd be more impressed+happy if someone implemented the iOS APIs running on Linux, like we have Wine for Windows on Linux.



Did anyone use this to write iOS apps under Linux?


OpenGL and OpenGL ES are basically old legacy APIs at this point; it’s not too surprising that Apple won’t invest in them beyond existing app compatibility, right?


What does Apple get from not making their laptops the most performant portable gaming machines out there?


They lost the gaming market a while ago so they don't care anymore. The gaming situation on a Mac is worse now than 10 years ago.


Putting Apple to shame implies that they have some form of honour, which is clearly not the case; they don't care.


They learned from Microsoft in the 90’s.


Not a single mention of the word ‘Apple’ in the original post, instead ‘the manufacturer’ and ‘the big corporation’... Curious if that is deliberate and, if so, what the reasoning is (legal?)


Presumably, because this isn't about Apple. They don't care about Apple. Merely about getting the M1/M2 arch to run Linux properly. It could have been Microsoft, Amazon, Google, treatment would have been the same.


But they did refer to them; they just used strange phrasing to do it.


It was fun to watch the livestreams showing the development of these drivers. Amazing work.


https://www.youtube.com/c/AsahiLina for those who aren't aware - some of the most incredible low-level programming work I've ever seen!


If only it weren't obscured behind unbearable levels of anime cringe. I guess I'm getting old.


It’s just part of the “coder” subculture these days. To quote @SwiftOnSecurity:

“I said that all furries are programmers. I have been contacted and am issuing a correction. There are three furries who are not programmers.”

https://x.com/swiftonsecurity/status/511043756788043776?s=46


Shrug. It's certainly unusual by the standard of programmers-as-geeky-yet-professional-folks, which is what most people expect these days.

However, programming has, for most of the last 60 years, attracted many people from the weird, misfit, or cringe-du-jour segments of the population. This is nothing new; plenty of hackers in the '80s and '90s had public personas that were weirder than this.


> plenty of hackers in the '80s and '90s had public personas that were weirder than this.

And they are?


For some, vtubing is capitalizing on a trend; for many, though, it’s a way to express and experiment with developing one’s identity in a way that transcends the physical limitations of both presenter and audience - and for those for whom that freedom to experiment can unlock unbounded creativity, it’s an incredible way to “limit break.”

A bit saccharine from an aesthetic perspective, but it’s something that would be entirely familiar (if old-school) to denizens of a cyberpunk universe, and I don’t think there’s a better way to articulate how it’s connected to a broader vision.


I choose to see that as the cyberpunk future that we are living in. Everything ends with your avatar, just like Gibson foretold. What an amazing time.


Why is M2 Ultra support missing? I'm curious more than anything--lack of developer hardware? Something technical?

"The Khronos website lists all conformant implementations, including our drivers for the M1, M1 Pro/Max/Ultra, M2, and M2 Pro/Max."

Edit: Interesting that so many people are downvoting this. I'm not complaining; I didn't know, so I asked a question. Seems like the answer is that it's too new, especially with the 30-day conformance review period on top.


You're being downvoted because people assume you're making an implicit criticism with your question, and many people tend to assume the worst intent rather than the best intent like they're supposed to. Fortunately it usually balances out after some time. Adding a clarification like you did is the correct thing to do (and always a good idea, especially in a text forum).


Right, but that clarification was pre-edit?

I'm aware of people's sensitivity to putting demands on efforts like this, which is why I tried to frame it as the genuine curiosity it is.

This sort of low-level work is out of my depth. Is there painstaking probing to find different addresses/states between a Max and an Ultra? The M2 Ultra has two GPU variants (60-core and 76-core), although the Pro and Max also have GPU core-count variants. Are those transparent from this OpenGL driver's point of view? Or is there timing/dispatch work involved? Hence the question!

I would have made an explicit criticism if I had one. I implied nothing, but I realize that has no bearing on what the reader can or will infer...


Yes I would have put it in the original (pre-edit), but better late than never :-)

Yeah, I've never done GPU driver hacking so I can't answer your question with optimal specificity, but there's always something new. It's usually (but not always) backwards compatible, but Apple is a special case since they are their only supported customer. That means that they can iterate very fast with flagrant disregard for backwards compatibility (I don't necessarily mean that in a negative way; it can be great for development/progress), and it means that any changes they make have to be reverse engineered by setting debug breakpoints, examining state, signals, etc. to try and figure out what they changed and how it works now. It's a truly monumental undertaking.


> That means that they can iterate very fast with flagrant disregard for backwards compatibility

In practice though, Apple is famous for not replacing everything left and right. IIRC, the UART part of their SoCs dates back to the first generation of iPhones, if not even earlier.


> Yes I would have put it in the original (pre-edit), but better late than never :-)

Huh? The clarification is in the original (pre-edit). It's the second sentence.

"Right, but that clarification was pre-edit?" wasn't a hypothetical, it was making a claim.


Good point. Either other people missed it like I did, or I'm wrong about why the downvotes are happening. (or for completeness I suppose a third possibility that it was edited in later, but I'm not alleging that in any way).


For what it’s worth M2 Ultra is the newest chip in the Apple silicon lineup. It was only introduced last June.


I didn’t downvote you, and I understood your meaning clearly. I think people might read “why is ___ missing?” and skip over the rest, assuming the question represents an expectation.


looks like bring-up was only two weeks ago

https://www.youtube.com/watch?v=EQAE13sZlsY


It's missing because Apple doesn't release drivers or upstream the drivers so it's on hobbyists to reverse engineer them.


What do you mean by "upstream" in this context? https://github.com/apple-open-source/macos ? Something else?


It's quite expensive, so my guess is the same as yours: lack of developer hardware


I really wish nerds would stop supporting Apple hardware and software. It is so closed, and everything is a giant pain. It's one thing for the average non-techie who does not want to do anything advanced and just wants to swallow the blue pill. But for an actual techie to support a system that is the antithesis of general-purpose compute... can you even be considered a true techie anymore? I just don't get it.


I really wish people would leave ridiculous labels in high school where they belong, and use whatever works for them to get what they want to do done. If I want to tinker with Linux on my hypothetical M Series MacBook Pro in the future, I'm stoked that these particular nerds are devoting extreme amounts of time and intellect to solving that problem. It took me a lot of time and exposure to realize this, but in the world outside of your niche set of interests, it's a massive red flag to care about whether someone is a "real" whatever, because everyone's just into what they're into and trying to appease the gatekeepers doesn't do anything but sell yourself short.


The Mac hardware and macOS "do what you want" knobs are actually pretty open as far as computer hardware goes. Hell, Apple even made part of the process of booting custom kernels easier for the Asahi team after they posted about it. I mean, it's not a Talos workstation with fully open firmware blobs, but it's also not preventing you from loading whatever you want on it... but that could be said of anything.

iOS based devices are where the lockdown is.


With that attitude, Linux and FreeBSD would never have started and would not exist today. You're advocating the total opposite mindset of being a "true techie". It has all been developed on proprietary hardware from the beginning. Everything was reverse engineered by talented hackers like these two. Vast majority of platforms are proprietary. Open source contributions from manufacturers started relatively recently. How can you call yourself a hacker if you're too lazy or stupid to reverse engineer anything?


x86 / the IBM PC was probably more open than the M1/M2, even back in the '90s.


What a joke. None of the drivers on x86 were open source, and there were literally thousands of devices that needed reverse engineering. The M1 instruction set is ARM which is already supported by Linux.


I am talking about the CPU not other hardware. Comparing Apple’s apples with apples.


Hmm, well, the CPU in the M1 is ARM64, which is already open. But the M1 is a system-on-a-chip that includes many proprietary components besides the CPU, such as the GPU. So it's not apples to apples with '90s x86, which was a CPU only, not a system-on-a-chip.


A lot of dev companies force their employees to use MacBooks, in part as a result of Apple's locked-down practices: it's the only way to actually compile and test for "everything", because of how much easier it is to cross-compile to other platforms and markets from macOS.

For that reason, I've personally had to use an M2 for a week now, and I find the experience unpleasant. The OS feels old and cumbersome. I had flashbacks to a time long ago when I needed to install custom apps on Windows for some basic functionality that's been built into GNOME. Things like "alt + tab" actually bringing an app to the foreground, instead of prioritizing the minimized state. Why this deliberate action seemingly has no effect is, I'm sure, something that will be defended as correct.

Global shortcut to start up a terminal? Installing a third-party utility is probably the easiest. Want to place windows on the screen? Install a third-party utility. Better screenshot functionality? Install a third-party app. Change the default shell to a custom one? First you've got to figure out how to access the root filesystem through Finder to actually find the executable.
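
For reference, the terminal route for the shell part looks roughly like this; a minimal sketch, with /opt/homebrew/bin/fish standing in for whatever custom shell you actually installed:

    # Sketch only: switching the login shell from a terminal on macOS.
    # The fish path below is just an example; point it at your shell.
    import subprocess

    shell = "/opt/homebrew/bin/fish"

    # Register the shell as a valid login shell (asks for an admin password)...
    subprocess.run(f"echo {shell} | sudo tee -a /etc/shells", shell=True, check=True)

    # ...then switch the current user's login shell to it.
    subprocess.run(["chsh", "-s", shell], check=True)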

It goes on and on. And the performance is much worse than what I've become used to. The only thing it actually has going for it, which I believe is great, is battery life. But, that's it.

I'm just glad I was able to disable the app validation check, because it had literally prevented me from installing a lot of third-party software, and the suggested workarounds had no effect. Just the fact that it phones home every time I run an executable makes me nauseous.

I wonder how often the Homebrew team gets offers to add malware, because the checks there seem woefully insufficient.


> A lot of dev companies force their employees to use MacBooks, in part as a result of Apple's locked-down practices: it's the only way to actually compile and test for "everything", because of how much easier it is to cross-compile to other platforms and markets from macOS.

A lot of devs are also choosing them because they are insanely good machines. Simply because ARM64 is a good architecture and there is pretty much no competition available, it's more or less a choice between Apple or not ARM64...


My personal experience:

- MacOS user experience: Terrible. It feels like Linux did 10 years ago. Bugs and inconsistencies everywhere. But hey, they are competing with W11 so, no need to improve it either. Forced updates and restarts. And, did I mention bugs and inconsistencies everywhere?

- M2 Computational performance: It feels like my laptop 10 years ago.

- M2 Battery life performance: It is absolutely brilliant, but I also work 80% of the time at a fixed desk, so it's mostly only relevant when traveling.

I'm sure a lot of devs use them because they like them. But your argument that ARM64 is a "good architecture", whatever vagueness that entails, is pretty much bollocks.

Compile-time performance is OK at best compared to other laptop CPUs (the fastest laptop Mx hardware is about half the laptop competition, and that's just comparing CPU). The package management coherence is nonexistent, so everything is slow to start because everything ships a complete stack of libraries that needs to be read from disk and copied into memory. I honestly believe that people who are blown away by the Mx hardware have simply lived too long in the walled garden to know what is out there.

In summary: MacBooks are decent hardware, with exceptional battery life. Priced at 50% would make sense. And the OS experience is somewhere between mediocre to garbage.

Ran an identical GPU workload on AMD hardware with dedicated GPU.

- 3 year old desktop AMD hardware: 50 seconds (1x)

- 1.5 year old laptop AMD hardware: 66 seconds. (1.32x)

- Latest M2 Pro 12 core: 142 seconds. (2.84x)

For CPU-only tasks: 1x, 1.9x, 1.7x.

So, the M2 Pro 12 core is about 2x slower than a one year older AMD laptop for GPU tasks, and about the same for CPU only tasks. And the latest AMD Laptop CPU is twice as fast as the one I used.
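
For what it's worth, those ratios work out from the raw timings above (a quick sanity check, nothing more):

    # Sanity-check the quoted ratios from the raw GPU timings (seconds).
    desktop, amd_laptop, m2_pro = 50, 66, 142
    print(f"AMD laptop vs desktop: {amd_laptop / desktop:.2f}x")  # ~1.32x
    print(f"M2 Pro vs desktop:     {m2_pro / desktop:.2f}x")      # ~2.84x
    print(f"M2 Pro vs AMD laptop:  {m2_pro / amd_laptop:.2f}x")   # ~2.15x, i.e. "about 2x slower"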


I honestly can’t take you seriously when you write shit like “Linux did 10 years ago”..


Let's see:

- Alt+Tab is bugged. You need a third party app to bring consistent behaviour.

- Closing a window with window decorator very often minimizes it instead, no consistency.

- Recording or capturing screenshot is tedious without a third party app

- Homebrew is underwhelming as a package manager. It feels like Linux package managers did around 20 years ago.

- No global shortcut key to trigger arbitrary commands (you need a third party app). Like, starting a terminal.

- Bad support for external keyboard. Dead keys everywhere that have otherwise consistent meaning (delete, home, end, etc). The kinda stuff that could happen on Linux 10 years ago.

- No sensible support for external mouse, such as scrolling behaviour is stuck to what would make sense for the trackpad, and feels bad on a mouse. You need to install a third party app to handle this.

- No copy/cut paste in file manager. You need to drag and drop like a primate.

- Oh, and Finder, in general, is bad at almost all the things a file manager should be good at. But someone somewhere at Apple decided that labels trump hierarchical file structures for navigating a hierarchical file system.

- Built in advertisements for apple products pop up from time to time.

- "I guess I won't start working right now because MacOS is installing a mandatory update" has already happened.

- Bypassing the security check to launch an app (the "open anyway" button) has been bugged since 2016, but it's still there; you can click it and nothing happens. You have to google the command to disable the whole app-audit thing and run it in a terminal (you know, the kind of stuff you had to do on Linux 10 years ago). A hedged sketch of those commands appears just after this list.

- Moving windows around is tedious without a third party app.

- Accessing the root file system through Finder requires googling how to do it and running a command in the terminal (you know, the kind of stuff you had to do on Linux 10 years ago).
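
For the record, the terminal incantations behind those last two items look roughly like this (a sketch from memory; flags can differ between macOS versions, so treat it as illustrative rather than authoritative):

    # Sketch of the terminal workarounds referenced in the list above.
    import subprocess

    def run(cmd):
        # Print and execute a command, raising if it fails.
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)

    # Disable Gatekeeper's app-audit check entirely (asks for an admin password;
    # newer macOS versions report the flag as --global-disable instead).
    run(["sudo", "spctl", "--master-disable"])

    # Make Finder show hidden files, which exposes the root-level system
    # directories when browsing the startup disk.
    run(["defaults", "write", "com.apple.finder", "AppleShowAllFiles", "-bool", "true"])
    run(["killall", "Finder"])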

This list is just the annoyances I noticed this week, and I'm sure I've forgotten quite a few. It feels like crap, and it's not because "apple bad, buu huu". I don't care about your team/my team. I just want things to work and not be annoying. Windows 10 and 11 are crap for many of the same reasons as macOS, just a different collection of bad designs. The "it just works" has been Linux for at least the last 5 years.


You're working really hard to make this stuff up, mate.

> Alt+Tab is bugged. You need a third party app to bring consistent behaviour.

It's Cmd+Tab, and huh? What do you mean bugged? Elaborate please.

> Closing a window with window decorator very often minimizes it instead, no consistency.

This is app-specific functionality; you'll find similarly inconsistent behaviour across all OSes, to be honest.

> Recording or capturing screenshot is tedious without a third party app

Cmd+Shift+5, no app, records or screenshots with many convenient ways to select the scope of your screenshot/recording like window, screen, or just a specific area. This is one of the more polished and well thought out features in macOS.

> Homebrew is underwhelming as a package manager. It feels like Linux package managers did around 20 years ago.

It's an open-source project that's outgrown its initial purpose. It's also not native, and while Linux distributions generally build around the package manager as a central point, Homebrew is hacked together and tacked on as an afterthought, so it's not a fair comparison.

> No global shortcut key to trigger arbitrary commands (you need a third party app). Like, starting a terminal.

Spotlight has been a part of macOS for I don't even know how long at this point, but I looked it up for you, it's 18 years.

> Bad support for external keyboard. Dead keys everywhere that have otherwise consistent meaning (delete, home, end, etc). The kinda stuff that could happen on Linux 10 years ago.

Fair, Apple have some special keyboard behaviours that aren't shared with other OSes, but if you buy a keyboard meant for macOS it's gonna work fine.

> No sensible support for external mouse, such as scrolling behaviour is stuck to what would make sense for the trackpad, and feels bad on a mouse. You need to install a third party app to handle this.

Yeah, I think this is one of Apple's eccentric stubbornness things; I honestly can't fathom why the OS doesn't have a native way of setting scroll direction differently for the touchpad and an external mouse.

> No copy/cut paste in file manager. You need to drag and drop like a primate.

Copy is there, but there's no cut, I wanna say that's another Steve Jobs legacy that's probably never going to go away. I hate this too.

> Oh, and Finder, in general, is bad at almost all the things a file manager should be good at. But someone somewhere at Apple decided that labels trump hierarchical file structures for navigating a hierarchical file system.

I have no idea what you're talking about here. What are you comparing to and what's so much better about it? Honestly I do most of my file managing using the terminal and I suspect you do too, and my parents have no issue navigating Finder, so I don't know why this is such an offensively bad thing to you.

> Built in advertisements for apple products pop up from time to time.

They do? Where? In 18 years of using macOS I can probably count on one hand the amount of times I've felt that Apple were peddling something at me.

> Bypassing the security check to launch an app (the "open anyway" button) has been bugged since 2016, but it's still there; you can click it and nothing happens. You have to google the command to disable the whole app-audit thing and run it in a terminal (you know, the kind of stuff you had to do on Linux 10 years ago).

With you here. I get what they're trying to do, but the UX here is pretty terrible.

> Moving windows around is tedious without a third party app.

Window management I concede is something Linux distributions generally excel at.

> Accessing the root file system through Finder requires googling how to do it and running a command in the terminal (you know, the kind of stuff you had to do on Linux 10 years ago).

Most Mac users shouldn't be accessing the root file system, this is a feature, not a bug.

> The "it just works" has been Linux for at least the last 5 years.

HAH, now I know you're just trolling. Good one!

In your defense, I'll agree that many of the things that make macOS and other Apple OSes nice to work with are under-the-hood things most people wouldn't have any idea exist without being told. I know because I'm technical and have learned one or two a year for 18 years. Needless to say, there are now hundreds of such things to learn, and most users have no way of accessing that knowledge.

That's an actual UX challenge!


I don't like that you say I am trolling. Kinda makes the discussion hostile. Either you are interested in hearing the experience of someone used to other OS, or you are not.

I agree about the screenshot functionality. I didn't know about Cmd+Shift+5 (and 4). It was built in, and it is fine. I must have incorrectly assumed it didn't exist since Apple hadn't bothered respecting the "Print Screen" button.

As for Cmd+Tab: due to the inconsistencies in the close/minimize behavior, Cmd+Tab also leads to inconsistent behaviour. You either replace Cmd+Tab with something else (e.g. get used to adding the Option key in there, remap the keybindings, or use a third-party app), or you start memorizing which apps behave differently, which is out of the question. As for what other OSes do, I cannot speak for W10/11 because I don't want to boot into it. But in GNOME, apps that default to hiding instead of closing (e.g. Slack) actually hide, so they don't show up in the Alt-Tab overview. Those that still show up there will also be brought to the front, as one would expect.

Also, launching Spotlight, typing something and hitting enter is not the same as having a global keybind to launch a terminal.

Regarding Finder, perhaps I got a bit biased there. It's probably more different than bad. Perhaps it was just the disbelief that I couldn't use it to cut and paste, so I thought "fuck this" and used a terminal, or that I couldn't find the root file system.

In any case, another week has passed, and I've made some more observations:

- The animation between desktops is incorrectly implemented and tied to frames rather than duration. If you use "ProMotion", i.e. 120 Hz, the animation takes twice as long as on 60 Hz: two seconds compared to one second. Either is ridiculously long. And of course you cannot adjust this, because Apple has decided what the perfect experience is... Bleh.

- When downloads complete, macOS shows an animation in the Dock for the completed download, which... steals the input focus. Another amateurish bug.

When I thought about having to use macOS professionally, I was prepared for some frustration about bad Apple design decisions. I was not prepared for the inconsistencies, the bugs, and the seemingly petty refusal to facilitate anything non-Apple.

Also, I'm just mentioning things that seem like straight-up bugs or deficiencies. There are maaaaany tiny frustrations that are more "alright, this seems like a less useful implementation of what I'm used to". For example, if I open the overview I very often think one of two things:

- Ah, I would like to close that app.

- Ah, I don't have that open, so I want to open it.

Neither of these workflows is implemented. For the former, you have to either click the app to bring it to focus and then close it with Cmd-Q (which is ALSO inconsistent, btw; Chrome won't let you, for example), or you need to find it in the Dock, right-click and Quit. Oh, and you cannot do this while the overview is open, even though the Dock is shown; right-clicking doesn't do anything. Just add it to the pile of tiny inconsistencies fucking everywhere.

And for opening an app, you cannot just type as if you had spotlight open. You have to leave the overview and start spotlight. This isn't all that bad, but, definitely annoying compared to what I'm used to.

Oh, and I had a chuckle as well when I plugged in an external mouse and it asked me to press the button to the right of the left Shift. Come on... USB peripheral identification isn't that hard.


You might be interested in this: https://github.com/mathiasbynens/dotfiles/blob/main/.macos

I don't advise just running Mathias' config as is, but read through it and see if anything seems to be something you want in yours, make the changes, and save it somewhere for the next time you're setting up a Mac.

There's some stuff in there about speeding up certain animations (look for "Speed up Mission Control animations"), and about not reordering "Spaces" based on use (desktops and full-screen apps; search for "Don’t automatically rearrange Spaces based on most recent use"), which I think may also affect Cmd+Tab ordering? Not sure, but it's a setting I always change anyway because the default doesn't make sense for power users.
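
Concretely, those two tweaks boil down to a couple of `defaults write` calls; a sketch (key names recalled from that file, so they may have drifted between macOS versions):

    # Apply two tweaks along the lines of the .macos file linked above.
    import subprocess

    def defaults_write(domain, key, type_flag, value):
        subprocess.run(["defaults", "write", domain, key, type_flag, value], check=True)

    # "Speed up Mission Control animations"
    defaults_write("com.apple.dock", "expose-animation-duration", "-float", "0.1")

    # "Don't automatically rearrange Spaces based on most recent use"
    defaults_write("com.apple.dock", "mru-spaces", "-bool", "false")

    # The Dock has to restart for these to take effect.
    subprocess.run(["killall", "Dock"], check=True)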

About the trolling thing, sorry, I was genuinely not sure if you were arguing in good faith or just making stuff up, as most of the things you were saying were just plain incorrect or dishonest, from comparing desktop computer performance to low-powered laptops, to incorrect statements about features macOS truly excels at.


The dotfile looks like a great collection of minor tweaks. Thanks.

I don't know what I lied about on the performance, though. The MacBook M2 feels slower and more sluggish than what I'm used to. It is of course plenty fast enough for what most people need it for, so I'm not really disputing that. But in discussions of the Mx architecture, the general opinion is blown out of all proportion to reality. It started when Apple straight up lied and said it was 3 times faster than the existing competition. They just forgot to mention that the existing competition was their own previous offering, which was already mid-range when it was introduced 5 years prior.

Take a look at the numbers there and see if that matches your assumptions:

https://www.cpubenchmark.net/compare/5232vs5183vs4782vs5234v...

So, to clarify, my objections are not against "M1/M2 are good CPUs". They are actually pretty great, especially for how well they are supported and integrated with the rest of the system, allowing for exceptional power usage. But, it's not nearly as fast as many think it is. And it's tiring if/when that's the baseline argument for why Apple hardware is great. And I think they are wildly overpriced, especially when upgrading the hardware to the bare minimum for a serious workstation (64 GB RAM and a 2 TB disk).

The Alt-Tab issue is solved with `AltTab`, which improves it in every way.

I also remembered what I found lacking in Finder. In Nautilus (the file browser I use on Linux), I can browse any SSH server as if it were a normal directory. Same goes for Samba and SFTP. And it also helps that it supports multiple different file systems out of the box.


> But, it's not nearly as fast as many think it is.

I'm running on an M2 Max with 64GB RAM here, with both Kali and Windows VMs active and working, while my macOS host is running browsers, email clients, Docker containers, etc. All of this without my fans ever spinning; I'm not even convinced they actually installed fans in mine, since I don't think I've ever heard them.

In short, I'm a heavy user and I've never felt I need more performance in my laptop than this provides.

> And it also helps that it supports multiple different file systems out of the box.

What other filesystems do you need your computer to be able to directly mount? macOS supports FAT, exFAT, NTFS, HFS(+), and APFS. Anything else you're probably going to access through SMB, AFP, NFS, CIFS, FTP, SFTP, or FTPS anyway, all of which are natively supported in macOS (OK, you can't mount SFTP, but honestly I don't get why you would want to).


When you realize the world isn't strictly divided into categories such as blue or red, you might begin to understand why people do what they do. Also, let me ask you a question, since you're obviously for free and open source: how much have you contributed to RISC-V development, or donated to move the cause forward? Are you actively maintaining any open projects? Because if you read the post, the authors stated that they are small with limited funding, which is the case for most open source projects.


Who are you responding to? By "closed" I meant they artificially restrict you from doing things in the name of selling more devices. Like not letting you run VM clusters for testing unless you run them on "Apple hardware", when there really is no technical requirement. Removing support for 3rd-party GPUs. Having to buy a million dongles that use proprietary connectors (not really a problem anymore). Limiting what you can control at the OS level. Not letting you run what you want. The list goes on. I obviously think what the Asahi Linux team has done is badass. I just think us nerds have to force ourselves off the tech giants, or the end result will likely be that you won't be able to easily buy something you actually control.


How is it any more of an antithesis than HP, Dell, Lenovo, etc.?


> everything is a giant pain

Have you ever used a Windows machine?


If you go fullscreen in a WSL terminal, at least the ads can't hurt you. I still get jumpscared by the Apple Music popup when I put on my headphones with a Mac...


> If you go fullscreen in a WSL terminal, at least the ads can't hurt you.

Ads? What ads?


Apparently, there are ads everywhere in macOS. By which they mean the notifications in Settings that upsell iCloud, or the prompt to subscribe when News or Apple Music opens. Can they be invasive? I suppose so, but I believe the GP is exaggerating somewhat. There are far more egregious examples in Google and Microsoft properties (cue something about Apple being the richest...)

There also appears to be an issue where Music.app opens automatically when connecting or plugging in headphones. I've tried to replicate this in Monterey and Ventura using AirPods, Beats and Sony MDR-7506 without success. Researching online suggests that this is a LaunchAgent/LaunchDaemon or cache problem. In short, it's a relatively simple bug to fix. Certainly annoying, but anyone with the tiniest bit of experience - especially in a field like DevOps - should be able to diagnose and correct it.


> There are far more egregious examples in Google and Microsoft properties

I keep hearing about ads in Windows, but I have yet to experience anything like people claim.


The regular Windows ones. Nag notifications, Edge updates, search bar intrusion, that sorta thing. Nothing in the terminal app itself; the recent rewrite did a world of good for it.


> The regular Windows ones. Nag notifications, Edge updates, search bar intrusion, that sorta thing.

I still have no clue what you are referring to. I don't see any of that.


> Have you ever used a Windows machine?

Yep. I find it much better than anything Apple.


They are doing such good work. That was amazing to read and I learned something about finding hidden instructions by thinking about how a HW engineer would encode bits.


Rosenzweig's drive and passion are enviable. I just hope burnout doesn't come her way. She still has a long road ahead, yeah.


Wow, very interesting that Mesa can now emit an instruction (and use a hardware feature) that even Apple's GPU driver does not support.


I have a currently broken installation of Asahi Linux on an M1 iMac. I am motivated to fix that and update!

I don’t know what is going to happen in the future, but I wouldn’t be surprised if Asahi Linux becomes widely used. From my experience of turning a 7-year-old MacBook and an 8-year-old MacBook Air into great Ubuntu Linux boxes, I think that old Apple hardware will be a common host for Linux. It feels great to get use out of old hardware.


Fedora is supposed to be the new focus.


Super impressed by the whole Asahi Linux project - and this takes it a big step forward.


> Of course, Asahi Lina and I are two individuals with minimal funding. It’s a little awkward that we beat the big corporation…

Is the big corporation even racing you? Does it care? What devs and superusers want is regularly at odds with what makes sense for the actual business cases of a product.

I want all these things, too! But I recognize that it doesn't happen because what I want fails to make a solid business case, not because Apple is somehow incapable of writing an OpenGL driver for their own hardware.

If I'm in charge of the resourcing that could be committed to broader graphics standards support for M1/M2, what is the concrete (not hypothetical) case one can present to convince me that this is the right thing to focus on? I won't pretend to have the answer. There may very well be a very strong case to make. But it doesn't quite feel right how often I notice, on a daily basis, this lack of exploration of, or even appreciation for, all the factors and constraints that exist outside the technical portion of the problem. Maybe I'm just sensitive to this kind of issue because I feel this pain often at work.


Conformance is not easy. Congratulations!


As a user, what does this mean? Are there games that still target opengl?


Yes, quite a few Linux games still run on OpenGL, even recent ones[1]. Most support Vulkan as a backend as well, and this would be preferred once supported on macOS, but that work is still in its early stages.

[1] One example is the Clausewitz engine from Paradox Development Studio, which is used in all their games.


Surprisingly there has not been a rush to go back and update multiple decades' worth of games to use newer rendering APIs.


I was more hoping this would open up another pathway (in addition to Crossover 23 and GPTK) for Apple Silicon computers to be able to play modern games. Doesn't sound like that's the case.


I have found GeForce Now perfect for Apple's new devices. Since the client is native code, you don't even notice that the game is running in the cloud. And <10 ms latency, at least for me.

And you can play the best-looking games out there for 7 hours on battery, with a 1000-nit display…


Yeah, I've been using this for a while and it's awesome! Sadly some of the big publishers don't opt in their games (Activision Blizzard, Bethesda) and some of the indie devs don't bother either.

But yes, it's wonderful.


It means general support for hardware acceleration on the Linux desktop. OpenGL is used in more places than you would think. Switch to software rendering and you'll immediately see the difference just navigating the OS.


When I tested it with Asahi Fedora on an M1 Air, Extreme Tux Racer ran very smoothly.

Godot 3 and Godot 4 did not recognize this OpenGL ES 3.1 support.

Wishing for Godot-compatible support and external monitor support for the M1 Air.


SecondLife


Amazing work. Congratulations to Alyssa, Lina, and the Asahi team. This is some hardcore hacking!


> Of course, Asahi Lina and I are two individuals with minimal funding. It’s a little awkward that we beat the big corporation…

It's sort of a battle where the opponent hasn't come to the battlefield at all, due to lack of interest... I'd say claiming the corporation was beaten is a bit of a stretch here.


> Instead of one row after the next, pixels in memory follow a spiral-like curve.

This is called a z-order curve[1, 2]. It makes texture mapping and mipmapping significantly more cache-friendly than the usual x-y ordering, because these operations are often performed in tiles of 2^n size.

I'd say it's more a fractal pattern than spiral-like, to be precise. A rough sketch of the bit interleaving is below the links.

[1]: https://demonstrations.wolfram.com/ComparingXYCurvesAndZOrde...

[2]: https://en.wikipedia.org/wiki/Z-order_curve?wprov=sfla1
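
For anyone curious what that interleaving looks like in code, here is a rough sketch (not the driver's actual implementation, just the textbook bit-twiddling version; the names spread_bits and morton_2d are mine) of computing a 2D Morton index, which is presumably the operation the hardware's bit-interleave instruction accelerates:

    #include <stdint.h>
    #include <stdio.h>

    /* Spread the low 16 bits of v so each bit lands in an even position:
     * 0b0000abcd -> 0b0a0b0c0d. */
    static uint32_t spread_bits(uint32_t v)
    {
        v &= 0xFFFF;
        v = (v | (v << 8)) & 0x00FF00FF;
        v = (v | (v << 4)) & 0x0F0F0F0F;
        v = (v | (v << 2)) & 0x33333333;
        v = (v | (v << 1)) & 0x55555555;
        return v;
    }

    /* Morton (z-order) index: x bits in the even positions, y bits in the
     * odd positions, so texels that are close in 2D stay close in memory. */
    static uint32_t morton_2d(uint32_t x, uint32_t y)
    {
        return spread_bits(x) | (spread_bits(y) << 1);
    }

    int main(void)
    {
        /* Print a 4x4 tile: each 2x2 sub-block occupies 4 consecutive indices. */
        for (uint32_t y = 0; y < 4; y++) {
            for (uint32_t x = 0; x < 4; x++)
                printf("%2u ", morton_2d(x, y));
            printf("\n");
        }
        return 0;
    }

For example, morton_2d(2, 3) is 14: x = 0b10 spreads to 0b0100, y = 0b11 spreads to 0b0101 and shifts to 0b1010, and OR-ing the two gives 0b1110.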


Interesting. Wouldn't it be more strictly 2^(n*(2^m)) size where m is the number of dimensions?


I meant 2^n along one side, strictly in two dimensions (which is generally the most common texture layout). In other words, square tiles.


How is the current graphics driver situation on Apple Silicon macOS? Would it make sense or even be possible to backport this OpenGL ES 3.1 implementation from Linux to macOS?

Or maybe an OpenGL ES 3.1 / Metal compatibility layer


It's great to read about Rosenzweig's achievements with the graphics driver for the M series. How did they hack the proprietary compiler to output their instruction? Which compiler are they referring to?

The OpenGL conformance gives hope that a simple recompile will make some graphics applications available at high performance under Linux on Apple Silicon, which is good news. It's a niche of a niche, though. Devs would prioritize porting their apps to macOS over Linux on the M1/2/3, if they port at all.


This blog is a real gem.


Alyssa and Asahi Lina are both geniuses; it's unbelievable.


Can someone ELI5 what this means?


You can play games and use GUI applications on the new M1 devices using Asahi Linux, which means running Linux instead of macOS.

The only caveat is that most applications won't be compiled for ARM yet, so emulation will still be needed.

This is in contrast to Apple dropping support for OpenGL on their devices and using their own API instead: Metal.


Source? The mainstream distros already have comprehensive ARM support in their repos.

Asahi is working on a Fedora respin which will mean all your usual packages are a `dnf` command away.


OpenGL is used for pretty much all graphics acceleration on the Linux desktop: desktop compositing, browsers (both 2D content and WebGL/WebGPU), games (including console and Windows game emulation), etc.


Way to go Alyssa!!!!!

I knew you were going to get this done!


Thank you A. Rosenzweig.


This is a reality-bending class of achievement, without any help from random luck; just pure attention. Incredible.


I love this. Finding an unused/undocumented instruction to interleave bits extremely simply :)


Fantastic work, must really applaud the Asahi Linux team for their commitment and sheer competence.


Waiting for the ability to run machine learning models on Linux instead of macOS on the M1.


I have been eyeing a MacBook Air 15-inch (M2) and wondering about the possibility of running Linux on it. Has anyone had success, or failure, doing this themselves? And how hard will it be to port this M1 GPU driver to the M2?


Pretty sure sleep still doesn't work, unless the Asahi FAQ/wiki is out of date. Bit of a deal-breaker for me until then.


I am running Asahi on a 13” MacBook Air M2. The only real show-stoppers are that the speakers and sleep/hibernation don’t work. Otherwise it's completely usable: WiFi, Bluetooth and all the keyboard functions like setting screen brightness work. There is no support for Touch ID either, which is to be expected.

In my opinion it’s ready to be a daily driver if you can stand using Arch or Fedora. I’m waiting for a Debian version, personally (there are some instructions for getting it going already, but I haven’t tried yet).


The article says: "Conformant OpenGL® ES 3.1 drivers are now available for M1- and M2-family GPUs"


Do we know that there aren't some iterations of the hardware that lack the shuffle instruction? It seems a bit odd that the official compiler fails to emit the instruction.


So, dare I ask, how does the performance of this driver compare to native? I know it implements some functionality that the native driver does not; I'm just curious how they compare, apples to apples.


It is a native GPU driver in the Linux kernel.


Now if Steam and SteamVR get compiled for ARM, VR on an M1 might be a possibility!


Would need the Vulkan driver for SteamVR on Linux.


How close are we now to being able to plug an eGPU into an Apple Silicon device?


It might just work. E.g. here's a tryout on another ARM platform where chipset bugs seemed to be the main problem: https://www.jeffgeerling.com/blog/2022/external-graphics-car...


Those are two completely different things. I'm not even sure the required hardware elements for an eGPU - namely, a "back transport" of rendered frames via Thunderbolt and shuffling them directly to the screen - are there on M-series machines.


Usually you plug the eGPU directly into a monitor instead of backfeeding it to the built-in display. The bandwidth is already low enough without sending rendered frames back across the link too.

There are some limitations on the PCIe/Thunderbolt side, but Marcan has written a few times that he thinks it may be possible, though probably not worthwhile from a performance perspective. Of course, even basic Thunderbolt support is a WIP at this point, so that answer could always change.


Bravo! A tour de force.


Extremely impressive!


Eat that, Apple.


Huge!!


damn pretty cool


Someone should ping Tim and let him know Alyssa is not working at Apple for some strange reason.


Why do people want her to stop working on Asahi so much?


Apple is not Google. She could hack on OS X AND Linux (in her spare time).


Even if that were true, she most probably wouldn't.

But who knows, perhaps she might. I don't claim any privileged insight into her decisions; I'm just offering probabilities around human endurance.


How do you know they're not?


[flagged]


Please don't do this.


> Unlike ours, the manufacturer’s M1 drivers are unfortunately not conformant for any standard graphics API, whether Vulkan or OpenGL or OpenGL ES.

Yes, it is conformant to a graphics API. It's called Metal¹. You might enjoy pretending that "the manufacturer" hasn't bothered to provide a fully working and highly performant graphics API designed under the same roof as the people who designed the hardware, but this kind of petty dunking on Apple that seems to be so popular is not a good look.

1) https://en.wikipedia.org/wiki/Metal_(API)

Protip: Downvoting is not the same thing as refuting someone's argument with a cogent one of your own.


Metal is not a standard graphics API. It's a platform-specific Apple thing. I use it and it's pretty nice, but it's not a cross-platform standard.



