Not the person you asked, but my interpretation of “left in the dust” here (not a phrasing I particularly agree with) would be the same way iOS development took off in the 2010s.
There was a land rush to create apps. Basic stuff like flashlights, todo lists, etc., was created and found a huge audience. Development studios were established, and people became very successful from it.
I think the same thing will happen here. There is a first mover advantage. The future is not yet evenly distributed.
You can still start as an iOS developer today, but the opportunity is different.
The introduction of the App Store did not increase developer productivity per se. If anything, it decreased developer productivity: unless you were already a Mac developer, you had to learn a programming language you'd never used, Objective-C (now it's largely Swift, but that's still mainly used on Apple platforms), and a brand-new Apple-specific API, so a lot of your previous programming expertise became obsolete on the new platform. What the App Store did that was valuable to developers was open up a new market and bring in a bunch of new potential customers, iPhone users, who were relatively wealthy and willing to spend money on software.
What new market is brought by LLMs? They can produce as much source code as you like, but how exactly do you monetize that massive amount of source code? If anything, the value of source code and software products will drop as more of it can be produced rapidly.
The only new market I see is actually the developer tool market for LLM fans, essentially a circular market of LLM developers marketing to other LLM developers.
As far as the developer job market is concerned, it's painfully clear that companies are in a mass layoff mood. Whether that's due to LLMs, or whether LLMs are just the cover story, the result is the same. Developer compensation is not on the rise, unless you happen to be recruited by one of the LLM vendors themselves.
My impression is that from the developer perspective, LLMs are a scheme to transfer massive amounts of wealth from developers to the LLM vendors. And you can bet the prices for access to LLMs will go up, up, up over time as developers become hooked and demand increases. To me, the whole "OpenClaw" hype looks like a crowd of gamblers at a casino, putting coins in slot machines. One thing is for certain: the house always wins.
I think it will make prototyping and MVP more accessible to a wider range of people than before. This goes all the way from people who don't know how to code up to people who know very well how to code, but don't have the free time/energy to pursue every idea.
Project activation energy decreases. I think this is a net positive, as it allows more and different things to be started. I'm sure some think it's a net negative for the same reasons. If you're a developer selling the same knowledge and capacity you sold ten years ago, things will change. But that was always the case.
My comparison to iOS was about the market opportunity, and the opportunity for entrepreneurship. It's not magic, not yet anyway. This is the time to go start a company, or build every weird idea that you were never going to get around to.
There are so many opportunities to create software and companies, we're not running out of those just because it's faster to generate some of the code.
What you just said seems reasonable. However, what the earlier commenter said, which led to this subthread, seems unreasonable: those people unwilling to try the tools "are absolutely going to get left in the dust."
Returning to the iOS analogy, though, there was only a short period of time in history when a random developer with a flashlight or fart app could become successful in the App Store. Nowadays, such a new app would flop, if Apple even allowed it, as you admitted: "You can still start as an iOS developer today, but the opportunity is different." The software market in general is not new. There are already a huge number of competitors. Thus, when you say, "This is the time to go start a company, or build every weird idea that you were never going to get around to," it's unclear why this would be the case. Perhaps the barrier to entry for competitors has been lowered, yet the competition is as fierce as ever (unlike in the early App Store).
In any case, there's a huge difference between "the barrier to entry has been lowered" and "those who don't use LLMs will be left in the dust". I think the latter is ridiculous.
Where are the original flashlight and fart app developers now? Hopefully they made enough money to last a lifetime, otherwise they're back in the same boat as everyone else.
> In any case, there's a huge difference between "the barrier to entry has been lowered" and "those who don't use LLMs will be left in the dust". I think the latter is ridiculous.
Yeah, it’s a bit incendiary, I just wanted to turn it into a more useful conversation.
I also think it overstates the case, but I do think it’s an opportunity.
It’s not just that the barrier to entry has been lowered (which it has) but that someone with a lot of existing skill can leverage that. Not everyone can bring that to the table, and not everyone who can is doing so. That’s the current advantage (in my opinion, of course).
All that said, I thought the Vision Pro was going to usher in a new era of computing, so I’m not much of a prognosticator.
I think it's a mistake to defend and/or "reinterpret" the hype, which is not helping to promote the technology to people who aren't bandwagoners. If anything, it drives them away. It's a red flag.
I wish you would just say to the previous commenter, hey, you appear to be exaggerating, and that's not a good idea.
I didn't read the comment as such a direct analogy. It was more recalling a lesson of history that maybe doesn't repeat but probably will rhyme.
The App Store reshuffled the deck. Some people recognized that and took advantage of the decalcification. Some of them did well.
You've recognized some implications of the reshuffle that's currently underway. Maybe you're right that there's a bias toward the LLM vendors. But among all of it, is there a niche you can exploit?
Well the Russian drones are munitions, so that's not comparable. Is the UK dropping bombs on Gaza? I have seen zero reporting to say they have.
The UK might be flying spy planes outside its airspace when its citizens were kidnapped. That's not a "combatant". Was the UK a combatant when flying spy planes near the Ukraine border?
I think you are way off the mark based on reporting, I'm not even sure how you are coming to these stated opinions.
No, they're not dropping munitions, they're simply coordinating with the Israeli military to facilitate the dropping of munitions, doing everything possible save putting boots on the ground (that we know of). So you're right, in that case Britain is more like China in this situation, perfectly blameless.
I spent so much time tuning the WAP site for the forum I worked for back in 2008.
I had some sort of Nokia running on whatever 2kbps networking was going then, and would shave absolutely anything I could to make the forums load slightly faster.
I did that a lot initially, it’s really only with the advent of Claude Code integrated with VS Code that I’m learning more like I would learn from a code review.
It also depends on the project. Work code gets a lot more scrutiny than side projects, for example.
Or, given that OP is presumably a developer who just doesn't focus fully on front-end code, they could skip straight to checking MDN for "center div" and get a How To article (https://developer.mozilla.org/en-US/docs/Web/CSS/How_to/Layo...) as the first result without relying on spicy autocomplete.
Given how often people acknowledge that ai slop needs to be verified, it seems like a shitty way to achieve something like this vs just checking it yourself with well known good reference material.
ChatGPT: Certainly, I'd be glad to help! But first you must drink a verification can to proceed.
Or:
ChatGPT: I'm sorry, you appear to be asking a development-related question, which your current plan does not support. Would you like me to enable "Dev Mode" for an additional $200/month? Drink a verification can to accept charges.
A little hypothesis: a lot of .Net and Java stuff is mainlined from a giant mega corp straight to developers through a curated certification, MVP, blogging, and conference circuit apparatus designed to create unquestioned corporate friendly, highly profitable, dogma. You say ‘website’ and from the letter ‘b’ they’re having a Pavlovian response (“Azure hosted SharePoint, data lake, MSSQL, user directory, analytics, PowerBI, and…”).
Microsoft’s dedication to infusing OpenAI tech into everything seems like a play to cut even those tepid brains out of the loop and capture the vehicles of planning and production. Training your workforce to be dependent on third-party thinking, planning, and advice is an interesting strategy.
This is currently negative expected value over the lifetime of any hardware you can buy today at a reasonable price, which is basically a monster Mac, or several, until Apple folds and raises the price due to RAM shortages.
$2000 will get you 30~50 tokens/s at perfectly usable quantization levels (Q4-Q5) with any of the top five open-weights MoE models. That's not half bad and will only get better!
That's if you are running lightweight models like DeepSeek 32B; anything bigger and it'll drop. Also, RAM and AI-adjacent hardware costs have risen a lot in the last month. It's definitely not $2k for a rig that does 50 tokens a second.
Could you explain how? I can't seem to figure it out.
DeepSeek-V3.2-Exp has 37B active parameters, GLM-4.7 and Kimi K2 have 32B active parameters.
Let's say we are dealing with Q4_K_S quantization for roughly half the size; we still need to move 16 GB 30 times per second, which requires a memory bandwidth of 480 GB/s, or maybe half that if speculative decoding works really well.
Anything GPU-based won't work at that speed, because PCIe 5 provides only 64 GB/s and $2000 cannot buy enough VRAM (~256 GB) for the full model.
That leaves CPU-based systems with high memory bandwidth. DDR5 would work (somewhere around 300 GB/s with 8x 4800MHz modules), but that would cost about twice as much for just the RAM alone, disregarding the rest of the system.
Can you get enough memory bandwidth out of DDR4 somehow?
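For what it's worth, the arithmetic above can be sketched as a quick back-of-envelope calculation. The figures are the thread's assumptions (≈32B active parameters, ≈0.5 bytes/param at Q4-ish quantization), not benchmarks:

```python
# Back-of-envelope: bandwidth needed to hit a target token rate,
# assuming every generated token streams all active weights once.
active_params = 32e9       # active parameters per token (MoE, per the thread)
bytes_per_param = 0.5      # Q4_K_S is roughly 4 bits per parameter
target_tokens_per_s = 30

gb_per_token = active_params * bytes_per_param / 1e9
required_bandwidth = gb_per_token * target_tokens_per_s
print(f"{gb_per_token:.0f} GB/token -> {required_bandwidth:.0f} GB/s")
# -> 16 GB/token -> 480 GB/s
```

Plug in your own hardware's bandwidth to see where the ceiling is: at DDR4's typical ~50 GB/s over two channels you'd be down around 3 tokens/s on this model class.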
Look up AMD's Strix Halo mini-PC such as GMKtec's EVO-X2. I got the one with 128GB of unified RAM (~100GB VRAM) last year for 1900€ excl. VAT; it runs like a beast especially for SOTA/near-SOTA MoE models.
Just you wait until the powers that be take cars away from us! What absolute FOOLS you all are to shape your lives around something that could be taken away from us at any time! How are you going to get to work when gas stations magically disappear off the face of the planet? I ride a horse to work, and y'all are idiots for developing a dependency on cars. Next thing you're gonna tell me is we're going to go to war for oil to protect your way of life.
Definitely. Right now I can access and use them for free without significant annoyance. I'm a canary for enshittification; I'm curious what it's going to look like.
We already had that happen. When GPT 5 was released, it was much less sycophantic. All the sad people with AI girl/boyfriends threw a giant fit because OpenAI "murdered" the "soul" of their "partner". That's why 4o is still available as a legacy model.
I can absolutely see that happening. It's already kind of happened to me a couple of times when I found myself offline and was still trying to work on my local app. Like any addiction, I expect it to cost me some money in the future.