I know this is anathema around here, but this is why I have always liked grant-funded open source work. Whether government or private, someone at a policy level decides that something is important, and pays for development, leading to a new public good.
The development cost is based on the complexity of the work. It doesn't require a royalty payment in order to deploy more copies or to run them at higher loads. The software already exists. Separately, normal economic decisions can be made around support of deployments, e.g. whether to use in-house labor, hire consultants, or subscribe to some service contract. Sometimes, but not always, the users are another grant-funded project.
This model isn't a lottery ticket for the developers or the capital class. But the developers get paid a good wage for the time they spend on a product. I've done it for the majority of the last 30 years, almost like being a conscientious objector to the VC marketing complex.
Unfortunately, there are societal forces working hard against open source public goods. I think regulatory capture is turning the whole security space into a compliance moat for heavily capitalized players. And the higher education cost spiral keeps increasing the overhead at universities, where a lot of these open source developer jobs used to be found. These forces overlap, but I'd say they're not the same thing. The overhead in academia is more than just compliance burden.
And the whole fad-chasing and hustle aspect of contemporary IT is an inflationary process, eroding the value of previously developed open source products. Over my career, production-ready code seems to get an ever-shorter service life. More maintenance and redevelopment work is needed, or else users abandon it for the Next Big Thing. It's been quite a ride for me, following the whole wave of GNU, MIT, BSD, Linux, Python, and scientific computing tools since the early 90s...