
I think everyone is missing the bigger picture.

This is not a matter of whether AI will replace humans wholesale. There are three more predominant effects:

1. You’ll need fewer humans to do the same task. In other forms of automation, this has led to a decrease in employment.

2. The supply of capable humans increases dramatically.

3. Expertise is no longer a perfect moat.

I’ve seen point 2 firsthand: my sister nearly flunked a coding class in college, but now she’s writing small apps for her IT company.

And for all of you who pooh-pooh that as unsustainable: I became proficient in Rust in a week, and I picked up Svelte in a day. I’ve written a few shaders too! The code I’ve written is pristine. All those conversations about “should I learn X to be employed” are totally moot. Yes, APL would be harder, but it’s definitely doable. This is an example of point 3.

Overall, this will surely cause wage growth to slow and maybe even reverse. In turn, job opportunities will dry up and unemployment might ensue.

For those who still don’t believe, air traffic controllers are a great thought experiment—they’re paid quite nicely. What happens if you build tools so that you can train and employ 30% of the population instead of just 10%?



"I became proficient in Rust in a week". How did you evaluate that if you weren't an expert in Rust to begin with? What does proficient mean to you? Also, are you advocating we get rid of air traffic controllers with AI? How would we train the AI? What model would you use? If you can't solve a safety critical problem from first principles, there is no way an AI should be in the loop. This makes no sense.

Cynically, I'm happy we have this AI generated code. It's gonna create so much garbage and they'll have to pay good senior engineers more money to clean it all up.


To your second point, we’re seeing a huge comeback of vulnerabilities that were “mostly gone”: things like very basic RCEs and SQLi. This is a great thing for security workers as well.
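
To make the SQLi half concrete, here is a minimal sketch (purely illustrative: rusqlite is just a stand-in driver and the function names are made up) of the injectable pattern that keeps resurfacing, next to the parameterized form:

    // Hypothetical example: string-built SQL vs. a bound parameter.
    use rusqlite::{params, Connection, Result};

    // Vulnerable: user input is spliced into the SQL text, so a name like
    // "x' OR '1'='1" rewrites the query itself.
    fn find_user_injectable(conn: &Connection, name: &str) -> Result<i64> {
        let sql = format!("SELECT id FROM users WHERE name = '{}'", name);
        conn.query_row(&sql, [], |row| row.get(0))
    }

    // Safe: the driver binds the value, so the input can never change the
    // shape of the statement.
    fn find_user_parameterized(conn: &Connection, name: &str) -> Result<i64> {
        conn.query_row(
            "SELECT id FROM users WHERE name = ?1",
            params![name],
            |row| row.get(0),
        )
    }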


I don't understand: no one ever needed an LLM to automate air traffic control; 1980s tech could do that just fine. The reason controllers continue to exist is essentially cultural. The industry fell into a local-maximum trap, and now the entire industry and its governance are incapable of lifting themselves out of it; instead they come up with stuff like "standardized phrases for the voice comms that we have inexplicably made crucial to the entire system" while riding cultural clichés like "the pilot must be in control" as they continue manual flight into big rocks.


Can you talk about Rust without your friend computer?


Of course not! But I can definitely ship useful tools, and I could learn to talk the talk in a tenth of the time it would otherwise have taken.

Which is my point: this is not about replacement, it's about reducing the need and increasing the supply.


How are you going to ship a tool you don't understand? What are you going to do when it breaks? How are you going to debug issues in a language you don't understand? How do you know the code the LLM generated is correct?

LLMs absolutely help me pick up new skills faster, but if you can't have a discussion about Rust and Svelte, no, you didn't learn them. I'm making a lot of progress learning deep learning, and ChatGPT has been critical for me to do so. But I still have to read books, research papers, and my framework's documentation. And it's still taking a long time. If I hadn't read the books, I wouldn't know what questions to ask or how to evaluate whether ChatGPT is completely off base (which happens all the time).


Can you talk about assembly without the internet?

I fully understand your point and even agree with it to an extent. LLMs are just another layer of abstraction, like C is an abstraction for asm is an abstraction for binary is an abstraction for transistors... we all stand on the shoulders of giants. We write code to accomplish a task, not the other way around.


I think friction is important to learning and expertise. LLMs are great tools if you view them as compression. I think calculators are a good example: people like to bring those up as a gotcha, but an alarming number of people are now innumerate when it comes to basic receipt math or comprehending orders of magnitude.


It is absolutely essential that we still have experts who know the details. LLMs are just the tide that lifts all ships.


> Can you talk about assembly without the internet?

Yes.

Can you not?


> I became proficient in Rust in a week, and I picked up Svelte in a day. I’ve written a few shaders too! The code I’ve written is pristine. All those conversations about “should I learn X to be employed” are totally moot.

fucking lmao


My point is that once you learn X, your time to learn and ship Y is dramatically reduced.

It would have taken me a month to write the GPU code I needed in Blender, and I had everything working in a week.

And none of this was "vibed": I understand exactly what each line does.


You did not and you are not proficient. LLMs and AI in general cater to your insecurities. An actual good human mentor will wipe the floor with your arrogance and you'll be better for it.


I think you're under the impression that I am not a software engineer. I already know C, and I've even shipped a very small, popular, security-sensitive open source library in C, so I am certainly proficient enough to rewrite Python into Rust for performance purposes without hiring a Rust engineer, or to write shaders to help debug models in Blender.

My point is that LLMs make it 10x easier to adapt and transition to new languages, so whatever moat someone had by being a "Rust developer" is now significantly erased. Anyone with solid systems programming experience could switch from C/C++ to Rust with the help of an LLM and be proficient in a week or two's time. By proficient, I mean able to ship valuable features. Sure, they'll have to lean on an LLM to smooth out new concepts like borrow checking, but they'll surely be able to deliver given how strict the Rust compiler already is.
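
As a rough illustration of that borrow-checking adjustment (purely hypothetical code, names made up), the rule that tends to bite a C developer first is the one against aliasing a collection while mutating it:

    // A C-style habit the borrow checker rejects, and the version it accepts.
    fn main() {
        let mut names = vec![String::from("alice"), String::from("bob")];

        // rustc refuses this pattern with error E0502:
        //
        //     let first = &names[0];             // shared borrow starts
        //     names.push(String::from("carol")); // mutation while `first` is live
        //     println!("{first}");               // shared borrow still in use
        //
        // The compiling version is done with the borrowed data before mutating:
        let first = names[0].clone();
        names.push(String::from("carol"));
        println!("{first}, {} names total", names.len());
    }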

I agree fundamentals matter and good mentorship matters! However, good developers will be able to take on much more diverse tasks, which means a bigger supply of talent across every language ecosystem.

For example, I don't feel compelled at all to hire a Svelte/Vue/React developer specifically anymore: any decent frontend developer can race forward with the help of an LLM.


I realize I came across as harsh, and I surely don't want to judge you personally on your skills, as A) that's not necessary for my point to make sense and B) it's uncalled for. I'm sure you are a capable C developer and I'm sorry for being an asshole - but I am one, so it's hard for me to pretend otherwise...

Being able to program in C is something I can also do, but it sure as heck does not make me a proficient Rust developer if I cobble some shit together from an LLM and call it a day.

I can appreciate how "businesses" think this is valuable, but - and this is often forgotten by salaried developers - as I am not a business owner I have neither the position nor the intention of doing any "business". I am in a position to do "engineering". Business is for someone else to worry about. Shipping "valuable features" is not something I care about. Shipping working and correct features is something I worry about. Perhaps modern developers should call themselves business analysts or something if they wish to stop engineering.

LLMs are souped-up Stack Overflows, and I can't believe my ears when I hear a fellow developer say that someone on Stack Overflow ported some of their code to Rust on request, and that this feature of SO now makes them a proficient Rust developer because they can vaguely follow the code and can now "ship" valuable features.

This is like being able to vaguely follow Kant's Critique of Pure Reason, which is something any amateur can do, compared to being able to engage with it academically and rigorously. I deeply worry about the competence of the next generation - and thus my own safety - if they believe superficial understanding is equivalent to deep mastery.

Edit: interesting side note: I am writing this as a dyed-in-the-wool generalist. Now ain't that something? I don't care if expertise dies off professionally, because I never was an "expert" in anything. I always like using whatever works, and all systems more or less feel equal to me, yet I can also tell that this approach is deeply flawed. In many important ways deep mastery really matters, and I was hoping the rest of society would keep that up, and now they are all becoming generalists who don't know shit, and it worries me...


It would have taken you a month and you would have been able to understand it 100x more.

LLMs are great but what they really excel at is raising the rates of Dunning-Kruger in every industry they touch.


Yes, this is definitely missing a /s, I hope.

Please for the love of god tell me this is a joke.


Re the last sentence, is the answer that more people will die in aviation disasters?



