You misunderstand how software systems are built. These days they are "assembled" out of building blocks with clean interfaces, similar to how houses are built from prefabricated components. Imagine the AI producing these prefab parts to order; the design of the interfaces and the assembly is still an art.
If you think of your job as a mason stacking bricks to build components, you are mistaken. You need to think of your job as an artist creating systems that work well, built out of prefab components.
There is a story of how Woz wrote the entire software for the early Apple in assembly. Nobody does that anymore, because we have tools like compilers that produce assembly. That didn't take away the jobs of software engineers, because someone now had to produce code at a different layer of abstraction.
Imagine ranting at the compiler 50 years ago because it took away your job of producing assembly code.
I’m thinking about how that boilerplate isn’t going to move the business forward: it lacks domain insight, isn’t structured quite right, ignores the needs of the end users, and looks too much like the code conventions we’ve been trying to move away from (for example). Yes, you can prompt your way out of that, but by the time you have, you could have written the code yourself.
I have never been able to use a large output directly from an LLM. They’re useful for starting from nothing or tickling your brain when you’re blocked on something, but they’re nowhere near making me wonder what I’m doing with my life.
There’s so much more to software than code. I get that the final implementation is code, but the path to that delivery is incredibly complex, especially with existing code belonging to relatively non-technical teams. They need a real human to connect to them, their history, their current and future needs, the nuances of the team’s capabilities, etc. AI is nowhere near serving teams like this, and I have a hard time imagining when that would change.
My hot take is that GPT is getting better every day at generating complicated, correct code that's maintainable, but it's still got quite a ways to go.
On the other hand, today it's already passably good at generating stuff that requires less precision and accuracy (e.g., humanities-type writing, marketing, managerial/secretarial tasks).
So while you might be looking at your own CS job eventually being in jeopardy, whole swaths of alternative careers you aren't looking at are probably way, way more in jeopardy.
It's the business model: a great demo to get people to sign up, then cut resources each release so it loses less money. Rinse and repeat.
Meanwhile, the free version is trying to keep up with other vendors, so they are forced to give away more, and after a while the difference between the paid and free versions favours the free one.
The boilerplate stuff is just that: boilerplate. Of course it can pump out drivel utility code. That doesn't change where the real value is: doing useful stuff with the code, whether it's written by you, your LSP, or an AI.
The robots only take the jobs we give them. I don't give them code; I use LLMs more as a sort of consultant with a pathological urge to lie and please.
Really? It automates the extremely boring and low skill part of your job and you "feel sick"? I feel pretty great about getting to focus on the actual problem rather than on boilerplate.
No, you are absolutely wrong. If you automate it yourself, that's the correct way of doing things: you learn from the process and increase your skill. But no, you chose to do the automation with AI, so you become nothing more than a frontend for putting data in. You are also slowly losing control over the stuff you're doing. I know, today's engineering culture is weird: don't ask questions, it's not your concern, etc. I wonder when that stuff will finally collapse.
This is a boring and tired take, I’m sorry but it is.
Might as well take it to its logical conclusion: if you aren’t creating all your components (I’m talking ICs, transistors, etc.) by hand, are you really learning or increasing your skill? You’ve picked an arbitrary point in the tech stack and declared that anything lower is not needed, but anything higher means you’ve lost control.
It’s like the people that scorn higher-level languages.
Precisely. LLM assistance is how I felt in the 90s about programming in Delphi and C++ Builder. It felt like cheating, but in a good way: I got to solving my problem in no time at all. Except back then it was mostly about the UI, and now it’s for other things as well.
I find it weird that people talk like that. Did you even try to make something (e.g. a side project) with the help of AI?
AI is doing amazing things for me. If I feel I'm out of control because I don't understand some parts of the generated code, I just ask it to explain, even ELI5 if needed. If the explanation doesn't make sense, or it does make sense but isn't where I want to be heading, I point that out and the AI corrects the problem or explains what the options are.
Together we come to a more concise solution.
I'm really amazed/confused why people talk like that about AI.
No, I have no reason to use it. I have piles of my own programs, notes, and docs. If I need to do something, I can refer to those. In the worst case, I have to use a search engine to grab info, which is immediately mirrored.
If it works for you, that's great. But it's interesting that people do NOT see the danger until it's actually too late. Maybe ask AI about it? ;)
I'm trying to wrap my brain around this. I understand that people see danger, but in fact there is no other option for us than to follow evolution. All major companies are embracing AI already and it's being used/pushed upon their customers as well.
There's danger in copying stackoverflow code as-is as well. But would that mean we should not use info that is shared on SO?
> In the worst case, I have to use a search engine to grab info, which is immediately mirrored.
I've never heard anyone say they use a search engine "as a last resort".
> I have piles of my own programs, notes, and docs.
For me, working that way would only succeed if I needed to repeat a project that is, say, 90 percent the same as one of hundreds of previous projects, and stayed with a programming language like C or Fortran.
I'm not sure this is really evolution. Maybe... devolution? Why? Because AI tools like these are "FINAL" tools. I will repeat myself, but I will say it again: smart people around the world develop those tools and provide them as-is to anyone (smart, dumb, adversaries, etc.). Most people have NO clue how they work, but the tools do the job, so who cares. NOW, the problem: all kinds of "managers" and "politicians" who do NOT care about knowledge happily grab those tools and empower them even more. And once those tools are able to sustain themselves (one step before true AI), all of you guys/gals are obsolete. And I mean it, really. No joke. It happened before, and it will happen again, because those in power have NO respect for you.
Wars and genocides happen because some small-minded dickhead in power tries to soothe his ego or fulfill his ambitions and sends millions of people to war. The sad part is that all those people just DO IT, no resistance... like, WTF?
And this is the main risk. If you do NOT get it, I'm out of words.
Also, I have no problem being replaced by true AI. I am mortal, so I will die eventually. And that new AI will be different from humans. Will it be better? I have no idea, but let's give it a chance. Humans had theirs and failed miserably.
Yeah, it seems you are confused. Guys in power cannot do anything by themselves. Usually they are no-skill people; hence they went for power, to give orders. Now imagine a situation where you have an AI that can provide you with any data and solution. All you need is to provide a query, and you will get an answer with an explanation. The next step: it sends a command to an automated factory to actually produce the thing. Now, why would all those people in power need scientists and engineers? They never listen to them anyway, unless it directly benefits them. It's sad that people lack such imagination.
All those scientists and engineers are just providing the tools for their own doom.
I'm done with this thread; I cannot explain it any better.
the derivative and repetitive nature of most coding tasks.
When I ask GPT for some code and it spits out 200 lines of boilerplate in 2 seconds, I feel sick. What am I doing with my life?
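For what it's worth, the "200 lines of boilerplate" in complaints like this is usually exactly this kind of mechanical glue code. Here is a minimal Python sketch of what an LLM can emit in seconds; the class name and fields are invented purely for illustration:

```python
from dataclasses import asdict, dataclass, field

# Hypothetical example of LLM-style boilerplate: a plain record type
# with serialization round-trip helpers. Nothing here requires domain
# insight; it is purely mechanical code.
@dataclass
class UserRecord:
    name: str
    email: str
    tags: list[str] = field(default_factory=list)

    def to_dict(self) -> dict:
        """Flatten the record into a plain dict (e.g. for JSON)."""
        return asdict(self)

    @classmethod
    def from_dict(cls, data: dict) -> "UserRecord":
        """Rebuild a record from a plain dict."""
        return cls(**data)

record = UserRecord(name="Ada", email="ada@example.com", tags=["admin"])
assert UserRecord.from_dict(record.to_dict()) == record
```

The dull part (field declarations, `to_dict`/`from_dict` plumbing) is what the model automates; deciding that a `UserRecord` should exist at all, and what its fields mean to the business, is the part the thread is arguing about.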