A friend of mine dropped out of a CS program because he couldn’t quite get programming languages. He always said he wished he could just program in English. And now here we are: we can program in English. But, as the article says, the challenge has never been writing the code[1]. The hard part is accurately describing what you’re trying to build. The irony is that English makes that harder, because it is significantly less precise. As the article says, that’s not to dismiss the power of generative AI in software development. There will be faster iterations and more code written. It just still doesn’t solve the hard part, and in some situations it might even make the hard part harder.
[1] Yes, of course there are some cases where the code is the hard part. For the majority of software, however, that is not the case.
I don't think it's strictly an English thing, but a spec thing. Companies have been infuriatingly moving away from having a properly designed spec for whatever system you need to write, toward an ad hoc "just create a microservice that vaguely fits the bill and move on."
So from that perspective it makes sense why companies would rush to replace pesky engineers asking how something should be written. But they're piling on technical debt AND no one even understands the system! And as you add more technical debt, the system context gets larger, which means whatever LLM you're using can fit less and less of it.
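The context-shrinkage point can be made concrete with some rough arithmetic. A sketch (all numbers here are illustrative assumptions, not measurements of any particular model or codebase):

```python
# Rough illustration: as a codebase grows, a fixed-size LLM context
# window covers a shrinking fraction of it. All numbers are assumed.

CONTEXT_TOKENS = 128_000   # assumed context window size
TOKENS_PER_LINE = 10       # rough average tokens per line of source

def context_coverage(loc: int) -> float:
    """Fraction of a codebase (in lines of code) that fits in one context window."""
    total_tokens = loc * TOKENS_PER_LINE
    return min(1.0, CONTEXT_TOKENS / total_tokens)

for loc in (10_000, 100_000, 1_000_000):
    print(f"{loc:>9} LOC -> {context_coverage(loc):.1%} fits in context")
```

Under these assumptions a 10k-line service fits entirely, but a million-line system is mostly invisible to the model on any single call, which is the commenter's point about debt compounding.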
It's madness created by idiots that have no clue what they're doing. I'm glad I'm over the experience hump so when the shit hits the fan and flies into the face of these companies I can demand a premium. But they're going to screw over an entire generation of entry level software engineers along the way.
Yet there will be a subset of that new generation that will be expert at harnessing AI coding tools, so we will be right back where we started… getting pwned by the top kids.
We call this survivorship bias and a foregone conclusion.
The inherent assumption precludes the possibility that organized production fails (including food), and with it goes the ability to survive. If exchange becomes impossible because of imbalances created by such tools alongside money printing (which is very likely), then it's pretty much over for everyone.
All efficient methods of production today depend on brittle, non-resilient supply chains. Food production fails back to pre-industrial methods, and Malthusian reversion from ecological overshoot ensues (i.e., half the world's population dies because there isn't enough food to sustain them).
No idea what you are talking about.
Maybe you should spend some time properly educating yourself on the foundational principles of civilized society, what and how it works, starting with social contract theory.
AI fundamentally destroys the foundations of civilized rational society.
Couple the chaotic disruption of this with a debt trap caused by runaway money printing and you can easily get cascading failures that cause production failures.
Are you familiar with Thomas Malthus?
Here's an overview albeit very simple.
https://www.resilience.org/stories/2005-03-15/overshoot-nuts...
It's flawed to assume you and everyone else can continue with a business-as-usual approach anymore. We have already reached the limits of growth. There is very little time before a fall.
It is madness/chaos, and it will only get worse.
You should be aware that there is no experience hump that will make these things safe. I have a decade of direct IT experience at the principal engineer/admin level and have been out of work for two years now; I'm retraining in HVAC installation since no one's hiring.
What I'm doing is what happens to the most competent people when there is no work to be had. Once I'm in HVAC, I may play around on side projects in IT, but I wouldn't subject myself to more torture by being dependent on my IT experience.
What happens in IT first happens broadly later. IT is a labor multiplier for everyone else. When the labor value of skilled work hits zero broadly, everything collapses.
If you can't find legitimate positions because the fake listings all look legitimate, and you never make it to an interview (a 1:1000 conversion ratio), it won't matter how much you know or how much direct experience you have.
I'm seriously looking at securing off-grid food options, and the necessary people/skills to ensure it can't be stolen.
Non-reserve debt issuance and money printing will continue driving this trend until food production collapses, and then half the global population starves. It's the problem with positive-feedback systems: they run away and cascade toward failure.
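The run-away behavior of positive feedback can be shown with a toy iteration. This is purely illustrative of the feedback-loop claim, not an economic model; the gain values are arbitrary:

```python
# Toy illustration of feedback loops: positive feedback amplifies
# deviations without bound; negative feedback damps them toward zero.

def simulate(gain: float, steps: int = 20, x0: float = 1.0) -> float:
    """Iterate x -> x + gain * x for a fixed number of steps."""
    x = x0
    for _ in range(steps):
        x += gain * x
    return x

print(simulate(+0.2))   # positive feedback: grows roughly 38x in 20 steps
print(simulate(-0.2))   # negative feedback: decays toward zero
```

With a positive gain each step compounds the last one, which is what "run away and cascade" means mechanically; a negative gain pulls the system back to equilibrium instead.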
TIL that formal languages are better at specifying rules than informal languages (it's why legalese exists)
That's not really why legalese exists, or at least not why it's quite as obtuse as it is: https://news.mit.edu/2024/mit-study-explains-laws-incomprehe...
Let's be real: legalese wasn't invented "to convey a false sense of authority" but to settle disputes as objectively as possible while leaving no room for misinterpretation, i.e., by formally specifying the constraints that define a law's applicability in unambiguous language.
The effect in the article you posted is secondary. It's like saying that writing in math notation "conveys a false sense of intelligence." Sure, but that's not the point, and it's mostly a projection of the listener's insecurity and unfamiliarity with the language anyway.
Note the part of the article that says "Lawyers tended to prefer plain English versions of documents, and they rated those versions to be just as enforceable as traditional legal documents."
It's only because they know how to translate from English to a rigid, formal language that they already know.
Just like you can map English to the language you use at work. You had to learn the language first, or you wouldn't have internalized its grammar and semantics.
I'd rather get a complex spec in English than in pseudocode. But I'm still going to write the code in C# or whatever.
That's a bit misleading because it neglects the history and changes of how things were, and how they are now.
Communication by words, and language in general, is the sharing of meaning. At one point each word had a distinct and unique meaning that may have depended on context, but it always followed rational principles in its definition.
This is no longer the case. Words have been corrupted so that, by definition (formal or informal/spoken), they may carry two potentially contradictory meanings. Fallacy and other deceit then creep in, and without a direct, unique, and non-contradictory definition you can't communicate anything, even if it sounds like communication (it isn't when meaning isn't shared).
The hard part in programming is understanding the perspective and the paradigm shifts, as well as the theory, at an intuitive level. In many respects the degree is a gatekept subject, where you are useless outside your very niche specialty. You'd need to take courses in EE, CSE, and CS, and at least through abstract math in the math department, to have an intuitive understanding of the entire stack.
There are impossible problems that cannot be solved by computers, but by changing the structure or requirements we can make many of those problems solvable.
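The halting problem is the classic example: undecidable in general, but restrict the question to "does this program halt within N steps?" and it becomes decidable by direct simulation. A minimal sketch, using Python generators as stand-ins for programs (the helper names are invented for illustration):

```python
# The general halting problem is undecidable, but bounded halting
# ("does it halt within N steps?") is decidable: just run the program
# for at most N steps and see.

def halts_within(program, max_steps: int) -> bool:
    """Decide bounded halting by simulation.

    `program` is a generator function: each yield counts as one step,
    and returning (raising StopIteration) counts as halting.
    """
    gen = program()
    for _ in range(max_steps):
        try:
            next(gen)
        except StopIteration:
            return True   # halted within the step budget
    return False          # still running after max_steps

def halting_program():    # halts after 5 steps
    for _ in range(5):
        yield

def looping_program():    # never halts
    while True:
        yield

print(halts_within(halting_program, 10))   # True
print(halts_within(looping_program, 10))   # False
```

The restructured question is weaker than the original, which is exactly the trade: you give up generality to regain decidability.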
The problem with AI is that it destroys the value of labor at the early stages of the generational pipeline of professionals.
If there is no economic benefit, the best and brightest leave the field for better returns elsewhere. If no one enters at the beginning, no one makes it to the end. Intermediate professionals in their first decade stop existing; senior-level professionals die off from old age. The only people left are those who cannot pivot: the bottom 50-60% who aren't particularly competent but can scrape by when they work alongside competent people.
Then there are also the societal failures this will rapidly accelerate, where production inevitably ceases amid chaos and half the world starves.
AI is worse than Pandora's box. It's a proverbial devil's pleasure palace, where you get everything you want in the short term and self-annihilate in the long term (if you fall prey to its initial allure).