Yes. But you have to know the requirements before you can pour them into code. You don’t start coding and get granted a vision by the god of logic about where your journey will lead you.
What is Agile for $200?
From the article:
There are times when I’m writing software just for myself where I don’t realize some of the difficulties and challenges until I actually start writing code.
I get what you’re saying, but regardless of whether you have them upfront or discover them along the way, coding is modeling those requirements as we best imagine or understand them… even accidentally: when following practices learned from others, we may not even realize what requirements our modeling has solved.
Sure, sometimes you find requirements that you didn’t think of beforehand.
But what is programming at the core? I’d summarize it like this: “Explaining how to solve a complex problem to a very fast idiot.” And the thing the C-suite likes to forget is that this explanation is given in a specialised language that, at least ideally, leaves no room for interpretation. Because ultimately the computer doesn’t understand Python, Rust, C or even assembly. It understands opcodes given in binary. Assembly may be the closest human-readable approximation, but it still has to be translated for the computer to understand it.
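To make that layers-of-translation point concrete (a minimal sketch; the `add` function here is my own illustration, not anything from the thread): even a trivial Python function isn’t what the machine runs. CPython first compiles it down to bytecode opcodes, which the interpreter’s own machine code then executes.

```python
import dis

def add(a, b):
    return a + b

# Inspect the opcodes CPython compiled the function into.
# The exact opcode names vary by Python version (e.g. BINARY_ADD vs
# BINARY_OP), which is itself a reminder that even "low-level" forms
# are translations on the way down to binary.
opcodes = [ins.opname for ins in dis.get_instructions(add)]
print(opcodes)
```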
So what happens when you “replace” programmers with neural networks? You replace a well-defined, precise language to use for your explanation (because you still have to explain to the fast idiot what you want it to do) with English or whatever natural language you train your network on. A language littered with ambiguities and vague syntax.
Were it a tool to drive nails into wood you would’ve replaced a nail gun with a particularly lumpy rock.
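As a small illustration of that ambiguity gap (my own example, not from the article): the English request “sum the numbers from 1 to 10” leaves open whether 10 is included and whether only integers count, while the equivalent code admits exactly one reading.

```python
# "Sum the numbers from 1 to 10" -- the code pins down every detail the
# English phrasing leaves open: integers only, and 10 is included.
total = sum(range(1, 11))  # range(1, 11) yields the integers 1 through 10
print(total)  # 55
```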
I don’t see neural networks effectively replacing programmers any time soon.
Hmm. I agree with everything you’ve said, but disagree regarding the utility of AI.
Everything we’ve done since the patch cord days has been to create tools that make it easier to reason about our code. We’ve done little or nothing to address the problem of reasoning about requirements and specifications. The closest we’ve come is a kind of iterative development, testing, and user validation process.
I think that ChatGPT and its siblings and descendants are likely not the answer, but I think that it must be possible to create tools to help us reason about requirements and specifications before we start coding. Given the difficulty of processing natural language, I think that whatever those tools are will either be AI systems or draw heavily on AI concepts.
Or maybe not. Maybe it really does take a trained and creative human acting only in concert with others to implement desires.
I agree. A neural network that you can basically treat like a fellow programmer that is always free to help you would be amazing. Rubber duck debugging with an intelligent Ducky. But for it to be useful it has to be able to understand domain knowledge as well as understand and question the explanations you give it. I think that would at least be extremely close to general intelligence.
Yes, you might be right that the kind of system I’m thinking of is too close to general intelligence to be anything we’ll see anytime soon.
I still hold out hope that we can somehow make progress on the problem of gathering requirements that can be turned into specifications that make sense. As far as I can tell, the number of people who can do that job effectively is becoming an ever smaller fraction of what’s actually required to keep up with the demand to create new systems.
But coding seems quite literally like modeling those requirements.
You’re clearly not yet a Sr. Developer. Everyone knows that you start getting apocalyptic software visions the day after promotion.
Source: It was revealed to me in a dream
Those tend to come from the product owner, though.
Look at me! I am the product owner now!