What's the reason? Despite claims to the contrary, I don't believe it stems from incomplete, vague or ever-changing requirements, although those are a factor. I think the issue is that frankly, we mere humans (!) are just not smart enough to build complex systems with so many interacting, moving parts - and to do so in one fell swoop, that is, with a Big Design Up Front (BDUF).
(Other factors include the benefit of being first to market, even if it's with a lot of bugs! Sadly, people's expectations for software are quite low.)
Alternatively you could say that the technologies we use are still too complex . . . .
Take, for example, a typical 3-tier Java/J2EE application. On the presentation tier you have
- JSP pages (probably tens to hundreds of pages or more, each with a myriad of links, buttons, and inter- and intra-page flows)
- Now add Ajax (it makes the user's experience better, but the work as a whole gets harder)
- Struts / Servlets
On the middle tier you have
- Facades / Business Delegates
- Session Beans, Entity Beans, Message Beans
- Probably a great deal more: JMX, logging, multi-threading, JMS messaging, Enterprise Service Buses
- Web Services
On the data tier
- DAOs, ORM mapping (and all that this involves)
- Transactional behavior
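To make the data-tier plumbing concrete, here is a minimal sketch of the DAO pattern in plain Java. The `User`, `UserDao` and `InMemoryUserDao` names are hypothetical, and the in-memory map stands in for the JDBC/ORM machinery and transaction handling a real application would need:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Hypothetical domain object - stands in for an ORM-mapped entity.
class User {
    final long id;
    final String name;
    User(long id, String name) { this.id = id; this.name = name; }
}

// The DAO interface the rest of the application codes against.
interface UserDao {
    void save(User user);
    Optional<User> findById(long id);
}

// In-memory stand-in for the JDBC/ORM implementation; a real DAO
// would also manage connections, transactions and error handling here.
class InMemoryUserDao implements UserDao {
    private final Map<Long, User> table = new HashMap<>();
    public void save(User user) { table.put(user.id, user); }
    public Optional<User> findById(long id) {
        return Optional.ofNullable(table.get(id));
    }
}
```

Note that none of this code has anything to do with the business problem - it exists purely to shuttle objects in and out of storage.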
This is not to slam Java/J2EE - the same problems must be solved in .Net, Ruby and elsewhere: a presentation tier (with its concerns of usability, flow, chattiness of requests, etc.), a middle tier (scalability, threading, caching, user sessions, fail-over, security, etc.) and a data tier (data model extensibility, transactions, etc.).
And we wonder why BDUF doesn't work!? It's like trying to hire someone who speaks nine world languages fluently and being surprised by the lack of resumes! The technology space is so complex that half the time we're not aware of a problem until we hit it. As a CS geek with a streak of "true" engineer, I find this embarrassing.
BDUF is not inherently the problem - *WE* are the problem - we're either not smart enough (not sure we can fix that) or we've made the tools too complex and hard to use (we can address this).
We end up having to write code to deal with things like Cross Site Scripting (XSS) and database table indexes - issues far removed from the problem we're actually trying to solve: building applications that meet an end-user need.
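For instance, every web application carries presentation-tier code whose only job is to HTML-escape user input so it can't be interpreted as script. Here is a sketch of that kind of incidental code - the `HtmlEscaper` helper is hypothetical, and production code should use a vetted encoding library rather than rolling its own:

```java
// Minimal HTML escaping to blunt reflected XSS - illustration only;
// real applications should rely on a well-tested encoding library.
final class HtmlEscaper {
    static String escapeHtml(String input) {
        StringBuilder out = new StringBuilder(input.length());
        for (char c : input.toCharArray()) {
            switch (c) {
                case '<':  out.append("&lt;");   break;
                case '>':  out.append("&gt;");   break;
                case '&':  out.append("&amp;");  break;
                case '"':  out.append("&quot;"); break;
                case '\'': out.append("&#39;");  break;
                default:   out.append(c);
            }
        }
        return out.toString();
    }
}
```

Again: not one line of this moves the user's actual problem forward.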
As Einstein is said to have remarked:
"We can't solve problems by using the same kind of thinking we used when we created them"
In the same way that Java removed the need to handle memory management ourselves, we need something (not *just* a language) to solve these "junky" problems - yeah, they're fun problems to solve, but they're not what our customers, clients or bosses are really paying us for! What's needed is a paradigm shift . . . . 4GLs, MDA and the like all fail because they seek to abstract away or automate what's inherently a very complex space.
As any six-year-old can tell you, automating an error-prone process just means you make more errors in less time!
And as any mathematician would tell you, if you can control the problem space then do so! Why make a problem harder than it needs to be? (Job Security?)
How do we get there? Well, if I knew, I'd be a millionaire - but I think it has to be a solution that focuses on the data: how it's represented, displayed and persisted. If we can radically simplify that, then we're on our way.
Perhaps even our day-to-day IT vocabulary - "data", "code", "interface" - is part of the problem. It's all so very Turing/Shannon/von Neumann: so dry, so linear, so artificial. True paradigm shifts occur when we look outside - FAR outside!
The downside? Probably massive layoffs in the IT industry - the same way mass production and industrialization sounded the death knell for the livelihoods of many skilled artisans.