For example, when I'm managing, things like refactoring take a back seat to rolling out new product - even though I'm aware of the positive impact of refactoring. When I'm an architect I'd rather minimize code duplication (or triplication) - a long-term gain to team productivity, but at the cost of product in the short term. As a developer I'd rather be focused on building the cool new product than fixing bugs. There are few things I can wholeheartedly stand behind when having to wear all three hats.
However, good automated regression testing is something I'm pretty sure everyone can agree on. It's usually not a huge investment of time, but the payoff is large and grows over time - like a savings account.
From each point of view:
- Developer --> Less Time bug fixing. I can go home before 7pm. Yeah!
- Manager --> Better quality product, risks identified earlier. Fewer screaming customers! Yeah!
- Architect --> Great way to support refactoring with minimal product risk.
- QA --> Less manual drudge work.
As mentioned above, regression testing is first and foremost a "risk reduction" exercise - you want to find bugs in code as fast as possible with minimal effort (automated!). This becomes more important the longer your product has been in production - no-one wants a release to be "three steps forward and two steps back". You want your product to always be increasing its value to your user base. Bugs will happen - you just want them to either be minor ones or stuff in your new code - not the old stuff that people rely on to get their job done.
In any event, people's expectations for software these days are sadly low (Thanks Microsoft!) so delivering anything reasonably reliable gets noticed.
It also acts to reduce the "drudge" work of manual testing. That work is also subject to human memory - unless you have all the test cases written down somewhere. And are they all up to date? You sure? For the most part the answer is no.
Similarly, automated regression tests act to codify and formalize one's experience, so that if, God forbid, you lose a QA person, a product manager, a developer etc., you don't lose the entirety of their knowledge.
Although some would argue this makes those roles more replaceable - in my experience it acts to free up QA, Developers, Product managers to do more valuable - and LESS replaceable work.
It also helps your team be "more proactive and less reactive" (forgive the management speak). As we all know, the more time your team spends fighting fires, the harder it is to have a truly enjoyable workplace. Or at least that's my opinion.
So how do you "regression test"?
0) Stop Here!
First I'm going to assume some things - first and foremost that you have a source control system. If you don't then that's the first order of business - get one! SVN, CVS or whatever (personally I prefer Perforce - actually I love it).
I'm also going to assume your team or organization cares about product quality - no problem right? Think about it though - think about your past work experiences. I've been in several situations where there was no will to do any regression testing of any real merit - I tried to fight the good fight but sooner or later you get labeled as obstructionist. True you can't spend all your time on this - but if you can't spend a few hours per week per developer you're in for trouble. I'm sure I'm not alone in this experience.
1) Unit Tests
Well, JUnit test cases are an obvious place to start. So obvious I won't spend time on them - the benefits are myriad and have been discussed ad infinitum (ad nauseam) here and elsewhere.
The key though is to remember to also focus on unit testing from a product point of view as much as one can - not solely a code point of view. Although code coverage tools are helpful too - 100% code coverage doesn't really mean as much as one would hope (see this great article by Andy Glover).
Also remember you don't just want to test functionality (both positive and negative cases) but also, if you can, performance / scalability and security.
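To make the unit-testing point concrete, here's a minimal sketch of testing both a positive and a negative case. In practice you'd write this with JUnit; it's shown in plain Java so it stands alone, and the `parsePrice` method and its rules are hypothetical examples, not from any real product.

```java
import java.math.BigDecimal;

public class PriceParserTest {

    // Hypothetical production code under test: parses "$1,234.56" style input.
    static BigDecimal parsePrice(String raw) {
        if (raw == null || raw.isEmpty()) {
            throw new IllegalArgumentException("empty price");
        }
        return new BigDecimal(raw.replace("$", "").replace(",", ""));
    }

    // Positive case: well-formed input parses to the expected value.
    static boolean testParsesDollarAmount() {
        return parsePrice("$1,234.56").compareTo(new BigDecimal("1234.56")) == 0;
    }

    // Negative case: bad input must fail loudly, not return garbage.
    static boolean testRejectsEmptyInput() {
        try {
            parsePrice("");
            return false; // should have thrown
        } catch (IllegalArgumentException expected) {
            return true;
        }
    }

    public static void main(String[] args) {
        System.out.println("parsesDollarAmount: " + testParsesDollarAmount());
        System.out.println("rejectsEmptyInput: " + testRejectsEmptyInput());
    }
}
```

Note the negative case is written from a product point of view - "bad input must not silently produce a price" - not just to push up a coverage number.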
2) Automated GUI tools
Assuming your product has a GUI you'll probably find yourself or your QA automation team writing scripts in some tool such as Segue Silk, HP/Mercury WinRunner, Quick Test etc.
These tools are good because it's very hard to unit test a GUI. The paths through the GUI are myriad for anything more than the simplest UI.
However these tests are often much slower than unit tests and require significant resources - both people and machines - to run.
Again, don't just focus on tests for functionality but also look at testing performance/scalability and security (and I'm sure other things too).
3) Project-specific ways
Often you'll find approaches that are specific to your software or project. Two examples I'm very proud to have been a part of are described below.
The first was a product where my firm transmitted data to another firm - financial information about products we had for sale. The products were then advertised on the other firm's web site.
The key problem was that the data was constantly changing (think stock prices) so the information had to be pretty current (not totally real-time but a gap in minutes was about the worst we would tolerate).
Fortunately this customer had a test system which we could attach to - upload data and verify that any changes we made would still work. How to tell it all worked as it was supposed to? I wrote a Java program using HttpUnit that would log in to their web site, find the appropriate product pages, parse the HTML and pull out the prices. Then it would compare that data against what was in my local DB (the data I was supposed to send them).
Once I had run that for an entire day without issues I was extremely confident that all was right with the world.
A nice upside was that I could run that SAME program against our production system (it's read-only after all) to verify that production was fine.
Now the REALLY cool thing was that when something happened in production, e.g. a network outage or some remote problem, I could typically inform my user base and my management very quickly. Often they would know before our business partners were aware that THEIR system was having problems. It's really nice to have that level of confidence in your product - and your users and management notice too, let me tell you.
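The comparison step of that program can be sketched roughly as below. This is a simplified, hypothetical reconstruction: the real program used HttpUnit to log in and scrape the pages, whereas here the scraped and local values are just maps, and the product names are made up.

```java
import java.math.BigDecimal;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class PriceAudit {

    // Returns the product ids whose remote (scraped) price disagrees
    // with our local database, or is missing from the site entirely.
    static List<String> findMismatches(Map<String, BigDecimal> localDb,
                                       Map<String, BigDecimal> remoteSite) {
        List<String> bad = new ArrayList<>();
        for (Map.Entry<String, BigDecimal> e : localDb.entrySet()) {
            BigDecimal remote = remoteSite.get(e.getKey());
            if (remote == null || remote.compareTo(e.getValue()) != 0) {
                bad.add(e.getKey());
            }
        }
        return bad;
    }

    public static void main(String[] args) {
        Map<String, BigDecimal> local = Map.of(
                "FUND-A", new BigDecimal("101.25"),
                "FUND-B", new BigDecimal("99.80"));
        Map<String, BigDecimal> remote = Map.of(
                "FUND-A", new BigDecimal("101.25"),
                "FUND-B", new BigDecimal("99.75")); // stale price on the site
        System.out.println(findMismatches(local, remote)); // [FUND-B]
    }
}
```

Run against the partner's test system it validated our changes; run against production it became a cheap monitoring tool.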
Another good example was more typical of software product firms, where one has many customers running the product. As we solved problems with this product we would get copies of customers' databases to debug issues. In turn we would take several of these customer databases and run them through nightly regression tests.
The test would compare the current release vs. the last known release (considered the "gold copy"). Not only would we diff results, but we'd also compare execution times. And although the environment was not totally locked down for a true performance comparison, a 20%+ discrepancy in performance for more than a day was typically a sign of a potential performance issue in our code, the data model, indexes etc.
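The two checks in that nightly run can be sketched like this. It's a hedged illustration, not the real harness: the record format and the exact threshold handling are assumptions, though the 20% figure matches what we used.

```java
import java.util.List;

public class GoldCopyCheck {

    // Functional check: true if the current release's output differs
    // from the last known-good ("gold copy") output.
    static boolean resultsDiffer(List<String> gold, List<String> current) {
        return !gold.equals(current);
    }

    // Performance check: true if the current run is more than 20% slower
    // than the gold run. Sustained for over a day, that flagged a likely
    // problem in the code, data model, or indexes.
    static boolean performanceRegressed(long goldMillis, long currentMillis) {
        return currentMillis > goldMillis * 1.20;
    }

    public static void main(String[] args) {
        System.out.println(resultsDiffer(
                List.of("row1", "row2"), List.of("row1", "row2-changed")));
        System.out.println(performanceRegressed(1000, 1250)); // 25% slower
    }
}
```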
OK enough bragging . . . let's talk about some best practices.
#1) Find and Test the Gaps
Testing is never done and neither will your regression testing be. Naturally you will (or should) want to test the riskiest pieces first - places that have the largest impact if they break and/or break frequently. Often, too, you should use customer feedback and bug reports to help guide the list of functionality to add to the test. Look for patterns - key areas of misunderstanding or bugs. Code coverage tools can also be helpful, but don't rely on them to tell you when you're done.
Keep a prioritized list of areas to cover - start at the top and work your way down.
#2) If you're not finding bugs, ask why
One way your regression test can stop finding bugs is if it's out of date - if you haven't been adding to your test suite (while you're adding more functionality, say) then you're short-changing yourself. With every release of your code, your regression test should grow or improve.
Another is if the tests being created aren't aligned with the known risky / bug-prone areas.
Another is when your developers are so damn good they don't have bugs anymore - but we all know that's not all that common. Often, though, you'll find it's because they run the regression themselves once they realize how valuable it is.
#3) Have dedicated hardware
You don't need the biggest, baddest machine - a late-model desktop will suffice in most cases. But you need dedicated hardware to do #4.
#4) Run them all the time or at least nightly
Your tests should be running all the time or, at worst, once per night. Once your test suite is big enough there may not be enough time in the day on existing hardware - so you may wish to keep some long-running, lower-priority tests for the weekend, or alternate test suites every other day.
#5) When you max out - throw hardware at it
Hardware is a commodity - it's cheaper to buy some high-end Solaris hardware for $10k than to spend two months of a developer who earns $100k per year plus benefits speeding things up. So when you start to max out your current regression test machines, throw hardware at it first - then improve the performance (unless there's some blindingly obvious quick fix in software).
#6) Critical tests first
In the spirit of "fail fast" and risk mitigation - you first want to perform the tests that will find the most critical items or test bug-prone pieces of code.
#7) Automate Result Interpretation
Once the results from your regression test are ready, it should automatically email the relevant people with a synopsis of the results. A daily regression isn't very useful if it takes an hour to figure out whether you passed. Eventually no-one will read it.
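The synopsis itself can be as simple as one line per run - here's a sketch of what that interpretation step might look like. The format is a made-up example, not from any particular tool; the point is that the verdict comes first so a reader knows it at a glance.

```java
public class RegressionSummary {

    // Boil a run down to a one-line synopsis suitable for the subject
    // line of the nightly results email: verdict first, then the counts.
    static String synopsis(int passed, int failed, int skipped) {
        String verdict = (failed == 0) ? "PASS" : "FAIL";
        return String.format("%s: %d passed, %d failed, %d skipped",
                verdict, passed, failed, skipped);
    }

    public static void main(String[] args) {
        System.out.println(synopsis(412, 0, 3));
        System.out.println(synopsis(410, 2, 3));
    }
}
```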
#8) Add tests continuously
Don't wait for a particular "phase" of the project to add more tests - do so continuously otherwise you'll find that time suddenly disappears and then you're playing catch-up.
#9) Don't reinvent the wheel - use tools
If you're going to automate, don't waste time - use all the tools you can. There are so many great tools out there, from open source to commercial (although I must say I find the commercial tools VERY expensive compared to their utility).
- Canoo WebTest
- AutoHotKey -- if you don't know this already - check this out ASAP!
- Apache POI especially if you produce reports in Word Doc / Excel format
- Good ole Unix diff, pipe and sort (this reminds me why I dislike Windows as a dev platform - Unix tools rock, so thank God for Cygwin)
Don't forget static code analysis tools too (See "Analyze this - put your code on the couch!") to help find bugs (e.g. multi-threading) that your regression tests might miss.
Having a great suite of regression tests brings great confidence to a team of developers - who are, frankly, often a pessimistic lot. It can mean you and your team stand out in the crowd, separating the true engineers (who care about delivering a quality product) from the cowboys.
(p.s. Thanks to Rick for the push!)