I had the privilege of visiting and presenting at the 2009 Star West conference in the Disneyland Hotel in Anaheim.
The first thing I noticed was the sheer size: attendance was well over 50% higher than at Star East. This is good news. I feel the Star conferences are core infrastructure for our developing industry, and I’m glad to see them flourish.
I also had the impression that attendees were focused and knowledgeable. I noticed this both in personal discussions with various people, often as informal as a chat at a breakfast table, and in the questions and responses during presentations and forum discussions, including my own presentation. One explanation may be that budgets are limited in most organizations, so only those who have a clear interest and good things to bring to the table themselves are sent to the conference.
Attendees tend to bring their practical experiences to the show. Once they start sharing these with each other and with us speakers, a lot can be learned. This works both ways. For as long as I have been in the business, I have seen the delicate relationship between theory and practice. From an academic perspective the world can seem clear (although a real scientist will never fall into such a trap), but practice can prove resistant. Just going about daily practice, however, is not the solution either. Some theory and/or conceptualizing is needed to provide structure and make things easier. If I didn’t believe that, I wouldn’t bother with my own concepts (like Action Based Testing and Soap Opera Testing, for whatever they’re worth).
Even very practice-driven approaches like extreme programming and exploratory testing don’t flourish without conceptual directions and ideas to organize them. The relationship between theory and practice is in my view an eternal balancing act, one that has basically brought humankind to where it is now (regardless of whether that is always a good place).
Testing needs to be organized. It costs time and money, and you need to understand why you’re doing it and how you will do it. In my own approaches I emphasize high level test design (as opposed to merely scripting tests) as a main driver for this organizing process. Please see my other postings about this. I was very happy with the tutorial by James Whittaker, in which, among other things, he presented “tours” as a way to organize and design tests. These tours would in my view help the high level test design, thus making testing efforts more efficient and effective (“lean and mean”).
On the other hand, theory can easily be more harmful than useful. Rigid life cycles in particular can kill creativity and obscure valuable insights. Cem Kaner, Bret Pettichord and James Bach wrote a very good book: Lessons Learned in Software Testing. They use the term “context driven”, meaning that whatever you do, you should not close your eyes to what you’re working with (my own wording; please read the book). Driving a car is for me a good example: there are clear rules and directions, and you can plan a detailed route to where you want to go, but I would not recommend doing it with your eyes closed…
My own talk, however, was about off-shoring test automation. It is to me a fascinating topic, since you try to marry two ways of saving time and money: (1) applying automation to reduce test execution time and cost, and (2) off-shoring the automation development to save even further on developing that automation.
What we try to do in our own company is to achieve both efficiency and effectiveness, in particular for high volumes of tests. In my view, and in my methods, the two typically go together: focused tests, driven by well thought through high level test designs, lead to high efficiency while at the same time they can be effective. The opposite is also possible: neglecting high level test design will typically result in test sets that are bulky and lame, costing a lot without being very interesting or aggressive. Even worse: their automation can also prove hard to build and maintain.
However, both automation and off-shoring are massively complex operations, and even after many years of experience I’m learning new things every day. Without repeating the entire talk (feel free to contact me for more details), here is some of the gist of it. Because of the complexity I decided to organize the topic as follows:
- four main “challenges”: (1) automation, (2) other country, (3) distance, and (4) time zone differences. All four of these “buckets” add to the complexity, each in their own way
- several “patterns”, each with a definition of symptoms and some possible solutions. In the course of time more patterns could be added, and existing patterns can be refined
One example of a “pattern” is the “solution” pattern. Particularly with automation, a team may encounter problems, whether in subject matter knowledge, in test design, or in automation. In several cases I have seen, the team feels obligated to come up with a “solution”, without necessarily communicating that solution clearly, or without the US-based side paying enough attention or having the appropriate skills to understand it. If such solutions are too ad hoc, they can (1) introduce new problems, and (2) hide the actual problem (the root cause). For example, adding a long sleep to a test action can slow down an entire test run when it is only needed in one instance. And if that sleep is not long enough, the problem can still come back.
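To illustrate the sleep problem, here is a minimal sketch in Python of the usual alternative: polling for the condition you are actually waiting for, with a timeout, instead of a fixed sleep. The names (`wait_until`, `dialog_is_visible`) are illustrative and not taken from any particular test framework.

```python
import time

def wait_until(condition, timeout=10.0, poll_interval=0.1):
    """Poll `condition` until it returns True or `timeout` seconds pass.

    Unlike a fixed sleep, this returns as soon as the condition holds,
    so it only costs time in the one instance where the application
    really is slow, and the timeout makes a genuine failure visible
    instead of hiding it.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if condition():
            return True
        time.sleep(poll_interval)
    return False

# Hypothetical usage: wait for a dialog in a UI test instead of
# adding a fixed 30-second sleep to every test action.
dialog_state = {"visible": False}

def dialog_is_visible():
    # stand-in for a real query against the application under test
    return dialog_state["visible"]

dialog_state["visible"] = True
assert wait_until(dialog_is_visible, timeout=1.0)
```

A helper like this also keeps the root cause visible: when the wait times out, the test fails at the action that was actually slow, rather than passing by luck or failing somewhere else later.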
Bottom line: the conference was fun for me. The sheer variety of ideas and issues coming up shows me that we will need many more conferences, blogs, books and articles to sort it all out. Real practice should be the driver, and we should keep developing our methods to get to the next level.