March 21, 2003

Showing Test Automation Code

The other day i said that i was disappointed that i didn't see more code at the test automation conference in San Francisco. It wasn't just this conference or these speakers. Most of the code you find published in work on automated testing is tool specific. More general principles are often expressed without example code. I said that one of the problems was a lack of a lingua franca. If i want to express ideas about test automation in code to a general audience, what language should i use?

I decided to survey my library.

Software Test Automation (Fewster & Graham) shows little code. The chapter on test techniques uses a simple pseudocode and the chapter on comparison shows a little Perl code (regular expressions). There isn't much code in the second part of the book either, which features reprints of case studies by other authors.

Automated Software Testing (Dustin, et al) also shows little code. The chapter on test development includes pseudocode and some SQABasic (Rational Robot). It includes a CD, but the CD only has document templates, not code. Indeed, my main criticism of this book is that it is more about how to create a software testing factory than about test automation.

Automated Web Testing Toolkit (Stottlemyer). I couldn't find any code, other than some example web pages. It also contains a CD with documents, but no code. Yet another book on test management posing as a book on automation.

Integrated Test Design and Automation (Buwalda, et al) uses an invented language that looks a lot like Visual Basic. An appendix details its syntax. Some of this code is used to demonstrate important technical issues like synchronization and how to verify different elements of a GUI. Finally, we have something we can dig into.
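
Buwalda's examples stay in the invented language, but those two issues translate directly into any scripting language. Here is a minimal sketch of the same ideas in Perl; the helper names and the imaginary GUI calls in the comments are mine, not the book's: poll for a condition instead of sleeping a fixed time, and compare what the GUI shows against what you expected.

    use strict;
    use warnings;

    # Synchronization: poll for a condition instead of sleeping a fixed time.
    # $check is a code reference that returns true when the application is ready.
    sub wait_until {
        my ($check, $timeout, $interval) = @_;
        $interval ||= 0.5;
        my $deadline = time() + $timeout;
        while (time() < $deadline) {
            return 1 if $check->();
            select(undef, undef, undef, $interval);   # fractional-second sleep
        }
        return 0;
    }

    # Verification: compare what the GUI actually shows to what we expected.
    sub check_equals {
        my ($label, $expected, $actual) = @_;
        if ($expected eq $actual) { print "PASS: $label\n" }
        else { print "FAIL: $label (expected '$expected', got '$actual')\n" }
    }

    # Imaginary usage against an application under test:
    # wait_until(sub { window_exists('Login') }, 10)
    #     or die "Login window never appeared";
    # check_equals('status bar', 'Ready', read_status_bar());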

Automated Testing Handbook (Hayes). A small book with a dash of pseudo code.

Visual Test 6 Bible (Arnold). Lots of code here. It's in Test Basic, a Visual Basic variant. It also contains some useful general principles of test automation, including a chapter on using state models to create "monkeys." And it has a CD with code! But the book presents itself mainly as a reference for Visual Test, rather than as a development of more general principles.
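
The monkey idea is easy to sketch: encode the application's windows as states and the legal actions as transitions, then walk the model at random and flag any step that lands somewhere the model didn't predict. Here is a skeleton in Perl rather than Test Basic; the model is made up, and a real monkey would drive the GUI instead of printing.

    use strict;
    use warnings;

    # A tiny state model: for each window (state), the actions available
    # and the window each action is supposed to lead to.
    my %model = (
        Main       => { 'File>Open' => 'OpenDialog', 'File>Exit' => 'Closed' },
        OpenDialog => { Cancel => 'Main', OK => 'Main' },
    );

    my $state = 'Main';
    for my $step (1 .. 20) {
        last if $state eq 'Closed';
        my @actions  = keys %{ $model{$state} };
        my $action   = $actions[ int rand @actions ];   # pick a random legal action
        my $expected = $model{$state}{$action};

        # A real monkey would drive the GUI here and observe which window
        # actually appeared; this skeleton just trusts the model.
        my $observed = $expected;

        print "step $step: in '$state', did '$action'\n";
        if ($observed ne $expected) {
            print "BUG? expected '$expected' but landed in '$observed'\n";
            last;
        }
        $state = $observed;
    }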

Testing Object-Oriented Systems (Binder). A tome, over 1,000 pages long. It has code in Ada, C++, Eiffel, Java, Objective-C and Smalltalk. Covering that many languages is one of the reasons for the weight of the book. Most of the code consists of examples of code under test, but the chapter on test drivers includes code in all these languages for automating tests.

Java Tools for Extreme Programming (Hightower & Lesiecki). Most of the tools covered are testing tools and most of the text amounts to annotated source code for a case study.

Just Enough Software Test Automation (Mosley & Posey). The chapters on data-driven test automation are the core of the book. Indeed i wish they had written more on this and dropped some of the more generic material on testing (and retitled it Just Enough Data-Driven Testing). There is a lot of SQABasic code in the book. Moreover, the authors provide the full testing framework on their website. (In a few places the book refers to a "CD", but a late decision was made to host the code on the publisher's website instead. Unfortunately, the publisher has had trouble keeping that website up.)
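
Their framework is written in SQABasic and tied to Rational Robot, but the heart of data-driven testing fits in a few lines of any language: keep the test cases in a table and loop a fixed procedure over the rows. Here is a rough sketch in Perl; the file name, row format and add() routine are my own inventions for illustration, not taken from the book.

    use strict;
    use warnings;

    # Stand-in for driving the real application; here it's just arithmetic.
    sub add { return $_[0] + $_[1] }

    # Data-driven loop: each row of the table is one test case.
    # Row format: description,input1,input2,expected
    open my $fh, '<', 'add_cases.csv' or die "can't open add_cases.csv: $!";
    my ($pass, $fail) = (0, 0);
    while (my $line = <$fh>) {
        chomp $line;
        next if $line =~ /^\s*(#|$)/;               # skip comments and blank lines
        my ($desc, $x, $y, $expected) = split /,/, $line;
        my $actual = add($x, $y);
        if ($actual == $expected) { $pass++; print "PASS: $desc\n" }
        else { $fail++; print "FAIL: $desc (expected $expected, got $actual)\n" }
    }
    close $fh;
    print "$pass passed, $fail failed\n";

The point is that once the loop exists, testers can add cases by editing the table rather than the code.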

Mosley was one of the speakers at the conference last week. Because his book is one of the best examples of test automation books that demonstrate principles with code, i was hoping to learn more about the framework and maybe see a demo of it. That didn't happen.

One of the reasons for my focus on this issue is that i'm seeing more and more developers getting involved in test automation. One concern i have is that many of them seem to be unaware of the relevant work that's already been done in testing and test automation. Another is that they'll dismiss much of that work, because so little of it is illustrated with working code.

Posted by bret at 01:53 PM

March 15, 2003

Nothing much new or exciting

I spoke at SQE's Software Test Automation Conference in San Francisco this week. I was disappointed by the conference. First, too many of the talks were general principles without examples. I've been trying to fix this in some of my own presentations, but i realized that it's a more widespread problem. It is partly the result of the lack of any lingua franca for test automation. Much of the work is in proprietary vendorscripts, and it is hard to keep example code from being overshadowed by the quirks of the vendorscripts or of the testing interfaces (indeed, the tool focus is so strong that i think many test automators would have trouble distinguishing the two).

I found myself complaining about this and then realized that a talk i was due to give shortly suffered from the same problem. So about two hours before my talk i whipped together an example to show how you could test Notepad using Perl and the Win32::GuiTest module. (My talk was on how you didn't need to buy a commercial tool to automate your tests.) During my talk i ran the example and showed the code, warts and all. Well, it was mostly warts. But my hope is to set a bar, and no one can accuse me of setting it too high.
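
For anyone who wants to see what that looks like, here is a sketch along the same lines. It isn't the exact script from the talk, and it assumes Windows with the Win32::GuiTest module installed from CPAN; the text, timings and cleanup keystrokes are all approximate.

    use strict;
    use warnings;
    use Win32::GuiTest qw(FindWindowLike SetForegroundWindow SendKeys WMGetText);

    # Win32 Perl idiom: spawn Notepad without waiting for it to exit.
    system(1, "notepad.exe");

    # Crude synchronization: poll until the Notepad window shows up.
    my $notepad;
    for (1 .. 15) {
        ($notepad) = FindWindowLike(0, "Notepad", "");
        last if $notepad;
        sleep 1;
    }
    die "Notepad never appeared\n" unless $notepad;

    # Type some text into the window.
    SetForegroundWindow($notepad);
    SendKeys("Automated with Perl, warts and all.");

    # Verify by reading the text back out of Notepad's edit control.
    my ($edit) = FindWindowLike($notepad, "", "Edit");
    my $text   = WMGetText($edit);
    print $text =~ /warts and all/ ? "PASS\n" : "FAIL: got '$text'\n";

    # Clean up: close without saving (Alt+F4, then No in the save prompt).
    SendKeys("%{F4}");
    sleep 1;
    SendKeys("n");

Crude as it is, it launches the application, synchronizes on the window, drives it, and checks a result.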

So my feeling was that if people are going to present fairly standard solutions to test automation, they should at least have elegant examples. Instead we had lots of bulleted PowerPoint slides. And amazing insularity. One of the keynote speakers, a Rational Robot user, asked the audience what WinRunner's language was -- 4Test? (No, it's TSL.) Even though he's published the code for a testing framework, he didn't show any of it.

Despite all this, there still seems to be an aversion to code. I've heard of an economist who'd published two versions of the same book: one with equations, the other without. The one without sold tons more copies. It seems like there is a fear that showing code will offend the audience. But can this really be true? I realize that frameworks for data-driven testing are popular and that there is a big demand for being able to express tests as words in tables or spreadsheets. But these frameworks require code to function. Shouldn't we be seeing more of this at a test automation conference?
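
To be concrete about the kind of code i mean: even the simplest keyword-driven framework needs an interpreter that maps each word in the table to a routine that drives the application. Here is a bare-bones sketch in Perl; the keywords and rows are made up for illustration, and real handlers would drive the application under test instead of printing.

    use strict;
    use warnings;

    # The table a tester might write in a spreadsheet: one action word per row.
    my @table = (
        [ 'open_app',   'notepad.exe' ],
        [ 'type_text',  'hello'       ],
        [ 'check_text', 'hello'       ],
        [ 'close_app'                 ],
    );

    # The code behind the table: one handler per action word.
    my %handlers = (
        open_app   => sub { print "launch $_[0]\n" },
        type_text  => sub { print "type '$_[0]'\n" },
        check_text => sub { print "verify the text is '$_[0]'\n" },
        close_app  => sub { print "close the application\n" },
    );

    # The interpreter: walk the table and dispatch each keyword.
    for my $row (@table) {
        my ($keyword, @args) = @$row;
        my $handler = $handlers{$keyword}
            or die "no handler for keyword '$keyword'";
        $handler->(@args);
    }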

I'm more willing to forgive weak slides if the ideas are new. But there was little of that either. Part of my dissatisfaction surely rests with the fact that i've seen so many eye-opening test ideas over the past year from the XP community. But they barely made a dent in the conference. I talked about XP a lot, but no one else did. I saw one other reference to JUnit, in a not very agile context.

I got the impression that there were many other attendees with similar complaints. I did notice more developers at the conference than previously, possibly a consequence of XP and test-driven development.

I do have some thoughts on how to get more technical content into conferences like this. I'll post some of them soon. Or write me if you are interested in helping out.

Posted by bret at 03:06 PM

March 01, 2003

We Don't Need Any Testing

Yet another attempt to fire those pesky testers who just slow things down by focusing on the negative. Testing is a reality check, and it is a big threat to fantasy projects (in this case, missile defense).

In that case, the goal is to prevent testing. In another case, the tester found too many problems and was fired. Once again, it's a fantasy project (this time, electronic voting).

Does your software need to work? Or does it just have to give the appearance of working?

Posted by bret at 11:52 AM