Thursday, January 31, 2008

What is a Test Case?

I've been training testers for 18 years now. The thing I've dealt with, struggled with, whatever...is that everyone seems to have their own interpretation of what a test case is.

I guess this is on my mind today because I had to sort through it all again in a class. Don't get me wrong - I don't mind the exercise, but it makes me wonder about where we are in the testing profession when there isn't even anything close to a common definition of a test case.

Sure, there's IEEE Standard 829, but even that definition can be taken many different ways, and it does not prescribe a format (not that I think it should).

So, here's my point. Don't get hung up on the format you use; just make sure you understand why you are using it and that it meets your needs.

I use the format I do because I feel it has some distinct advantages in both manual and automated testing. However, I'm cool with the fact that many people will take one of my classes and still create test documentation in their own ways. Hey, whatever works for you is fine with me. I do think, though, that having a common set of test terminology has tremendous value for a test project and organization.

(By the way, I take the view that test cases are small, distinct tests that can be specified as inputs, predicted results, and a set of execution conditions for each item to be tested.)
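
To make that concrete, here's a minimal sketch of a single test case captured that way, using Python's standard unittest module. The add_sales_tax function and the numbers are made up purely for illustration; the format itself is not the point.

    import unittest


    def add_sales_tax(price, rate):
        """Hypothetical function under test -- made up for illustration only."""
        return round(price * (1 + rate), 2)


    class TestAddSalesTax(unittest.TestCase):
        def test_standard_rate(self):
            # Inputs for this test case
            price, rate = 100.00, 0.07
            # Execution condition: the standard (non-exempt) tax rate applies
            # Predicted result: 107.00
            self.assertEqual(add_sales_tax(price, rate), 107.00)


    if __name__ == "__main__":
        unittest.main()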

However, I would suggest that you take a look at how other testers in other organizations define test cases, why they use the definition and format they do, and whether it might be helpful to adopt a different approach. Of course, if it ain't broke, you don't need to fix something just to be fixing it.

Will we ever have a unified view of test cases as a profession? I doubt it. Don't even get me started on test scripts, test plans, test scenarios, QA, QC...

What do you think?

Till next time...

Friday, January 25, 2008

What's Wrong With This Picture?

OK, I don't actually have a picture here, so I'll explain.

The last two or three times I've been at my local post office, I've noticed a friendly gentleman standing by the "Automated Postal Center" (APC), assisting people using it. Meanwhile, at the main counter, there are four windows, one attendant, and a line winding out the door. (Personally, I only try to get counter service at the post office at 12:15 p.m. on Saturday, because everyone thinks they close at 12:00, but they are actually open until 12:30!)

Ideally, the APC would help shorten the long line. However, it takes some people longer to use the machine than it does to get help at the counter. Then there was the day when a local business was at the APC shipping about 40 boxes!

I've used the APC, and I can't really fault the machine or the software. However, I do think the postal rules have become more complex, which leads to people taking longer to read and interpret them.

Now, back to my original thought and a question. Does an automated assistant really help when it must be accompanied by a human assistant?

Maybe it's just a matter of cultural conditioning, but good grief, we've had these things for years now. Whether it's at the post office or the grocery store, it regularly takes a live human being to help people use the technology.

All of this makes me think about software usability. I'm sure due attention has been paid to usability in these automated service machines. Yet people still struggle. I can think of two notable exceptions: ATMs and airline check-in kiosks.

Do you have similar observations, or is it just me?

Oh, and by the way, could we PLEASE get more people at the Post Office counter?