Tuesday, March 25, 2014

Slides from Defect Sampling Presentation at ASTQB Conference.

Hi all,

Thanks to everyone at my presentation at the ASTQB Conference on defect sampling.

Here are my slides from today's presentation:

http://www.softwaretestingtrainingonline.com/cc/Slide%20Shows/Defect%20Sampling%20-%20ASTQB.pdf

Here is the video from Gold Rush:

http://youtu.be/0XXzdNma7O4

Enjoy!




Wednesday, March 19, 2014

Webinar Video and Slides - What to Automate First?

Hi Folks,

Thanks for attending the webinar today on What to Automate First - The Low-Hanging Fruit of Test Automation. If you missed it, here is the video:

http://youtu.be/eo66ouKGyVk

Here are the slides:

http://www.slideshare.net/rrice2000/what-do-we-automate-first

I hope you enjoy the content!

Tuesday, March 04, 2014

Is Software Testing Once Again a "Concept Sale?"

In marketing, there is a concept known as a "concept sale." That is, before you can even discuss features or price, you have to make the case for WHY the product or service is needed.

Back in the day (1990 - 1995), test automation tools were a concept sale. People had to be shown in very clear and compelling terms what test automation was and how it was helpful. After a few years, people understood the need and value, so the questions started moving toward, "Which type of test tool(s) do we need?"

Based on what I see in terms of overall software quality and hear in conversations with C-level execs, I get the feeling that software testing is going down the same path as other quality-related practices, such as Six Sigma, TQM, requirements engineering, standards, etc. Yes, I fear that software testing is quickly becoming a "concept sale."

What used to be considered an essential part of the software development life cycle is now in danger of becoming a secondary and optional part of it. It's certainly not that way in all companies, but I see a clear trend emerging: companies are willing to take huge risks in releasing largely untested software.

It seems that management knows testing is needed but views it as a luxury. The attitude seems to be that defects found "in the wild" can be fixed cheaper and faster than before release. Risk is secondary to release dates.

Does that mean that "testing is dead" after all? Well, it reminds me of the scene from Monty Python and the Holy Grail, where some people are trying to put a man on a cart, but he keeps saying "I'm not quite dead yet."

I have no easy answers on this. All I can say is: 1) keep showing your value by being an information provider of things beyond defects and failures, 2) be professional and continue to work on your skills (get better at providing feedback, learn how to notice details better, become a creative thinker, etc.), and 3) keep making the case that testing is a lifecycle activity, integrated into a development process. (I do think it is significant that the concept of a software lifecycle is also a confusing and elusive topic for many companies.)

The particular method of developing software isn't the issue. The same thinking goes for architecture, requirements, design and all the other disciplines that make solid software. The magic is not in the method. The magic is to understand user needs, deliver to those needs in value-added ways, and know the solution actually works by testing it!

Don't assume that everyone understands why testing is needed.  Be prepared to show the value of your testing at a moment's notice. Dashboards are a good way to do that. In fact, I have a presentation on this posted on YouTube and Slideshare.net.

You never know when you'll have to make a concept sale for testing!

I would be interested in hearing your opinions on this.

Thanks,

Randy

Thursday, February 20, 2014

Tester to Developer Webinar Slides Posted

Hi everyone,

Thanks for attending the webinar today and for all the great conversation. Here are the slides:

http://www.softwaretestingtrainingonline.com/cc/library/The%20Elusive%20Tester%20to%20Developer%20Ratio2014.pdf

The video will be posted very shortly.

Thanks!

Randy

Wednesday, February 12, 2014

Metrics for User Acceptance Testing

Recently I responded to a question at Quora.com about UAT metrics.

"What user acceptance testing metrics are most crucial to a business?"

Here is an expanded version of my answer, with some caveats.

The leading caveat is that you have to be very careful with metrics because they can drive the wrong behavior and decisions. It's like the unemployment rate. The government actually publishes several rates, each with different meanings and assumptions. The one we see on TV is usually the lowest one, which doesn't factor in the people who have given up looking for work. So, the impression might be that the unemployment situation is getting better, while the reality is that a lot of people have left the work force or may be under-employed.

Anyway, back to testing...

If we see metrics as items on a dashboard to help us drive the car (of testing and of projects), that's fine as long as we understand that WE have to drive the car and things happen that are not shown on the dashboard.

Since UAT is often an end-of-project activity, all eyes are on the numbers to know if the project can be deployed on time. So there may be an effort by some stakeholders to make the numbers look as good as possible, as opposed to reflecting reality.

With that said...

One metric I find very telling is how many defects are being found per day or week. You might think of this as the defect discovery velocity. These must be analyzed in terms of severity. So, 10 new minor defects may be more acceptable than 1 critical defect. As the deadline nears, the number of new, critical, defects gains even more importance.
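To make the idea concrete, here is a minimal sketch (with a hypothetical defect log; the data and function are mine, not from any real tracker) of computing discovery velocity, overall and by severity:

```python
from collections import Counter

# Hypothetical defect log: (day_found, severity) pairs.
defects = [
    (1, "minor"), (1, "critical"), (2, "minor"), (2, "minor"),
    (3, "major"), (3, "minor"), (3, "critical"),
]

def discovery_velocity(defects, severity=None):
    """Count of defects found per day, optionally filtered by severity."""
    if severity is not None:
        defects = [d for d in defects if d[1] == severity]
    return dict(Counter(day for day, _ in defects))

# Ten new minor defects may be more acceptable than one critical defect,
# so track the critical trend separately as the deadline nears.
print(discovery_velocity(defects))                       # {1: 2, 2: 2, 3: 3}
print(discovery_velocity(defects, severity="critical"))  # {1: 1, 3: 1}
```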

Another important metric is the number of resolved/unresolved defects. These must also be balanced by severity and should be reflected in the acceptance criteria. Be aware, though, that it is common (and not good) practice to reclassify critical defects as "moderate" to release the system on time. Also, keep in mind that you can "die the death of a thousand paper cuts." In other words, it's possible to have no critical issues, but many small issues that render the application useless.


Acceptance criteria coverage is another key metric, identifying which criteria have and have not been tested. Of course, proceed with great care on this metric as well. Just because a criterion has been tested doesn't mean it was tested well, or even passed the test. In my Structured User Acceptance Testing course, we place a lot of focus on testing the business processes, not just a list of acceptance criteria. That gives a much better idea of validation and whether or not the system will meet user needs in the real world.

Finally, stakeholder acceptance is the ultimate metric: how many of the original acceptance criteria have been formally accepted vs. not accepted? It may be the case that just one key issue holds up the entire project.
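A hedged sketch of how these last two metrics might be computed from a criteria tracker. The criterion names and status values here are illustrative assumptions, not from any real tool:

```python
# Hypothetical acceptance-criteria tracker: criterion -> status.
criteria = {
    "AC-01 login":       "accepted",
    "AC-02 order entry": "accepted",
    "AC-03 reporting":   "tested, failed",
    "AC-04 exports":     "untested",
}

def coverage(criteria):
    """Fraction of criteria that have been tested at all."""
    tested = sum(1 for status in criteria.values() if status != "untested")
    return tested / len(criteria)

def acceptance_rate(criteria):
    """Fraction formally accepted -- note 'tested' is not 'accepted'."""
    accepted = sum(1 for status in criteria.values() if status == "accepted")
    return accepted / len(criteria)

print(f"coverage: {coverage(criteria):.0%}")         # coverage: 75%
print(f"accepted: {acceptance_rate(criteria):.0%}")  # accepted: 50%
```

Note the gap between the two numbers: 75% of the criteria have been exercised, but only 50% are formally accepted, which is exactly the distinction the paragraph above warns about.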

As far as business value is concerned, a business must see the value in UAT and the system to be released. Here is an article I wrote that addresses the value of software quality: The Cost of Software Quality - A Powerful Tool to Show the Value of Software Quality.

I hope this helps and I would love to hear about any metrics for UAT you have found helpful.

Thanks,

Randy


Monday, February 10, 2014

Tester to Developer Ratio Survey

I have a new survey posted for the Tester to Developer Ratio topic. If you have 5 minutes to answer it, I would appreciate it!

http://freeonlinesurveys.com/s.asp?sid=zikft6dtg1mueho417711

Sunday, February 09, 2014

Free Webinar - February 20 2014 - The Elusive Tester to Developer Ratio

You are invited to attend a FREE 30-minute (or so) webinar on Thursday, February 20th, 12:00 CST.

Since 2000, I have been researching the question, "What is the recommended ratio of software testers to developers?" I have written two articles on that topic, with the original article, "The Elusive Tester to Developer Ratio" getting over 30,000 hits on my web site and being cited in many other articles and books.
 
This is an important metric, but it also raises other important questions, such as:
 
  • What if your needs are different from "average"?
  • Is this metric really the best way to plan the staffing of a test organization?
  • What are other, perhaps better, ways to balance your workload?
  • How can small test teams be successful, even in large development organizations?
 
I will also present up-to-date research.

To sign up, just go to http://www.anymeeting.com/PIID=EA52DA86864F3C (You will get automatic reminders beforehand.)

There are limited slots available, so be sure to sign up and show up early to reserve your place. (Last time we had a completely full session.) We will record the session and post it a little later on my YouTube channel.

Feel free to pass this invitation along to a friend!

I hope to see you there!

Thanks,

Randy Rice

Tuesday, February 04, 2014

Revising an Important Software Testing Course

You would think after writing over 60 software testing courses, I would finally get tired, get bored or something. Yet, new topics interest me, so I keep going.

Another challenge is maintaining all this courseware. Thankfully, I have friends like Tom Staab and Tauhida Parveen helping in this effort.

The course we are revising is Security Testing for the Enterprise and the Web. The need has never been greater for security testing, and so many organizations place too much trust in their security policies and procedures. The bad guys don't give a rip about policies and procedures. They are out to exploit software defects to give them greater leverage.

As with any software testing course, the key is application. So, we are revising this course with updated examples and exercises using recent exploits as examples.

We'll have it ready soon, so stay tuned for details.


Friday, January 24, 2014

Resources from "How to Test Without Defined Requirements" Webinar

Here are the files from today's webinar on How to Test Without Defined Requirements. 

Thanks for viewing!


Video: http://youtu.be/RW4FiM2RjxY

(I re-recorded the session due to audio problems during the webinar. I have discovered that the cause of the dropped audio was a DOS attack against the webinar's audio service provider.)

Slide Handouts
1 slide/page:

http://www.softwaretestingtrainingonline.com/cc/public_pdf/Testing%20Without%20Defined%20Requirements.pdf

2 slides/page:

http://www.softwaretestingtrainingonline.com/cc/public_pdf/Testing%20Without%20Defined%20Requirements%202-up.pdf

Other Resources Mentioned:

The article “How to Test Without Defined Requirements” at http://riceconsulting.com/home/index.php/General-Testing/testing-without-defined-requirements.html

Free graphing tools

Yed - http://www.yworks.com
Gliffy - http://www.gliffy.com
FreeMind - http://freemind.sourceforge.net/wiki/index.php/Main_Page
Xmind - http://www.xmind.net/
CTE-XL - http://www.berner-mattner.com

Checklists and Functional definition

Checklists - http://www.riceconsulting.com/public_pdf/ERROR_CONDITIONS_TESTING_CHECKLIST.docx
METS spreadsheets - http://www.riceconsulting.com/public_pdf/METS_Worksheets.xls
Or http://www.gregpaskal.com

MindMap - http://www.riceconsulting.com/public_pdf/Methods.pdf
http://www.riceconsulting.com/public_pdf/Methods.mm

Wednesday, January 22, 2014

Great Deal on ASTQB Conference registration

The earlybird deadline has passed. Or has it? Here is a code good for 24 hours to get 10% off your ASTQB Conference registration: astqb2014h9

Register now and join us in beautiful San Francisco this March 24-26! Build your skills, your professional connections, and your career at the ASTQB Conference:
  1. Learn how the ASTQB Conference can help your software testing career.

  2. Register for the ASTQB Conference now.

  3. Certification isn't required to attend the conference, but we are offering ISTQB Certification exams on March 24th. If you're interested, register for the Foundation Exam or Advanced Exam.
Improve your testing. Improve your career. Learn more now.

I'll be there speaking on Free and Cheap Test tools and a track session on Defect Sampling. I hope to see you there!

Saturday, January 11, 2014

Google Hummingbird and Other Stuff

As a follow-up to my last post about Google Hummingbird, it's becoming clearer what Google is really trying to do with all of their apps and services. They want Google+ to be king over Facebook and other social sites. The problem is, the adoption just isn't there. So, they integrate Google+ with other stuff they own, like YouTube. The search results you see when you "Google" something are based on where you live, where you have been on the web, your activity on Google+ and all kinds of other behavior signals.

Here's an article that I totally agree with:

Sorry, Google+, We Still Won't Come to Your Party.

But, here's the problem. People don't like this level of privacy invasion. I know I don't.

So, the big goal for Google is to own it all by owning all the knowledge about you and your activities. This is attractive for businesses, because this level of information is normally very expensive, but anyone with a website can get it free with Google Analytics.

All I know is that the more I see of this direction, the less I like it.

Here's an interesting test. Try the same search terms in Google, Bing and Yahoo and notice how the Google-affiliated sites seem to float higher. That's cool if you are in the "Google party", but what if you are just looking for helpful information?

More to come...

Friday, December 27, 2013

Google's Hummingbird, Target misses the Target, and the Healthcare Site that Still Doesn't Work

As a software tester, webmaster and business owner, I learned early on that 1) I would have to get good at being online, 2) get good at being found online and then 3) be very good at serving customers well online. When I first learned about Search Engine Optimization back in the late 90's and on until (almost) present day, everything was about keywords.

In case you haven't heard yet or noticed (which is understandable because Google didn't make a big deal of it), Google just installed a new search engine called Hummingbird, not just a tweak like the recent Panda and Penguin. This is a new V8 engine dropped in the car.

I first noticed something was up when my site (www.riceconsulting.com) dropped from the #3 position for "software testing training" to #19, then #23. The sites now at the top (with the exception of SQE, who you would expect to be there) all rank low in the individual things that USED to count, like keywords, backlinks, etc. (I think I'm back up to Page 1 again.)

I started reading the SEO blogs and learned about the significance of Hummingbird, which Google says is minimal with over 90% of sites unaffected. I find that contradictory (why make such a big change if only 10% of sites are affected?), but that's another article for another time.

The point is that Google says it has noticed (perhaps with help of an unnamed government entity) that people are forming their search queries in different ways these days. They cite mobile users as an example. Let's say you are walking down the street and want to get a pizza, so you ask Siri "where can I find pizza in downtown Chicago?" (OK, you probably don't need to ask Siri for that information, but hang with me here.)

So I tried a test with the query "tire chains". I got results for where to buy tire chains, how to apply tire chains, and which types of tire chains are best - all on page one. So, the challenge is clear: with millions of sites available, how do you cut through the clutter? It's essentially the "long tail" effect.

One of the other things that appears to be a criterion for doing well with Hummingbird is a good social site presence. My friend Mickey O'Neill has a great blog post about why a business needs to be on Facebook, even if you don't think you need to be. A lot has to do with how the search engines use social sites to give authority to web sites. (That's another thing Hummingbird likes - sites that are authoritative.)

You may not have a web site and you may not care about how Google ranks sites, but it affects you anyway. That's because if you use Google like most people do, you will have to change how you phrase your searches. You will probably start phrasing them in the form of questions. (If you want a laugh, perform a Google search for "What is Hummingbird?") If you test web sites, I would suggest making SEO tests a part of your test suite, even though you may have people already doing that.
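For what an SEO check might look like inside a test suite, here is a minimal sketch using only Python's standard library. The page fixture, class name, and the two checks (non-empty title, meta description present) are my own illustrative choices, not any standard SEO rule set:

```python
from html.parser import HTMLParser

# A very basic SEO smoke check: does the page have a non-empty <title>
# and a meta description? The HTML below is a stand-in fixture.

class SEOCheck(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

page = ('<html><head><title>Software Testing Training</title>'
        '<meta name="description" content="Courses and consulting."></head></html>')

check = SEOCheck()
check.feed(page)
assert check.title and check.meta_description  # both present: page passes
print(check.title)  # Software Testing Training
```

In a real suite you would feed the parser the live page's HTML and extend the checks (heading structure, canonical links, and so on) to match whatever your SEO people care about.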

The most sobering thing about all this is that basically, everything written about SEO before December 1, 2013 is now obsolete due to Hummingbird. I'm not saying SEO is dead, but with keywords playing such a minor role, now the focus is on quality content and links.

And...one more thing to think about is that Google knows where you have been on the web (along with the unnamed government entities), so they will use that also to select the results they think you want to see.

My prediction is that there will be much more "tweaking" to come on Hummingbird.

OK, now on to some other big topics, briefly....

Boy, oh boy. The data breach at Target stores will have some big ripples. Now it is known that PINs (encrypted) were also snatched. Let's hope the encryption is strong. When I heard the cause had been found and fixed, I kind of chuckled to myself, "Breaking News: Horses Stolen, Gate Is Now Locked."

http://www.nbcnews.com/business/target-confirms-encrypted-pins-were-stolen-recent-data-breach-2D11811618

 "The attack began Nov. 27, the day before the Thanksgiving holiday, and continued until Dec.ember 15, making it the second-largest data breach in U.S. retail history. The largest breach against a U.S. retailer, uncovered in 2007 at TJX Cos. Inc., led to the theft of data from more than 90 million credit cards over about 18 months."

For those of you who may be keeping score, the cost of the TJX data loss was over $250 million.

To me, the lesson is that data theft occurs at many levels, and now it is up to the individual to monitor accounts closely. The bad news is that law enforcement is not equipped to chase down the crooks, so the stores and banks have to absorb the losses, which eventually get passed on to you and me.

I'll have more on this later when some of the facts emerge.

Speaking of facts emerging...

I've been holding back on the whole Healthcare.gov debacle until I could find a point of entry to even make comments. The story still evolves daily. Clearly, this is going to be the classic software project failure story for years to come. I don't think lack of testing was the problem. I think this was classic government procurement meets clueless project management and sponsor oversight, all combined with a fixed date deadline.

The sad part of this story is that people's healthcare is at stake. The performance issues are just the tip of the iceberg. Data security is high-risk, as is data interoperability with insurance companies. I'm sure there will be other shoes to drop in this story.

Oh, and this just in...

http://blogs.marketwatch.com/thetell/2013/12/27/lucky-travelers-score-6-99-tickets-to-hawaii-after-delta-glitch/

"The news traveled like wildfire across Facebook and Twitter — a computer glitch had triggered unbelievable Delta ticket prices on the airline’s website and other travel sites. Among the bargains: roundtrip tickets to Hawaii for $6.99, a first-class flight for $12.83 from Oklahoma City to St. Louis and a $132 fare from Houston to San Francisco, again first-class. Cory Watkins, a travel agent in Oklahoma, told CNN that he’d paid $1,387.38 for 12 flights for himself and clients for first-class trips all over the U.S., saving thousands of dollars.

Perhaps the most unbelievable part of all is that Delta is going to honor the fares. Airline spokesman Trebor Banstetter couldn’t say how many tickets were sold during the fare glitch — which occurred during part of Thursday morning — but said Delta would allow the flight tickets to go ahead."

That, my friends, is the high cost of software defects. The trend does not seem to be getting better. If anything, it seems like each new day brings new stories of software failures. I think the future is bright for software testers. Now, I need to go finish my macro to continually search for "cheap air fares to Hawaii." It will send me a text message each time it finds a fare of $20 or less. :-)

Saturday, November 09, 2013

Software Testing Training

Since I do a lot of training in the field of software testing and QA, I often reflect on what makes training "stick". I also think about how many issues we face on projects are knowledge-based and experience-based.

A few years back, I commuted weekly from Oklahoma City to San Francisco to consult on a long-term testing center of excellence project. My hotel was near the Apple Store and I had "spare time" in the evenings, so I decided to avail myself of the free training offered on how to use common Mac applications. I was amazed at some of the functionality that was not obvious in the applications. Attending these classes reinforced to me that we all need training.

We can flail around, experiment and generally waste time and money, trying to figure things out on our own...

Or...we can take some time and learn first about what we are doing.

We in IT have a disease. I call it the "dive right in" disorder. We tackle problems even before we know what they are. Therefore, we can fail to apply the most effective solution.

Software testing has so many facets that training is required on a continual basis. Two days a year doesn't cut it.

If your company doesn't pay for it, you owe it to yourself to find a way to self-study or invest in your own training. There are more options now than ever. Even buying a $5 used book on testing can yield big results.

Of course, I can help as well. We have all kinds of free tutorials on my YouTube channel, as well as software testing e-learning and other courses.

If you are a test manager, check out some of my in-house software testing courses for your team. If I can help or give any advice, just contact me from my website at http://www.riceconsulting.com

Tuesday, October 22, 2013

Principles Before Practice - Utah QA Group

Thanks to everyone that came out to hear my presentation tonight at the Utah QA group!

Here are the slides in PDF format.  I'll be posting the video soon on YouTube.

http://www.riceconsulting.com/public_pdf/Principles%20Before%20Practice%20-%20SLC.pdf

Thanks,

Randy

Tuesday, October 01, 2013

Obamacare, Where Art Thou?

There are so many things that come to my mind about the Obamacare launch today.

1) That so many state exchanges experienced problems makes me wonder whether any of these IT shops have heard of performance testing. Failover servers? Load balancing? In a way, this was almost engineered (my apologies to all engineers) to fail, because it's the "everybody shows up at the same time" scenario.

2) There were reportedly functional defects in some states that prevented people from even setting up a user account.

3) Once again, the idea prevailed that just because someone in government declared "Let there be a system for...", people assumed the resulting system would be on schedule, of adequate quality, etc. There are no magic IT wands. But, on the other hand, how hard is it to build a web site that is just a directory to other sites? Of course, I'm just the consultant looking in from the outside. I've seen simple problems grow into complex monsters once vendors and government meet.

4) Then, of course, there are the flaws in the requirements concerning rate calculations.

I'm glad my state of Oklahoma opted out of building its own exchange.

I will be surprised if the problems are resolved quickly. I've seen these situations before and the more people try and fail to get access, the more they keep trying. It's a death spiral of performance.

Maybe, the people from United Airlines and the various state exchanges could get together and we could all have free insurance!

******* Update *******
10/6/13
In USA Today (http://www.usatoday.com/story/news/nation/2013/10/05/health-care-website-repairs/2927597/) we find the following:

"U.S. Chief Technology Officer Todd Park said the government expected HealthCare.gov to draw 50,000 to 60,000 simultaneous users, but instead it has drawn as many as 250,000 at a time since it launched Oct. 1." and "These bugs were functions of volume,'' Park said. "Take away the volume and it works.''

So, it appears that one contributing factor to the "bugs" (I would suggest this is system failure, not just a "bug") is that the performance targets were set way too low. This is like the infamous Victoria's Secret online fashion show failure at halftime of the Super Bowl a few years back. In performance testing of new launches, you have to take into account the curiosity factor. In the case of Obamacare, you tell 300 million people that a certain day is the day to check it out and expect only 60,000 people to show up? Come on, guys, you have to set your sights higher than that.
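As a back-of-envelope sketch of "setting your sights higher": inflate the expected steady-state load by a launch "curiosity factor," then add engineering headroom. The multipliers below are my own illustrative assumptions, not HealthCare.gov's actual planning figures:

```python
# Illustrative load-test target sizing for a high-profile launch:
# expected steady-state concurrency, inflated for launch-day curiosity,
# plus engineering headroom. All multipliers are assumptions.

def load_test_target(expected_concurrent, curiosity_factor=5.0, headroom=1.5):
    """Target concurrency to test against for a big public launch."""
    return int(expected_concurrent * curiosity_factor * headroom)

expected = 60_000  # the reported planning number
target = load_test_target(expected)
print(target)  # 450000 -- comfortably above the 250,000 observed peak
```

With the reported 60,000-user plan, even a modest curiosity multiplier would have put the test target well above the 250,000 concurrent users actually observed.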

This is a great lesson in performance testing. You always go for high numbers for big launches (Like the Facebook IPO). Unless, of course, you want to go the public apology route.



Sunday, September 15, 2013

When Defects Go Big Time

I remember a conversation with a test manager many years ago who worked for a major airline. This gentleman told me about the great challenge of ensuring ticket prices are correct. He said that if the price is off by just a few dollars, millions of dollars can be lost in a day - and this was in the 90s!

I'm sure by now you have heard about the free tickets issued by United Airlines recently. Those of us in testing are thinking, "Man, I'm glad that wasn't on my watch."

Regular readers of this blog and my newsletter know that I don't attack companies over public defects. I prefer to use them as learning experiences and let people make up their own minds.

Sometimes in class case studies, we make a list of risks. One of the risks is often "bad PR" or "loss of credibility/trust". I often tell clients the two places they don't want to wind up are on the front page or evening news. It's hard to measure that kind of impact.

In the recent United case, there are measurable costs, as United has decided to honor the bookings.

El Al Airlines had a similar defect on August 8 of last year. According to reports immediately after the defect, the airline said it would honor the fares. But, a day later they walked that decision back. They did eventually honor the prices.  "In all, about 5,000 tickets were sold before the error was fixed. El Al blamed an outside contractor for the mistake." (http://newyork.cbslocal.com/2012/08/08/website-glitch-offers-travelers-a-big-airfare-bargain/)

A few years back when I was working with one of the big travel websites, they had a defect in the currency conversion rates. People were booking rooms in the U.K. that could cost up to $1,000/night for a penny per night. The company only honored the bookings that were part of a package deal, because those customers could rightfully say they didn't know there was an obvious problem. I think they honored something like 4,000 reservations! (By the way, the big travel websites have a huge data quality challenge because they depend on the vendors to provide pricing.)

At the end of the day, it often becomes a PR decision. There have also been legal cases on these kinds of problems, where the case hinges on whether or not the website is an order entry system. Now, if you or I book the wrong dates, they will charge us a fee to change the ticket.

However, in researching this article, I did discover that, "In January, the Department of Transportation enacted a new regulation to help protect consumers when they’re buying airline tickets. It states: 'The seller of the air transportation cannot increase the price of that air transportation to that consumer, even when the fare is a mistake.' But that regulation has never been tested in court. So as far as consumer rights attorney Brian Bromberg is concerned it’s still really up to the passenger to take action." (http://newyork.cbslocal.com/2012/08/09/el-al-to-honor-all-tickets-in-price-snafu/)

Back in 2003, Sheraton had a similar defect where they sold $850/night hotel rooms in Bora Bora for $85. http://usatoday30.usatoday.com/travel/news/2003/2003-01-14-starwood2.htm

Sheraton chose to take the PR hit and not honor the bookings. "Over two days, 136 people booked 2,631 rooms at the cheap rate and some made multiple reservations covering more than two months of vacation, Starwood says. If all the reservations were kept, the glitch would cost the resort $2 million."

In the above referenced article you will also see this little factoid: "United Airlines has had several glitches on United.com that let some passengers pay $25 for San Francisco-Paris flights and, more recently, $5 for Chicago-Denver flights. In each case, United honored the cheap fares." And there have been other United pricing defects, such as tickets to Hong Kong for $40 in July of 2012.

"It's deja vu all over again," to quote Yogi Berra.

The other side of the argument is that if you or I went to the store and an item scanned for $0.01, chances are most people would not feel right about paying the incorrect price. Plus, the cashier would probably call the manager, and they would take 10 minutes to find out the right price.

We don't know the reason yet for the recent United Airline ticket pricing defect, so I can't say much beyond speculation. I would love to see the root cause analysis. I hope United tells the public the cause much like NASDAQ did on the Facebook IPO performance defect.

The part that troubles me is that system defects of all sorts are becoming a pattern with the airline industry. From scheduling systems to ticketing systems and website problems, the stories are almost expected. My real concern is when the safety line will be crossed. I predict it will happen. With today's "systems of systems", there are extremely high levels of system integration. These systems are very difficult to test, to say the least.

I was on a flight once that was delayed because the database on the plane wouldn't work. The mechanic came on board with a CD to fix the problem! Avionics are one thing. The integration between systems is another.

The one lesson I know for sure is that software defects can get expensive, either in direct losses or intangible losses in image and confidence. People are getting used to minor defects in software, and we know there will always be bugs. But just like in Jurassic Park (I recommend the book over the movie), we need to be very careful. Some of these defects can grow into monsters.

I would love to hear your thoughts on this one!

Randy




Friday, September 13, 2013

Software Testing Master Class and Testing Mobile Applications - Salt Lake City, UT

I'm excited to announce two events in October in Salt Lake City:

Software Testing Master Class (Advanced 4-day workshop) October 21 - 24, 2013

This is a unique session that is project-based and covers advanced topics in software test management and software test analysis and design. We will be learning by testing an actual project in four days.

Click here for more details.


Testing Mobile Applications (Full-day tutorial)
Friday, October 25, 2013

Mobile applications are not only the future, they are here, now, and need to be tested. The big question is "How?"  In this tutorial, bring your mobile device(s) and we will explore a framework for testing mobile applications, look at some of the tools and generally get a view of the mobile testing landscape. Click here for more details.



Thursday, September 05, 2013

Book Review - Peopleware, 3rd Edition

Peopleware cover
Click to buy on Amazon.com

I have been a fan of this book from the first edition in 1987 because it brings weight to the human factors in computing. Peopleware, first edition, caused me to think about the relationship between workspace and productivity.

Unfortunately, these “people issues” are prioritized at the bottom of the stack in IT. However, most people in the trenches know that people make or break what we do in IT, both short-term and long-term. The most critical and chronic problems are not technology-related; they are people-related!

A minority of managers fully understand the impact of people in IT projects. The rest of the management population tends to treat people like interchangeable components that can be located anywhere in the company and become instantly productive.

This is one of those books that you wish your manager would read and adopt. The problem is that too often, the management solution to people issues is reorganization or layoffs. The few rare companies that do value people and their long-term contribution have learned that people require time, care, and feeding to be productive.

The value in the third edition of Peopleware is that DeMarco and Lister have had about 25 years to validate the insightful book they originally wrote in 1987. For sure, a lot has changed in the workplace since the 80’s, especially the IT workplace. Cubicles come and cubicles go, and some dysfunction is very much the same. The third edition clarifies many key issues in short and concise chapters that not only point out the problems, but also offer solutions. The third edition is definitely a value-added update to a classic.

One insight in particular I took away from the book is the long-term cost and effectiveness impact of employee turnover. Some companies seem to totally ignore this impact. DeMarco and Lister start with a learning curve assumption of three months for a role with moderate complexity. This is in addition to the existing experience and knowledge a person might have. The curve can be from six months to two years in some companies, which places the net capital investment at $200,000 per person.

So when a valuable person (like yourself) leaves a company, most managers won’t do what it takes to keep you, or even to fix the issues after you leave. Instead, they continue to pay out this cost without even knowing the actual costs incurred. And this doesn’t even include the cost of delayed projects, mistakes made by the replacement person, etc.

In case I haven’t made the point - In my opinion after 35 years in the IT profession, this is the one book I think every IT professional should own, read, and hopefully, apply.

Wednesday, August 21, 2013

Principles Before Practice

I've been thinking about the role that principles play in the context of software testing. As I sat with a test analyst a while back, consulting on a test method, I could see that there were quite a few variables that had to be considered.

It reinforced for me that good test design is a very nuanced thing. It's more than just "Step 1, Step 2, ..."

Many times, people get frustrated because they do things without understanding the rationale. People may learn a new test technique, try to apply it, and fail because they didn't really understand the nuances of the situation they are in and how those nuances affect the technique they are using.

I'll never forget what someone told me a long time ago about trying to teach someone a skill. They used the analogy of washing dishes. This person said, "I can either teach you how to wash every kind of item in every situation, or I can teach you the principles and let you figure out the rest." I thought the second option sounded reasonable.

There are many ways to wash dishes. Even with appliances, there are some principles that really help. I know because I violate them sometimes and have to repeat the entire load. Things like:

  • Rinse off the big stuff first.
  • Save the really messy dishes until the end so you don't get everything else in the sink messy too.
  • Use hot water, but not too hot or else you will scald yourself.
  • Be careful with sharp knives in sudsy water.
  • You get the idea...

In testing, there are some similar principles, like:

  • Take some sample tests early and find where the big problem areas seem to be.
  • Don't test the really complex areas at first. Get your bearings first.
  • Have strong tests, but if you make every test strong, you may not have time to finish.
  • Early testing is good, except when the thing you are testing isn't ready even for early testing.
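The first principle above, sampling early to find the big problem areas, can even be sketched in a few lines of code. Here is a toy Python sketch; the area names and defect counts are made up purely for illustration:

```python
# Toy sketch: run a small, equal sample of early tests in each area,
# then rank the areas by observed defect rate to decide where to
# focus deeper testing first. All data here is hypothetical.
sample_results = {
    "login":   {"tests_run": 5, "defects_found": 0},
    "reports": {"tests_run": 5, "defects_found": 3},
    "billing": {"tests_run": 5, "defects_found": 1},
}

def defect_rate(stats):
    """Fraction of sampled tests that found a defect."""
    return stats["defects_found"] / stats["tests_run"]

# Highest observed defect rate first -- test the riskiest areas early.
priority = sorted(sample_results,
                  key=lambda area: defect_rate(sample_results[area]),
                  reverse=True)
print(priority)  # ['reports', 'billing', 'login']
```

The point isn't the code; it's the principle behind it: a cheap early sample tells you where the monsters are likely hiding before you commit your full testing budget.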

Principles come before practices because they build an understanding of WHY something is done a particular way. Without the WHY, the WHAT can become meaningless and wasteful. See, there's the principle behind principles.

So the next time you are conveying your testing knowledge, be sure to convey the principles first.

Keep on testing!

Randy