Sunday, October 24, 2010

TPI Presentation Summary

This post resulted from typing up the notes taken on flip-charts, which I promised to type and send to the participants in the workshop I did at TesTrek in Toronto.  My thanks go to all the people who were there and participated in the discussion, particularly Lynn McKee, Paul Carvalho, Michael Bolton, and Michael... the other Michael, who did not have business cards and whose last name I don't recall.  That this session took the path it did, and that the discussion had the quality it did, was due very largely, if not entirely, to you.  I know I learned a great deal, and I was the one with the microphone.

My notes:

Test Process Improvement:
Lessons Learned from the Trenches
Flip-chart notes from TesTrek Conference, October 2010

Points made in discussion during presentation portion:
  • (In looking at testing…) How do I add value? (Lynn McKee)
  • Something’s wrong, Customers report “issues”
    • What’s an issue?
    • Issues may not be problems to everyone (Michael Bolton)
    • Expectations don’t match somehow
      • Problem in Requirements or Scope creep?
Points made in discussion of SWOT:
  • Allow your team to make mistakes (Paul Carvalho) 
    • Nothing teaches more than failure… 
  • Understand why you are doing something…
Introduction to SWOT Analysis:

SWOT is a tool for looking at your team’s Strengths and Weaknesses while being aware of external Opportunities and Threats – things that you may be able to take advantage of and things that may block your progress.

These items are from the ideas that were volunteered by participants.

Strengths

Technically Competent
Dedicated
Finds “Good Bugs” fast
Detail Oriented
Shows Craftsmanship / Professional Pride in work
Team Gels
Good Communication (and skills)
Understands Roles
Big Test Case Toolbox
Adaptable
Has the trust of peers and colleagues

Weaknesses

Hard to say “No”
Resistant to change
Low technical knowledge
Poor estimation skills
Staff not as good as they think they are
Lack of creativity

Summary

The conversation around these points was important, and I allowed it to flow freely, believing it bore greater value than walking through a planned exercise. It was interesting to note that the strengths were drawn out very quickly, while the weaknesses took nearly twice as long and ended with far fewer items.

This is almost exactly in line with my experience using this technique.

It is easy to state what a person or team is good at – what their strengths are. Getting down to specifics from the more general terms can be a bit more challenging, but usually bears fruit. Saying out loud (admitting to yourself and your team) what the weaknesses and shortcomings are is far harder. We all have frames around ourselves that limit our vision. We all want to be heroes in our own minds – no one wants to be the villain. Most people want to believe they are good, if not very good, at what they do.

Getting your team together and discussing the weaknesses the team has means that at some point people must trust each other to help improve individual shortcomings. If your list of strengths includes something about “teamwork” and people are not able or are unwilling to be honest with each other (yes, you can be polite and honest at the same time), then the “teamwork” strength needs to be removed from the list.

The greatest single challenge is to face yourself as you are. This is compounded when attempting to do it with, and in front of, co-workers and team members. The leader/manager may need to get help with this very hard task, and with breaking down the barriers that keep frank discussion from occurring. Tempers may flare and nerves will certainly be on edge. The “trick” is to allow cooling-off periods. Perhaps meeting for a couple of hours each day over a couple of weeks, instead of reserving three or four days in a row to do nothing but this, would make it easier. This allows people to talk privately and do their own reality-checks on what happened, or should happen.

Sometimes, the most potent force in this process is having people think about these topics in the back of their minds while working on their “real” work. While focusing on a challenge, don’t be surprised if something related to the SWOT discussions pops into your mind - a revelation you can bring to the next discussion session.

AND SO, in simple language:

• To improve your Test Process, you must improve your team’s testing.
• To improve your testing, you must have a solid understanding of what your team is capable of RIGHT NOW.
• To understand your team’s capability, you must understand your team’s Strengths and Weaknesses.
• If you understand the Strengths and Weaknesses, you can consider what it is that Management or Customers are expecting.
• Recognizing what Management and Customers are expecting becomes your first Opportunity.
• Recognizing Opportunity may reveal things that will block those opportunities: Threats.
• Engaging in this process with your entire team will demonstrate how serious you are about improving the team and making the individuals better at what they do.
• When you make the testing itself better, the Testing Process will be improved.

Saturday, October 23, 2010

Conferences and Vacations and Learning

So, my lady-wife and I don't do "anniversaries" - we do "annual honeymoons."  We (try to) take a week and go be sweet-hearts.  With the schedule at the day-job, getting a week off this summer was "not highly likely" which meant that a day or two here and there was the best one could hope for.  However, I was able to schedule a full week off for our "honeymoon."  We had planned to return to Beaver Island, in Lake Michigan, where we had gone after our wedding.  Then, interesting things happened. 

A colleague (well, a highly regarded tester, speaker on testing, and test consultant) asked if I'd be interested in her putting my name forward as a possible candidate to fill her spot on the schedule at TesTrek, run by QAI.  I was a little surprised, well, a lot surprised, and said I'd be happy, and honored, to be considered.  The only drawback was the timing - the week slated for Beaver Island.  That could be a problem.  The week we try to reserve just for us would turn into a little bit of "us time" and a lot of "conference/work" time.  On the positive side, it was in Toronto.

With a little concern, I approached my Lady Wife and asked what she thought.  Her response was "I LOVE Toronto!"  So, away we went.  As things happened, I found myself in a position to prepare an abstract and submit it to the folks at QAI.  It was approved, which meant getting a presentation together that consisted of more than vague ideas. 

The topic was one that I suspected might be a big draw - Test Process Improvement.  That is one of the "Almost 124.7% certain to be on a Conference Agenda" topics, along with Estimation, Automation and Requirements.  Now, that was not the intimidating part.  The intimidating part was that there was a stack of people who were going to be there who would very probably disagree with me.  I don't have a problem with that.  In fact, I've gotten quite good at having people disagree with me.  I can even be gracious when people can explain why, clearly and with a reasoned argument.  I've been known to get along quite well with people with whom I have a professional disagreement.  Mind you, some folks have a harder time with that.

The thing was, I've done lunch-and-learns and training sessions and presentations for teams I've been on and led and worked with.  I've been doing drumming workshops for many years, in addition to group and private lessons.  But these weren't novices or non-testers I'd be speaking to - they were testers and test managers and consultants and maybe famous testing people.  Gulp.  Some of them were bound to know the topic better than I did.  Then I remembered the abstract and the presentation I had worked so hard on.  This was sharing what I had learned - not what some expert said was the "right" way to do things.  And that was my starting point.

I do not have answers, nor do I have a magic wand to reveal the secrets that need to be revealed.  But I can talk about what I learned myself.  And if some of the people whose writing I had read, and whose ideas I had tried, were sitting in the room - fine!  My hat's off to them, and I'll credit them as the source of those ideas.

Now, I had done "practice runs" with the slides, watching myself in mirrors and such - and done a dry run with volunteer co-workers.  I had three possible paths planned for the exercise portion, depending on the number of people, the room layout and, frankly, how the lead-up to it went.  Five minutes before I was to start, I had the projector ready, a bottle of water handy, the way-cool remote clickey thing to advance the slides was hooked up - and the wireless mic was clipped to my belt.  No worries.

The "Track Host" walked up to introduce me and... the next 30 seconds were a warning.  The "click on" for the wireless mic didn't.  The cool remote thingie... didn't.  I muttered something about testing software and not hardware and dove in.  The next 90 minutes flew by.  I asked questions, people answered, people asked questions, I responded - then attendees responded - then all of a sudden things were cruising. 

Moral of the story - if you have never tried to present on a topic, ANY topic - try it.  It does not need to be at a conference where "major names" are speaking.  It could be a local testing group meeting, a company lunch-and-learn, something.  Maybe a "lightning talk" at a local meeting or regional conference?  It does not need to be a 60 or 90 minute presentation.  But make it something, somewhere.

The fact is, you know something that may help someone else.  Someone else likely has the same kinds of questions you did.  If you ever wondered what you could do to improve yourself - this may be it.  Do something that may help someone else and learn about yourself.  It may also help you meet some really way cool people.

Oh, we had a great honeymoon, too.  Toronto's a great city to visit.

Tuesday, October 12, 2010

Improving Test Processes, Part IV, or The TPI Secret of Secrets

So far, I have rambled about Improving Processes.  In Part I, I wrote about how we may recognize there's a problem, but may not be sure what the problem is.  In Part II, I wrote about the problem of introspection and how hard it can be to see outside ourselves and look at how we really are.  In Part III, I wrote about Don Quixote and the unattainable goal of Process when the Charter and Mission are in disarray.

The simple fact is, each of these things plays a part in what makes up Test Process Improvement.

Now for the secret.  TPI is not the point.  TPI is not the goal.

In the end, TPI doesn't really matter except as a means to the REAL goal. 

The REAL goal is this:  Better Value from your Testing Effort.

The thing is, most humans don't think in a clear fashion.  I know I don't think in a way that can be described as linear in any way, shape or form.  That is particularly true when I'm working on a problem.  If I DID, I would long ago have stopped looking into things I was testing merely because they did not feel right, even when there was nothing on the surface to indicate a problem.  When I have one of those feelings, I sometimes go over what I have for notes, look at the logs from the applications (not the nicely formatted versions, but the raw logs) or poke around in the database.  Sometimes it's nothing.  Sometimes, I sit back and think "Well, look at that.  Where did that come from?"  (Actually, I sometimes say that out loud.)

That is the pay-off for me as a tester.  I found something with a strong likelihood of causing grief for the users/customers which will in turn cause grief for my company. 

I don't know how to describe that in a linear fashion.  I wish I did; I'd probably be able to make a pile of money from it and live comfortably for the rest of my life on the earnings.  The fact is, it's too organic - organic in the sense that Jerry Weinberg used the term the first time I encountered it in this context (Becoming a Technical Leader), not in the chemistry organic/carbon-based sense.

The Test Script (and its companion, the formal Test Process Document) is not the Test.  The Test is the part that is done by the M1-A1 Human Brain.  Using that most powerful tool is the key to gaining value from testing - or improving the value you are currently getting. 

You can have the best Process in the World of Software.  You can have the best Charter and Mission statements.  You can have the best tools money can buy.

Without encouraging your people to think when they are working, and rewarding them when they do it creatively and do it well, none of those other things matter.

Monday, October 4, 2010

Improving Processes, Part III, or, Why Don Quixote's Quest May Have Ended Better Than Yours Will

A few weeks ago, while looking for some other information, I stumbled across the PowerPoint slides of a conference session on Test Process Improvement that I had decided was "not a good fit" for me.  Yeah, I walked out... about 10 minutes into it.

The premise was "If you don't have a Process, you need one.  If you don't have a Process, you have Chaos and Chaos is bad."  Following the obligatory introduction, and some seven minutes of what appeared to be gratuitous assertions, I said, "Enough" and walked out.

Having a Process is not a silver bullet.  Simply having a Process will not magically fix your Chaotic environment.  If you are trying to impose Process on the organization wearing your "Tester" white hat or the plate mail of the Quality Paladin, good luck.  Most places where I've seen Chaos rule, it's because someone with a lot of scrambled eggs on their hat likes it that way.  (I wonder how many metaphors I can pull into one paragraph?  Better quit there.)

However, if you have a Process and no one follows it, the question should be why not?  My previous blog posts (Part II and Part I of this thread) talked about how the "problem" might not be the real problem and how you need to seriously look at what you are doing before you can fix what might need fixing.

When you look long and hard and honestly at what you and your group are doing, and when you find the places where what is done varies from what The Process says, you must determine why this difference exists.

I suspect that it will boil down to a matter of relevance.  The official Process has no relevance to the reality of what actually is needed in those situations.  If it is a one-off, then there may be something that can be tweaked.  If it is a regular occurrence, then the value of The Process comes into question.  If it doesn't work, why pretend it does?  Why bother having it at all?

Granted, The Process may have been relevant at one time and things may have changed since it was introduced.  However, nothing is permanent.  Change is inevitable.  Even The Process may need to be updated from time to time. 

When you do, look to the Purpose your team is to fulfill.  Why do you exist?  What is your Charter?  What is your Mission?  Do you have a Mission?  I'll bet you do, even if you don't know what it is.

To start, look to what Management expects.  If a boss-type is telling you that the Test Process needs improvement, try talking with them.  Discuss what they believe needs to be improved or where the gaps are.  This may become the basis of the group's Charter:

The Quest that you are expected to follow.

What are they seeing as "broken" that needs to be fixed?

If the gist is "there are too many defects being found by customers," ask if there are specific examples.  Anecdotal evidence can paint a compelling story, yet without specific examples you may never be able to find hard facts.  Is this a hunch, or are there specific examples?  Are these defects, as in things that should have been found in testing?

Maybe these are aspects of the application that the customers expected to behave differently than they do?  If so, why is that?  How can that be?  How can the expectations be so different from what you believed they would be?  After all!  The Design and Requirements that you based the tests on matched perfectly!

Let us ask Dulcinea how these things can be so different from what they appear to be.

Saturday, October 2, 2010

Improving Processes, Part II

O wad some Pow'r the giftie gie us
to see oursels as ithers see us!
 - Robert Burns, To a Louse

Right. So if you need that translated, Rabbie was saying this, in English: 

O would some power give us the gift
to see ourselves as others see us.

There are a couple of things that are just hard for most people to do.  One is to change our perspective and see what others see in us.  The really hard part is similar to that: to look at ourselves honestly and see what we are really like.

Any time self-examination comes into play, most folks will avoid it as much as possible.  We can tell wee little fibs and stories and justify actions through many interesting machinations.  Yet when we strip those aside, what are we left with?

That is the really hard part in any form of process improvement.  When we look at ourselves, as individuals or as a group, the challenge is to set aside what we wish we were doing or like to think we do, and focus on our true actions.

If our real processes and the official process don't match, we need to ask ourselves, "Why not?" 

Indeed, to see ourselves as we really are, or as others see us, can be an event we don't want to face.  Yet without doing that, we can never improve.

Friday, October 1, 2010

Improving Processes and Other Stuff, Part 1

I've been teaching a lot of pipe band drumming workshops lately.  Well, "a lot" compared to the last two years, anyway.  They can be hard, but generally are great fun.  By the time I realize how tired I am, I'm halfway home - close enough that a cup of Tim Horton's coffee will get me home (yes, there are some in Michigan).

So, this last session was a mix of absolute beginners and those a step or two beyond that.  They all play in the same band, or aspire to anyway, and so have a common bond.  Part of the workshop organizers' intention is not only to teach the beginners, but to teach the more advanced players how to teach.

That actually is easier than it sounds, at least with drumming.  I get to present an idea, work on some exercises, see who is getting it and who isn't.  If it is a physical thing, there are other exercises to try.  If it is a mental thing or thought process thing, then I present the same basic idea another way - changing the context sometimes makes it easier. 

This last session we were working on triplets.  Cool things, those triplets.  They are also bread-and-butter stuff for pipe band drumming - the kind of thing where, if you don't get them, your future as a pipe band drummer is quite limited.  One guy was having a harder time with them than the other students.  Mind you, these students ranged in age from 8 years to the mid-30s or so.  This particular fellow was doing alright, but was having an issue getting his head around the idea of "three notes where normally there are two."

I handed out a bunch of candy/sweets to the other participants and asked this fellow to play the bit he was having a problem with.  Perfect.  So I asked him to do it again.  Perfect.  Third time was still perfect.  Hmmm... it did not look like the actual "hard bit" was the issue.  So I had him play the exercise from the beginning.  Trainwreck!  Had him slow things down and do it again - same thing.  The second time, I noticed something in what he was doing.  As he got closer to the "hard part," his grip tensed up (he gripped his sticks harder) and the muscles in his forearms tensed visibly - both bad things for drummers.  Try as he might, the sticks simply were not going to do what he wanted them to do.  When he jumped right into it, things worked fine.

If the first step to solving a problem is recognizing you have a problem, how do you know what the problem is?  In this poor fellow's case, he knew he had a problem and simply could not see what it was.  It wasn't the problem he thought he was having - it was something else entirely.  When he stayed relaxed throughout the line of the exercise, he played it flawlessly.  Problem solved.  But what was the problem?

When it comes to process improvement for nearly anything, I try to apply the same approach: there may be a problem; let's see what the symptoms are and see if we can isolate the problem itself - instead of whacking the symptoms or results of the problem.

When looking at Test Process Improvement in particular, the problem that gets described is usually a symptom or a list of symptoms - not the actual problem.  We can stomp on symptoms one at a time without really addressing the crux of the problem.  That will continue to churn and bubble and fester until something else breaks.

The "problems" presented usually are presented in a handfull of ways:
  • Testing is taking too long;
  • Testing costs too much;
  • Too many defects are being found by customers.
Of course, there are other variations, but what I have seen usually falls into one (or more) of those complaints.

When I was younger and not as diplomatic as I am today, my reaction to each of those points generally ran something like "Compared to what?"  Now, in my more mellow state and place of being, I only think that.  Sometimes loudly, but what comes out of the mouth runs something like, "Can you explain what you mean by 'too long' so I can understand better what you expect?  An awful lot of the time our original estimates on test effort or duration are based on the understanding of what the project will entail at the time the estimates are made.  As the project develops, we learn more and that will impact both the revised effort estimates and the actual duration.  What we are doing is based on the instructions given to us and the mandates for what critical processes must be validated.  Perhaps this needs to be reconsidered."

So, I guess that's a long-winded version of "Compared to what?" but it tends to go over a bit better. 

What I do wonder, however, when a boss-type says something like those statements, is "Are those symptoms or problems?"  Are we running over timelines and cost estimates because of things other than lousy estimates?  Are "due dates" being missed because of testing?  Is there a flurry of defects keeping customers from using the new release?

Is there something else going on, and these are perceptions of problems rather than symptoms of problems?