Friday, November 30, 2012

Thinking and Working Agile in an Unbending World

Yeah - that was the title of the session I presented at Agile Testing Days in Potsdam.  

My presentation at Agile Testing Days went through several changes.  The last major one was the time slot.  When I saw the slot I was assigned, the last track session on the last day, I was concerned about the path I had taken.  I realized that most folks would already be tired and probably not ready for one more intense session.

Then Unicorns took over the conference.

No, really.  They were everywhere.  So, I made one more set of revisions - and added Unicorns to mine.  (Why not?)  This also helped me express another idea I had been trying to get out - the unicorns helped me.  So, I ran with it.  Hey!  It's part of being flexible, no?

Sigurdur Birgisson was kind enough to use MindMeister to record my presentation as a live blog.  It's kinda cool.  Thanks Sigge!  Find it here.

Right -

The cool cats all do Agile.  Agile rocks.  But if the company does not "do Agile" then what happens to the talented people working there?  Are they doomed to being Not Cool?  Can they move from the "Real" World to the Unicorn Land of Agile?

There are flavours of the idea of what Agile is that, I believe, lead to their own problems.  We all know about the terms laid out in the Agile Manifesto.  We know how it is phrased.  We know the ideas around...
Individuals & Interactions over Processes & Tools
Working Software over Comprehensive Documentation
Customer Collaboration over Contract Negotiation
Responding to Change over Following a Plan
Fine.  So what does the word agile mean?  Well, being an American, I looked up the definition in Webster's Dictionary.  Here's what I found:

  1. Marked by ready ability to move with quick easy grace
  2. Having a quick resourceful and adaptable character
The question I have is: if this Agile... stuff works, what makes it work?  Does it work better than, say, the non-Agile stuff?  Like, Waterfall?

Consider these questions - 

  • Why is it that some teams are successful and some are not, no matter the approach they use?
  • Why is it that some teams are successful using strong command and control models?
  • Why is it that some teams are successful using agile methodologies?
  • Are Agile Methodologies always, well, flexible?  Do they always make things better? 

From what I can see, this depends a great deal on what is meant by "Agile."  Now, the Agile Methodologies tend to have their own baggage and... stuff.  How can this be?  If we're Agile we're supposed to be "agile," right?

Alas, I fear I'm getting ahead of myself.

Consider this - the Lightweight Models and Methodologies (that became "Agile") were a response to the heavier, more formal models in place.  These were considered slow and unable to respond to change.  They bogged down the development of software and generally, based on some versions of the story, kept people from doing good work.

The thing is, from what I have been able to discover, no process model was ever put in place to do those things.  No model was ever put in place to crush the creativity out of the talented people doing the work.

Why Are Things the Way They Are?

Rather than rail against the horrible, crushing models - which is what I did when I was younger, less experienced and generally less aware than I am now (well, maybe less understanding than I am now) - consider a different tack.

Try these questions instead:
  • Why are these process models in place? 
  • What happened that made this seem like a good solution? 
  • Have things changed since then? 
  • Have the problems resurfaced since then?

Those "horrible, crushing models" were an attempt to address something.  Those questions may help you find out what that something was.  Learning that may help you approach the great question of "Are these process models still needed?"

This is a slightly different question, and often a less painful one, than "Are these process models still relevant?"

Both are important.  The answer to the first can inform your phrasing for the second.

Oftentimes people will fall into the trap of ritual when it comes to software.  "We had this problem, we've done this ever since, and that problem never came back."  Very good!  Now, did the problem never coming back have anything at all to do with the change you made in the model?  Have you encountered other problems?  Has the model gotten more convoluted as you make changes to handle problems?

At what point does your house of cards topple?

Do the steps added or changed in the process model still add value?  Are they worth the expense of executing them, or have they become boxes to check off?

Considering the questions presented, and the answers received from them, can help you take a step toward agile that I find terribly important.  The odd thing is, when I present it as my "definition" of what makes Agile, well, agile, I get nods of agreement, scowls (and sometimes proclamations) of "you're wrong!" and everything in between.

Here is how I look at Agile in an agile way.  Ready?

Doing what needs to be done to deliver the best possible quality and
high-value software product in a timely and cost effective manner.

Is something needed, really?  Or are you checking a box that someone says must be checked?  Does the information exist elsewhere?  If so, why are you repeating it? 

Now, all the Agile folks are saying "Yeah, we know this.  Waterfall is bad."

Consider how many times you have heard (maybe said?) "If you are doing <blah> you are not Agile."  Now, how many times have folks said "If you are not doing <blah2> you are not Agile"?

Rubbish.

If what someone is doing really adds value and makes sense for the context of the organization, and they are delivering good software, how is that wrong?  How is that less-than desirable?

Consider these questions about your own software: 
  • Do your customers have a choice about using your product?  
  • If they purchased it, are they installing the upgrades/patches when you ship them?
  • Are they letting them linger?

When I asked the set of questions about customers with a choice and installing the upgrades at Agile Testing Days, a number of the very confident faces changed from "We're Agile, We're Good." to somewhere between "Deer in the headlights" and "Oh, oh.  How did he know?"

If the software we are making does not do what people, like our customers, need it to do, does it really matter what hoops we jump through to make it?

Consider this paraphrase of Jerry Weinberg's definition of Quality (with the Bach/Bolton Addendum) and think about all the stuff we talk about with Agile and ... yeah.

Quality is value to someone (who matters).

If your software product does not fit that definition, does it matter how you make it?

Do stuff that makes sense to do and don't do stuff that doesn't make sense to do.  

Monday, November 26, 2012

At Agile Testing Days, Retrospective

I arrived in Potsdam Saturday afternoon, with my lady-wife.  I was planning on spending the week at Agile Testing Days, while my wife visited the sights with our daughter, who was flying in separately from DC.  As luck would have it, we were sitting in the airport at Amsterdam when up walks Matt Heusser.  Cool.  We were on the same flight into Berlin.

Suffice to say that the entire week went that way - happy meetings. 

A series of happy meetings with people I had not met in person, but whose blogs, articles and books I read, and whom I follow on twitter.  The list is legion - Gojko Adzic, Huib Schoots, Markus Gartner, Sigge Birgisson, Lisa Crispin, Janet Gregory, Paul Gerrard, Simon Morley, Jurgen Appelo, James Lindsay and... yeah, a bunch.  On top of that, reconnecting with people I already knew, like Matt.  I could not ask for better company.  I had many conversations with Dawn Haynes, Scott Barber, Tony Bruce and more.

There were many people I had heard of, but had little contact with their work - at least not very much, like Meike Mertsch.  Then the folks I simply met - Carlos Ble, Alex Schladebeck, Anne Schuessler, Chris George.

It was inspiring, invigorating, enlightening, sometimes frustrating, and exhausting.

In my blog posts from the week, I tried to capture the excitement I felt, the energy of the people around me and my general impressions of the various presentations I attended and participated in.

The organization of the conference itself was really, really good.  Jose Diaz, Madeleine Griep, Uwe and all the crew did a fantastic job of making sure people were comfortable, had what they needed and felt welcome.  The events were great - the speakers dinner was astounding - with great conversation, good food and wine... and beer (Germany.  Duh.)

Then - Tuesday, talking with Jose - he said "Your family is here?  They must come tonight to the award ceremony.  It will be wonderful!"  Umm - yeah - it was.

Throughout the week, the conversations were astounding.  Hanging with crazy-smart people does wonders for teaching yourself, and learning.  In my case, sharing some small bits of copper among the silver and gold put out by others.

I tried hard to not be the smartest person in the conversation - far from it, in fact.  In a gathering like this - it's easy.

So, yeah.  I'm sitting in my living room thinking back to last week.  Frankly it amazes me all that happened.

I've been to some very good conferences and hung with, talked with and learned from smart people.  While I recognize that many of the people I talked with over the week were very like-minded to my own views on testing - not all were.  In fact, quite a few were not.  We talked, shared ideas and generally set up opportunities for future learning.

Yeah.  This was a great conference for me.  

===

Addendum: Other folks I should have mentioned and simply did not - let's see - Jean-Paul Virwijk (yeah, Arborosa on Twitter); Rob van Steenbergen (rvansteenbergen & TestEvents on twitter); Ralph Jocham, Scott Ambler, Eddie Bruin - and - yeah - a long list - too long to really put here.  It was good.

Thursday, November 22, 2012

Agile Testing Days: Day 4 Live in Potsdam!

Thursday morning is here, the last day of Agile Testing Days in Potsdam, Germany.  I managed to oversleep and miss the start of Lean Coffee.  When I walked past, it seemed they had a good number of folks in the room, broken into two groups.  Broken?  Maybe "refactored" is a better term...

OK, that was a little lame.  I think I may be a little tired yet.

So, here we go with another day.  Today we have keynotes by Ola Ellnestam, Scott Barber and Matt Heusser.  There are a variety of track talks as well.  Rumor has it that at least one will combine unicorns with Harry Potter.

And, we are about to kick off with Ola Ellnestam on Fast Feedback Teams.  Ready? Set? GO!

--

So, Ola launches into a story of trying to explain what he does to his kids.  (Pete Comment: Yeah, if you have kids, it's one of those weird conversations to think about when you deal with software.  Also, kinda digging the hand-drawn slide deck.)  It was a good story about kids and understanding.  It also included the idea of feedback - when things (like games) are predictable, how much fun are they?  Unless you figure out the pattern and your younger brother has not...

Showers are a good example of a "feedback loop."  Depending on how far the shower head is from the faucet handles, you may have a bit of delay - the farther apart the two are, the longer it takes for you to know if the water is the temperature you want.

Reminder - if you remove the feedback mechanism, you are not "closing the loop"; you are kicking it open so the loop never responds.
Reminder - never presume that "everyone knows" - when you are the one who does not know.  

The velocity of the project (or aircraft) will determine the timing for feedback.  One cannot trust a response loop of, oh, a couple of minutes, when the feedback involves aircraft at 33,000 feet.  You kind of need it sooner - like - instantly.

Other times, consider how clear the communication is - the nature of the feedback can impact the understanding involved.  Consider, if our task is to help solve problems, does the solution always involve creating software?  Ola says while he likes software, he likes not creating software if there is another solution.  (Pete Comment: Yeah - I can dig that.)

Instead of letting the backlog grow perpetually - which tends to freak people out when they look at it - consider paring it down - prioritize the list so the stuff that is really wanted/needed is on it.  If the stuff that drops off comes back, then reconsider it.  Don't let yourself get bogged down.

The problem is a bit like a bowl of candy - when the user stories are all "compelling" (Pete: by some measure) it gets really hard to choose.  Limit the candy in the bowl to that which is important.  This can help people understand. Allow the user stories to act as reminders of past conversations.  When that conversation comes around again, perhaps the priority on that story needs to go up.

Ola tells a story about trying to pull from a build server - except there was a time difference between the time stamp on the build server and the machine he was working on.  Problems resulted - incomplete understanding and noise in the response - which caused confusion.
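
(Pete Comment: a minimal sketch of the kind of sanity check that catches this sort of thing - the names and the five-second tolerance are my own invention, nothing from Ola's talk:)

  # Guard against clock skew between a build server and a local machine -
  # compare the two clocks before trusting any build timestamps.
  from datetime import datetime, timedelta, timezone

  MAX_SKEW = timedelta(seconds=5)  # tolerance before timestamps are suspect

  def check_skew(server_now, local_now):
      """Warn when the build server's clock and ours disagree too much."""
      skew = abs(server_now - local_now)
      if skew > MAX_SKEW:
          print(f"WARNING: clocks differ by {skew.total_seconds():.0f}s - "
                "build timestamps may mislead you.")

  # Simulated example: the build server's clock runs two minutes fast.
  local_now = datetime.now(timezone.utc)
  check_skew(local_now + timedelta(minutes=2), local_now)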

Classic example of noise in response - what Ola calls "Chinese Whisper Game" - which I know as the "Telephone Game" - yeah.  Start with one thing and by the time it gets told to everyone in the room and comes back to the first person, it is totally different. 

Instead of looking for improvements in "outer" feedback loops, look at the inner-most loop.  Feedback tends to happen in (generally) concentric loops, often centered around the life-cycle in the methodology in use.  If the initial cycle takes a long time to get feedback, does it matter how efficient (or not) the outer loops are?  Optimizing the inner loop as best you can will give you the opportunity to tighten the outer loops more than you can now.

This is true of classic Scrum cycles - as well as in other examples - like in Banking.  Banking tends to run multiple batch processes each day.  Yeah - that happens a lot more than some folks may realize.  Recognize that the results of overlapping batches may impact each other, and that this shapes the nature and type of feedback.

Moving on to double feedback loops - Generally stated - it is better to do the right thing wrong than the wrong thing right.  For example - a radiator (room heater, for those in the States who don't know about such things) has a thermostat to keep a room at a steady temperature.  Recognizing that the door or window is open may have a bearing on the results - if one is looking at how well (or poorly) the radiator is doing its job.

Bug Reports?  Yeah - those are records of things we did wrong.  We have an option: look at them and figure out what went wrong, or make sure we don't do anything wrong.  Reminder - the easiest way to avoid doing something wrong is to not do anything at all.

To move an organization, particularly a new organization, toward success, sometimes the easiest way is to reduce stuff that does not help.  It may be user stories from the backlog - or it may be existing features of no value that can be removed.  This will close loops that may currently only add noise instead of value.  It can also speed the feedback return so you can do a better job.

Interesting question - What about slow feedback loops - those that start now, but the event for the feedback will not occur for some time?  Well - good question.  Consider Ola's flight to the conference.  He bought round trip tickets on Scandinavian Air (SAS) - except there is a bunch of stuff going on with them right now, and his return ticket may not be "any use."  So, he invested in a backup plan - specifically a one-way ticket on Lufthansa - just in case.  He'll know which one he needs when he goes home.
----
Right - so - I kinda took the morning off to practice my presentation and - well - confer with really smart people.  So, after lunch - Scott Barber is up.

Scott Barber launches his keynote with a clip from 2001: A Space Odyssey - where he describes what was shown as not the dawn of man, but the beginning of development.  He tracks this through manual, waterfall development, into automation - via horsepower and steam engines, with internal combustion engines to follow.

Then we get electricity - which gives us computers and the first computer bug - from there things go downhill.

Scott's assertion is that neither Agile nor Context Driven ideas are new.  They are, in fact, how society, most cultures, most people, live their lives.  He then wonders why so many software shops describe software development in terms of manufacturing rather than in terms of Research and Development.  After all, we're not making widgets (which took a fair amount of R&D before they got to the point where they could be mass-produced).

Ummmmm - yeah - does anyone really mass produce software - other than burning a bunch of CDs and shrink-wrapping them?

So, when it comes to context driven or agile or... whatever - can we really do stuff that people say we do?  Or maybe think we do?

He cites the fondue restaurant at CAST in Colorado, and the dinner with Jerry Weinberg.

And discussing what testing and development were like in the 1960s - satellites and aircraft and military stuff - you MUST know performance testing.  Why?  Because the tolerances were microscopic.  No titles - just a bunch of smart people working together to make good stuff.  Deadlines?  Really?  We have no idea if it will WORK, let alone when it might be delivered.  Oh.  And they tested on paper - because it was faster and better than testing it on the machine.

Did this work?  Well, it put people on the moon and brought them back.

Then two things happened.

Before 1985 (in the US) Software had no value - it could not be sold - legally - as a product. Before then, the software had to do something - Now it just needs to sell and make money.  If it makes money - then why not apply manufacturing principles to it?

Scott then gives an interesting history of testing and development that is painfully accurate and depressing at the same time.  BUT - it resolved into a rainbow of approaches that have a great deal in common, except for the terminology.

DevOps, Agile, Lean, Incremental, Spiral, Iterative, W-Model, V-Model, Waterfall.

Yeah - ewwwwwwwwwwwwwwwwwwww

So - until around 2010, stuff was this way.  Around that rough point in time - something happened...

Software production methods experienced a shift, where they split away, never to reunite.  This gives us these two models:
1. Lean Cloudy Agile DevOps - Or The Unicorn Land
2. Lean-ish Traditional Regulated Auditable - Or The Real World

How do we resolve this stuff?  Simple - we add value.  How do we add value?  We support the business needs.  Yeah.  OK

FLASH!  Test is Dead - well - the old "heads down don't think stuff" is dead.

So, what about the test thing - the no testers at Facebook, etc.?  So what?  If there's a problem, the next patch is in 10 or 15 minutes - the one after that will be another 10 or 15 minutes.  So what?  No one pays for it.  Oh, the only time Facebook actually came down hard?  Ya know what that was?

Justin Bieber got a haircut - and every teeny-bopper girl in the world got on all at once to scream about it.

In Scott's model - Facebook is the model.  Don't worry about titles - worry about what the work is.  Worry about the talented people you are working with - or not working with.

Scott predicts that the R&D / Manufacturing models will reunite - except right now we don't have the same language among ourselves.

Maybe we need to focus instead on what the Management Words are.  If we speak in terms they understand - like use their words - we can get things sorted out. This helps us become an invaluable project team member - not an arrogant tester who acts like your bug is more important than $13M in lost revenue if it doesn't ship on time.  (That is straight off his slide)

Help your team produce business valuable systems - faster and cheaper.  Be a testing expert and a project jack-of-all-trades - Reject the Testing Union mentality.

Do not assume that you can know the entire context of business decisions.  However, you can take  agile testing and develop a skill in place of a role.

The ONLY reason you get paid to test is because some executive thinks it will reduce their time to a bigger yacht.

(Pete Comment:  Ummmm - Yeah.)
 ---
And now for Huib Schoots on Changing the Context: How a Bank Changes their Software Development Methodology.

Huib, until recently, worked with Rabobank International - a bank in the Netherlands that has no shareholders - the depositors own the bank (Pete Comment: Sounds like a Credit Union in the States).

Huib worked with a team doing Bank Operations - doing - well, bank stuff.  The problems when he came in included testing with an indefinite understanding of expected behavior -- not a huge problem, unless the experts can't agree.

BANG - Gauntlet is thrown - Agile is not about KPIs and Hard Measures and Manager stuff.  It's kinda scary.  Manager says - You need templates and... Eewwwwwwwwww.  Not for Huib.

So - the test plans are non-existent and the bosses want stuff that doesn't really work - (Pete Comment: ...and the junk that never seems to make sense to me.)  Instead, he asked if any of them had heard of Rapid Software Testing?  Ummmm - No.

So Huib began working his "Change" toward Context Driven practices, RST, Passion as a tester (and for other things in life), Thinking - yeah - thinking is really important for testers (Pete Comment: it's a pity how many people believe they are thinking when in fact they are not) - and to develop Skills over Knowledge.

With this, Agile practices came into play and acted as a "lubricant."  Lubricant helps things work together when they don't automatically want to work together - they kinda rub against each other - that is why there's motor oil in your car engine.

Story Boards helped people talk - they helped people communicate and understand (to some level) what others were working on.  Before, no one was really sure.  Moving on - Passion became contagious.  In good form, Huib dove in to show the team that it's OK to make mistakes - and he did.  Loads of them.  Each time it was like "OK, it's good, I learned something."  Good move.

These changes led to the "Second Wave" - more agile testing, including shared responsibilities and pairing and... yeah.  Cool stuff.  Then some Exploratory Testing was introduced - by Michael Bolton himself.  The thing was, Huib was a victim of his own success.  Some 80 testers showed up when he expected half that number.  Oops.  Then, a cool tool was introduced: Mind Maps.  They can help visualize plans and relationships in a clear, concise way.  This led to concurrent Workgroups to share work and distribute knowledge and understanding.

Yeah, some tools are needed.  But use them wisely.

What is ahead?  Likely Session Based Test Management - loads of Automation (as they really don't have any) - Coaching (yeah) - Practice (definitely)

What made it work?  Careful steps - passion - adaptability, building community, persistence (can you please stop asking questions? Why?) and - Yeah - the QA word - Question Asking!

What did not work?  Plain straight training (don't do shotgun training and then not follow up).  Project pressure - yeah, you are not doing that weird stuff on this project, it is too important.  You can't do everything at once.  Out-and-out resistance to change.  We did that and it did not work.

Huib's suggestions - RST training - Passion - THINK! - Question EVERYTHING! - Testing as a social science - Explore (boldly!) - continuous learning.
---

OK - Recovered enough from my own presentation to pick up for Matt Heusser's keynote.

PLAY IS IMPORTANT - ok that is not really the title, but hey - that was ummmm - a little hard to sneak in here.

So, we are considering play and ideas and... stuff.  Matt shows a clip from A Beautiful Mind of John Nash describing game theory, whilst sitting in a bar when attractive women come in and... well - apparently beer is a great thought motivator.  Game theory presents that we can do work that will benefit us.  Yeah, that much we get.  Yet, Reciprocity means that we will act in the belief that in helping one person, we will also be helped, by some measure.

Why are we doing this?  Matt pulled up four folks and did an exercise (before mentioning Reciprocity) and moving forward and - yeah - they acted against the stand-alone game theory prediction, in hopes of benefit later - apparently.  And the expected outcome occurred - it's just one of those things - people like good endings and it happened - Reciprocity worked in this case.

Matt is describing software testing as The Great Game of Testing.  Cool observation.

He's got a picture of a kanban board up - a real one - not a make believe one - The danger of course is that sometimes, there is a problem with the way work gets done.  The "rules" are set up so everyone is happy and gets stuff done within the Sprint - except QA becomes the bottleneck and why isn't QA done? Never mind that the stories were delivered the day before.

Instead, if we look at a flow process where there are "work in progress limits" in place - say the QA column has spots for only a few stories, and no new stories can enter dev until the stories in dev get pushed - then if dev can help QA clean their plate, they can push the stories that are waiting...
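
(Pete Comment: here is a minimal sketch of that WIP-limit idea - a toy of my own invention, not anything from Matt's slides.  The column simply refuses new stories once it is full, which is what sends dev over to help QA:)

  class Column:
      """Toy kanban column that refuses new stories once its WIP limit is hit."""
      def __init__(self, name, wip_limit):
          self.name = name
          self.wip_limit = wip_limit
          self.stories = []

      def pull(self, story):
          """Take a story only if we are under the work-in-progress limit."""
          if len(self.stories) >= self.wip_limit:
              return False  # column is full - go help clear it instead
          self.stories.append(story)
          return True

  qa = Column("QA", wip_limit=2)
  print(qa.pull("story-1"))  # True
  print(qa.pull("story-2"))  # True
  print(qa.pull("story-3"))  # False - dev helps QA finish before pulling more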

So, sometimes things can work out.  Elisabeth Hendrickson's Shortcut Game is an example of what happens when you try and short-circuit the activity list.  It demonstrates what happens when we do "extra work" to make this sprint's goals, but may negatively impact the next sprint.  That could be a problem.

The challenge of conferences, of course, is to be able to implement the stuff you pick up.  Sometimes you just need to do stuff.  Consider this - when you go to a conference, write a two-page report with three things that could be done - like, are not physically impossible.  Add a fourth that would need help to get done.  THEN - do the three things and try the fourth.  You never know what might happen.

This ends the last day of the conference.  I need to consider the overall event.  Look for a summary post in the next few days.  Right now, my brain hurts.

Thank you Jose, Madeleine and Uwe!

Thank you Potsdam!

Auf Wiedersehen!

Finished with engines

Wednesday, November 21, 2012

Agile Testing Days, Day 1: Workshops

Monday in Potsdam was a lovely day.  Yeah, a little foggy, maybe a little damp outside, but hey - I was inside where there was good coffee, a variety of juices, waters and the odd snack or two.  A nice luncheon with great conversation following a very enjoyable breakfast with great conversation - Oh, and Matt and I had another opportunity to present Software Testing Reloaded - Our full day workshop.  This time in conjunction with Agile Testing Days.

As usual, we totally messed up the room - this time the staff of the hotel were more amused than horrified.  The folks wandered in after coffee and light snacks and found us playing The Dice Game - Yeah.  That one.

In retrospect, it was a great ice breaker to get people in the room, involved and thinking.  It was a good warmup for what was going to follow.  So, we chatted and conducted the first exercise, had everyone introduce themselves, asked what they were hoping to get from the workshop.

I think Matt and I were equally astounded when a couple of people said they wanted to learn how to test and how to transition from waterfall (well, V-model) to Agile.  We gently suggested that the people who wrote the book were down the hall and perhaps that might be better for them - and reassured everyone that if they were looking for something more, they could either re-evaluate their choice OR hang with us.

So, after a couple of folks took off, and a couple more wandered in, we settled at 11 participants.  It was a lively bunch with a lot going on - great exercises, good interaction.  Kept us on our toes and, I think, we kept them on their toes as well.

Somehow, we managed to have a complete fail in getting to every single topic that people wanted us to talk to or do exercises around.  Ummm - I think our record is perfect then.  You see, there is always more for us to talk on than there is time.  That is frighteningly like, well, life on a software project. 

We often find ourselves with more stuff to deliver in a given period of time than we can hope to deliver.  If we promise to give everyone everything, we really can't deliver anything.  Well, maybe that is a bit of a stretch.  Maybe it is closer to say we will deliver far less than people expect, and less than what we really could deliver if we prioritized our work differently in advance.

So, Matt and I try to work our way through the most commonly occurring themes and address them to the best of our ability.  Sometimes we can get most of the list in; sometimes, well, we get less than "most."

Still, we try and let people know in advance that we will probably not be able to get to every single topic.  We will do everything we can to do justice to each one, but...

This got me thinking.  How do people manage the expectations of others when it comes to work, software projects and other stuff of that ilk?

How well do we let people know what is on the cusp and may not make the iteration?  How do we let people know, honestly, that we cannot get something in this release and will get it in the release after?

I know - the answer depends on our context.

In other news, it is really dark here in Potsdam (it being Wednesday night now.)

To summarize, we met some amazingly smart people who were good thinkers and generally all around great folks to meet.  My brain is melted after 3 days of conference mode - and there is one more day to go. 

I've been live blogging on Tuesday and Wednesday, and intend to do the same tomorrow.  I wonder if that has contributed to my brain melt.  Hmmmmmmmmmmm.

Auf Wiedersehen.

Agile Testing Days: Day 3 - LIVE in Potsdam!

My Wednesday begins in Lean Coffee - yeah there was one yesterday, and I missed it completely (yeah, I overslept.) So, here we are, roughly a dozen people sitting around a table.  Folks suggest a topic they'd like discussed and everyone votes.

This format is not unique, in fact it is used fairly often at various conferences, and, well, meet-ups.

So far, it is far too fascinating to be able to describe adequately, so I'm not even going to try.  It's good stuff.
---
Right, so sitting in the main hall before Jurgen Appelo's keynote this morning.  I have a moment to summarize the Lean Coffee discussion.  So here we go:

Brilliant.

Yeah, that probably is too short, but you get the idea.  Conversation was happening fast and furious around a handful of topics (we got through three, I think - they kept getting extended...)  We talked on such topics as the challenges in training testers, training in soft skills and... hey, wait - two on training?  Yeah.  And they are related.  There are a couple of questions that all of us struggle with - mostly because it is hard.  Deep discussion with a lot of good ideas.

And now it is time for Jurgen's keynote.

---

Let's Help Melly: Changing Work Into Life

Right, starts out with some interesting stats.  A recent study concluded that over half of American workers hate their jobs.  It seems that this is a global problem, as over half the people in the world hate Americans as well.  Wait.  No.  He was kidding.  (I think.)  The problem is that most workers in the world hate their jobs.  They seem to think this is a new phenomenon.  Jurgen purports that this problem began some time back - like, roughly, oh, 3000 BC (he may have said BCE, I don't know - there is a guy with a veritable planetoid for a head sitting in front of me and I can't see the screen so well).

He is going through some common ideas and tossing them aside.  Gotta love it when people do that.

Let's see - BPR, TQM, Six Sigma and the like, are extensions of the good old days of, oh, the Pharaohs.  There are loads of studies that show they don't really work, but that does not seem to change the attempts to "make things happen."

Partly because organizations tend to be organic - they are forms of communities that are ever-changing.  Until people look into reinventing SOLUTIONS (not processes), things will never really get better.  This is essentially what the Agile "Community" is supposed to do (per Jurgen).

He's moved into setting aside SCRUM and Kanban - and is discussing Beyond Budgeting, Lean Startup.  OK - I'm liking the logical progression I'm seeing... This leads to this which conceptually leads to this which... yeah - and now Design Thinking.

The thing is, these are all attempts to correct problems where things are not working.  How do we make them work?  As a label, the stuff really is not that important.  The ideas are important.

How do we make things succeed?  Jurgen suggests we run safe-to-fail experiments, we steal (collect?) and then tweak ideas, collaborate on models and shorten feedback cycles.

The PROCESS is what we see - scrum/kanban boards, discussions, people working together and collaborating.  The stuff in binders, sitting in the closet, are MODELS.  All the stuff that describes the formal book approach does not promise success!

Jurgen is now citing Drucker, and is suggesting that just as "everyone should be involved in testing" then "everyone should be involved in management" (which by the way is similar to the word in Dutch, German and other languages meaning a place to keep horses.)

These ideas are core to his work on Management 3.0 (which is pretty cool by the way.)

For example, he cites a company with "Kudo Box" where people can drop a note to recognize someone's work, effort, help... something.  This can help lead to recognition - and acknowledgement of individuals (yeah - groups are made of individuals - so are teams and companies and... everything.)

Yeah, Jurgen is throwing out good ideas too fast for me to type - I really can't do him justice.

People are creative when they are playing - that is where they learn and get ideas - and wow - a highly modified football table (foosball, for my former colleagues at ISD).  Lasers to detect scores, electronic scorekeeping, card readers to identify the players (which allows for cool score metrics), to video cameras that record goals and play back the BEST goals in slow motion.  Yeah - there's a company in Norway that does that.  Cool.  They also get really bright engineers.

Short feedback cycles are crucial - not just from customers but also from staff.  Jurgen now throws out a slide with a "Happiness Board" that is updated DAILY - so folks (like managers) can have a sense of the way folks are feeling RIGHT THEN - THAT DAY.

As opposed to "Yes, we are interested in how our people feel,  so we will send out surveys by inter office mail every 3 months."  Riiiiiiiiiiiiiiiiiiiiiiiiight.

So, in short - we CAN address problems, and make life and work-life better.  But we need to try, not just whinge about it.
---
In Dawn Haynes' presentation on Agile and Documentation... with UNICORNS.

Dawn is opening with a brief story of her professional development progression. "I was a massive test team of ONE... surrounded by lots of hardware.  It felt good."  Many folks I know have been in that situation.  Some do really well.  Some folks have a harder time.

It is frightening when Dawn's stories are soooooooooooooooo similar to mine.  Oh yeah, this sounds familiar: "When will this be done?" "Done?  NEVER!!!!!!!!!"

Well, ok, maybe that is a little harsh.  It just struck me as an important point. Most folks need to learn to figure out how much has been done, and generally what still remains, according to some model.  Dawn then applied some thinking (way cool great step!)  and realized that you need to be able to remember more than one or two things at a time.

Her memory sounds a bit like mine - you can handle two or three things in the "memory stack" and anything more than that scrolls off.  So, she began with a simple spreadsheet that contained information on stuff to be tested - in a prioritized order.  (Pete Comment: Yeah, this is getting crazy spooky because she's describing some of the stuff I did early on as a tester.)

Her documentation allowed her to remember what she had done and what she still needed to do and sometimes dream up NEW things to try.  Now, this sounds good, except she managed to "melt" Unix devices.  Yeah, she rocks.

This is what documentation can do - help us remember and explain it to other people.

Now, some Agile pundits advocate NO documentation - "There is no need for documentation" and things of that ilk.  Dawn explains, quite clearly and simply, that real-world people (not unicorn-land people) sometimes need to be able to demonstrate what they did.  Hence, documenting work that needs to be done.

Mind Maps, simple spreadsheets, and other tools can record what is done - what the thought process behind it was - and the results.  Hey folks!  There is a lightweight test plan/test result combination.  Keep an eye on what REALLY is needed.  You know, to communicate and ADD VALUE - that is kind of a big part of what Agile is supposed to be all about, no?

OK - I really liked this explanation.

And now... the questions/answers are getting interesting... Can these techniques be used for more than one person?  Yes.  Yeah.  They can.

Q: How does this work for inexperienced, maybe unskilled testers that don't know the application?
A: Non-skilled, untrained testers? I don't do that!

yup.
---
So I scooted over one room to pick up Simon Morley's (yeah, @YorkyAbroad) take on CI and multiple teams and people.  And he's underway NOW.

You get complex systems that are difficult to test which gives us slow feedback, because there is so much and it takes a lot of time and ... yeah. You get the idea.

When stuff gets really hard to deal with, you invite other problems.  Simon is pulling in the idea of "Broken Windows" - which was a popular measure in crime prevention circles in the late 1980s and well into the 1990s.  Simply put, if a window is broken and not fixed, or at least not patched, the message is that no one cares about it.  This leads to more vandalism, property damage, etc.  If this is not addressed, then more violent crime will creep into the area and things quickly spiral downward.  So, fix the windows, even in vacant houses, paint over the graffiti, etc., quickly.  Deal with these things when they crop up.

In Software, the equivalent is "It was broken when we got it" or "That was broken 2 (or 3 or 10) releases ago."  If these things do not get fixed, the message is "No one cares."  If no one cares about something, then getting it fixed later, and getting similar things addressed WHEN they happen, will likely get kicked down the road a bit, until it is 10 releases later and obviously no one cared, or it would have been fixed when it cropped up (right?).  (Pete Comment: Some folks seemed a little confused over what the Broken Windows thing had to do with software quality in general or testing in particular.  Ummm, think it through, please?)

So, when you have many people/teams working on interconnected products - or maybe multiple aspects of the same product, sometimes, folks need a tool.  Simon developed a series of spreadsheet-like tools to track progress and effort and results of tests.  These summary pages allowed a user to drill down into the individual reports that then gave you more information... yeah.  Pretty cool.

Then - by doing this, everyone in the organization could see the state of testing and (important point here) the Delivery Readiness of the product.  Thus, integrating information - and changing the overall message can help direct thinking and people acting on cues.  Sort of like saying "We are not doing Continuous Integration because we are exercising Delivery Readiness artifacts."  Yeah, it might be considered semantics, but I like it.  It brings to mind that the real key is to get people thinking in advance "When will we be ready?"

That is an idea I find lacking in many organizations.  It is not that we don't ask that question, but we ask that question well after we should.  We ask it too late (in many organizations) to address the stuff that REALLY needs to be considered before any code is written.

Why?  Just as a tool (like CI) is not THE solution (although it can help) the "Complex System" is more than just software - it is the PEOPLE that interact and impact the development of the software that makes a difference.  It is the people who are the solution, not the tools.

(Pete Comment:  That last point is something that can bear some thought, and that is probably worthy of more consideration.  I find this particularly true at "technical" conferences.)

---
LUNCH TIME!!!!
---
This afternoon, Markus Gartner kicks off his keynote with Unicorns & Rainbows... then shifts to Matrix memes.  Cool - of course his theme is Adaptation and Improvisation.

Right - All good rules should not be followed on faith.  He's 3 minutes into it and people are going "Oh wow."

Using a common theme on mastery, Markus presents the concept of 10,000 hours of deliberate practice to achieve mastery.  Reasonable, no?  Add to that the self-discovery needed to work through ideas that enable you to write meaningfully, engage in various exercises and... well, think.

Citing ideas from Weinberg's Becoming a Technical Leader (Pete Comment: excellent book) on how to be successful:
1. Learn from Failure;
2. Learn by Copying other people's successful approaches;
3. Combine ideas in new ways.

Now citing Computer Programming Fundamentals - which Jerry co-authored in 1961:
"If we have mastered a few of these fundamentals along with the habit of curious exploration, we can rediscover special techniques as we need them." 

(Pete Comment:  OK, re-read that statement then check the date - yeah - 51 years old. Yeah.  I'm not sure we have reached that level yet.)

Markus slides into questions on Management and the difference between Managing and Controlling (Pete Comment: and Dictating).  The problem of course is that some folks have a hard time dealing with this because - well - with Controlling practices we may not actually allow people to learn and master and grow; these controls are self-limiting.  The negative feedback loop that ensues will doom growth - personal, professional & organizational.  (See Jurgen's keynote.)

Clarity, Trust, Power and the ability to Search/look are all needed to resolve problems and allow the team to grow.

Consider that delegation relies on intrinsic motivation, pride of workmanship (Pete Comment: Yeah, you can be a craftsman, really) and desire to please customers.

Allowing for accountability is the key to allowing teams and organizations to grow.  Yeah.  That is pretty significant.  The approaches to testing reflect the culture of the organization and the world-view of the management.

You MUST have trust among and between the team members.  Without this, there is no improvement.  Fear of Conflict is the next concern.  Conflict CAN be good - it helps us understand and resolve questions around our understanding.  Lack of Commitment - not the Scrum thing, but the question around what we do and commitment (professionalism?) to getting the best work done we can.  Related to that is Avoidance of Accountability - yeah - the "It's not my fault I did this wrong."  This is - well - silly.  Finally, the inattention to Results - the question of "What did you DO?"

In an Agile environment, this kills chances of success.  If your team has these things (any of them) going pear-shaped, you get dysfunction.

Consider the idea of Testers need to be able to Code/Program.  If we can grow a mixed-skills body, where people have similar skills but in different combinations, we can move from Testers & Programmers to Programming Testers and Testing Programmers.

The difference between them can be negligible.  When we are open to it.  Communication and mutual support are needed.

In the end, we each must decide whether we will take the Red Pill, and become fully Agile (or agile) or the Blue Pill and stay as we are.

---
After taking a break, chatting with people and getting a bit of writing done, BACK ONLINE - in Henrik Andersson's presentation on Excelling as an Agile Tester.  Yeah.  Sounds like fun AND WE'RE OFF!!!!!

Henrik is a good guy - Member of AST, has attended/presented at CAST, participated in Problem Solving Leadership and AYE - and presented at a pile of conferences.  Smart, outgoing and a good thinker.

In discussing Agile Testing, Henrik tells us that the acronyms tend to carry a fair amount of baggage with them.  The TDD, BDD, ATDD, and other acronyms tend to fall into the realm of Checking - in the model described by Michael Bolton.  In his estimation, the techniques embodied by those terms are actually design principles to facilitate clean, clear, simple design - and inspire confidence.  Frankly, it helps programmers know what to do and when to stop coding.

Why take those away from programmers?  Why add an extra feedback loop to get a tester in the middle?

Henrik's key point :  DON'T TAKE THE VALUE AWAY FROM THE PROGRAMMER.

But don't worry, there is still a need for testers in an Agile environment (Pete Comment: or any environment for that matter.)

Citing the works of Weinberg - A Tester is one who knows that things can be different.  Citing the works of Bach and Bolton - Testing helps us answer the question "Is there a problem here?"

A tester should be driven by questions that haven't been answered, or asked, before.

(Pete Comment - OK: Henrik is currently describing my views really closely; I am hoping there is not some confirmation bias on my part.)

So, what can an Agile Tester do?  How do you become EXCELLENT?

Pair: with a product owner on the design of Acceptance Tests; with the PO when doing ET sessions; with the PO to understand the customer; with a programmer on checking; with a programmer to understand the program.

(Pete Comment: Yeah, I find nothing to fault in this logic - see my above comment.)

And out comes James Bach's comment - "I'm here to make you look good."  And Henrik's corollary: "I'm here to make us look good."

It is a subtle distinction, but an important one.  A tester in an Agile environment can cross multiple conceptual walls (Pete's description) across the entire team.  By working with the Product Owner as well as the Programmer(s) and the rest of the team to gain understanding, they can also help spread understanding and help manage expectations and guide the team toward success.

Sometimes gently, sometimes, well, kinda like Mr T on the A-Team television show.  (Pete Comment: Is that still running somewhere?)

When testers are participating in Sprint planning - they also need to remember to engage in Daily Test planning.  Henrik recommends planning these by Sessions (described by Bach) - 90-minute blocks of uninterrupted, guided/dedicated testing.  With Charters to guide each session - guidelines for what functions/portions are intended to be exercised in each session - these charters can be placed on the Scrum board (or Kanban or... whatever) as you plan them.  Then also - put your BUGS on the same board.

Don't let them be HIDDEN - particularly in a bug system with limited visibility.  Look into recording them accurately - then get them out in front of the world.  Other people may notice something or may possibly be able to provide insight into the results.  Most important, it lets everyone know what you are finding.

Taking the results/findings of the sessions back to the group, helps us understand the state of the product, the state of the project.  This can be done at the daily Scrum (in addition to the notes on the board.)  Grouping information needs to make sense.  The point is to give a nutshell - not the whole tree.  Make things clear without bogging things down in detail.

Understand how the User Stories relate to which Functional Areas and the corresponding sessions for each, along with the "Health" of the system.

Generally, Henrik finds giving information on the Health of the system of greater value than the typical "this is what I did, this is what I plan to do, these are the problems..."  This report may include session burndown charts and the TBS (Test, Bug Investigation and Setup) metric - along with the time spent in learning, performance testing, business-facing work and integration.  These pieces of information give a reasonable image of the state of the testing right then.  Getting to know the rhythm (Pete: there's that word) of the application and the project is crucial to understanding the application and the speed at which information is being obtained.
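
(Pete Comment: to make the TBS idea concrete, here is a wee sketch - my own illustration with made-up charters and numbers, not Henrik's actual tooling - that turns logged session minutes into the Test / Bug Investigation / Setup split:)

  # Hypothetical session notes: minutes spent per activity in 90-minute sessions.
  sessions = [
      {"charter": "Explore payment flow", "test": 60, "bug": 20, "setup": 10},
      {"charter": "Stress login service", "test": 45, "bug": 30, "setup": 15},
  ]

  totals = {"test": 0, "bug": 0, "setup": 0}
  for s in sessions:
      for key in totals:
          totals[key] += s[key]

  grand_total = sum(totals.values())
  for key, minutes in totals.items():
      print(f"{key:>5}: {minutes:3d} min ({100 * minutes / grand_total:.0f}%)")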

Whew.
----
Closing Keynote for the day by Gojko Adzic - Reinventing Software Quality.

Opening Gambit - We are collecting all the wrong data, reporting it in all the wrong ways and wasting time with both, when there is much more reasonable, and better, data to collect.  (Pete Comment: That big CLANG was the armoured gauntlet being thrown down.)

Using his Specification by Example book as an example, Gojko launched into a story of how he found 27 problems that he considered P1s - as in, parts of words could not be read or stuff was mangled or... yeah.  Problems.  SO... He spent MONTHS going through the book looking for every possible problem and recorded them.  Then, he went back to the publisher with his "bugs."

"Don't worry about it."

WHAT?  Look at all these bugs!  - And they showed him the reviews posted online, etc.  All but one gave it full marks and rave reviews.  The one negative review had nothing to do with the production quality of the book itself.

Why?  Because he was looking at this in a linear way - looking at bugs in PROCESS - not performance in the marketplace.  Not customer satisfaction - not whether they liked the product or it did what they needed it to.  In his estimation, ET, SBT and the TDD, ATDD stuff are all detecting bugs in the process - not bugs in market viability.

Now - one MAY impact the other - but do we have enough information to make that decision? Maybe - Maybe not.

His comparison then moved to driving a car.  The lane markers and seat belts help us move the car safely and make sure that we are not injured if something goes wrong.  They don't provide any information on whether we are going the right way.  Other things may - road signs, for example - and sometimes those are missing or obscured.  GPS systems can also help - and generally show your distance from your destination - IF you know where you want to go in the first place.

Another Example - Blood pressure is generally an indicator of human health.  Generally, reducing blood pressure, if the blood pressure is too high, may be a good idea.  The way you reduce it may or may not help.  For example - a little red wine may help.  Maybe some medication may help.  Chopping off the head will certainly lower blood pressure, but may negatively impact overall health.  (Pete Comment: I wish I had dreamed that up.)

We talk about using User Stories and ... stuff to promise "good software."  Yeah, well, some user stories are good, some are... rubbish (Pete edited that word.)  A good User Story talks about what is going to be different. And how it will be different. Measure the change!

And moving on to the self-actualization table - as he applies it to Software.

Deployable and Functional - Yeah - the stuff does not break.  Good.  Does it do what we want it to do?  That is another question.  Is it a little over-engineered?  Being capable of scaling to 1,000,000,000 concurrent users may take more work than really needed.

Performant and Secure - Yeah.  Does it work so it does not spring leaks everywhere?  OK.  Consider Twitter - ummm, Gojko is pretty sure they do some performance testing - how much, who knows.  Still, the fail-whale shows up pretty frequently.

Usable - OK, so it is usable and does not blow up.

Useful - Does it actually do anything anyone wants?  If not - WHY DID YOU DO IT???

Contribute to Success - The Hoover-Airline example is one to consider.  Some years ago, Hoover Vacuum ran a promotion where if you bought certain models of their vacuum, they'd give you airline tickets.  Except they made it pretty open-ended - like - intercontinental.  And they were giving away airline tickets worth far more than the vacuums they were selling.  On top of that, they were getting scads of complaints - and between trying to buy airline tickets and dealing with problems, the UK subsidiary went broke - and they were in court for years.

Another problem - the "success rates" for training.  Typically we simply do it wrong and look for the wrong things to give us the information.  He cites Brikerhoff & Gill's  The Learning Alliance: Systems Thinking in Human Resource Development.  Generally, we fail to look at where the impacts are felt to the company and what the actual results, of training and software, actually are.  We are deluding ourselves if we are tracking stuff without knowing what the data you need is. 

So, Gojko provides a new Acronym for this consideration:  UNICORN:  Understanding Needed Impacts Captures Objectives Relatively Nicely.

The Q/A is interesting and way too fast moving to blog - check out twitter.... ;)

whew - done with engines for the day.

Thanks folks.



Tuesday, November 20, 2012

Agile Testing Days: Day 2 - LIVE in Potsdam!

Right.  Here we are in Potsdam, Germany for Agile Testing Days, after a day of workshops.

Here we go!

Scott Ambler is opening his keynote throwing down the gauntlet, planting his flag and a couple other euphemisms.  Starts out with the challenge that much of the rhetoric around the Agile community is incorrect.  How bad is "Waterfall"?  Try finding a way to do a financial transaction that does not encounter, at some point in its life, a software system created by a waterfall methodology.

Next point - interesting -Don't worry about Process, worry about getting work done.  Process stuff is BORING!  Worse, most of the folks who were early into Process were speaking from theory and not from practice.  Agreed, Scott.   

Instead of starting with something big and moving to the middle, or starting with something little and moving to the middle - skip the rubbish and start in the middle (duh.)  Yeah, I agree with that as well.  Data Technical Debt is the killer in the world today, introduced by people who are not enterprise-aware.  Frankly, Scott is hitting a lot of points - many of which I agree with, and interestingly enough, Matt and I touched on in our workshop yesterday.  He touched on being at least Context Aware and recognizing that some practices don't work in every situation.  He also is challenging the notion that some of the "rules" of Agile are really not as agile as folks would make them.  OK - he has me now.

One point, if you look at how Scrum actually works, it is linear, right?  You Plan, then Do, then... wait. That sounds like Waterfall.  Hold on, how can that be?  That sounds like not agile!  Advocating improving activities, software design and development is core to delivering in "the real world."  (Real world count is 4, maybe 5 at this point?)

His DAD process lays claim to getting the good stuff from SCRUM and RUP and making sure people are able to deliver good software.  Others in the room are noting that there is a demand in the presentation on needing "real data" without presenting any "real data" - OK - I get that criticism.

Can Agile Scale?  Can a team of 50 make the same decisions as a team of 5?  (I can hear rumblings about how 50 is "too many" to be Agile.  Yeah, I get that too.)  Except large companies often act that way in practice, instead of what they should do or say they do.

In practice, organizational culture will always get in the way if given an opportunity.  Groups of 5 will have different dynamics than a team of 50.  OK.  He then talked about being "co-located" which most (maybe all?) forms of methodologies call "preferred."  The question is are they still co-located if the group is working in cubicles?  Are they really co-located if they are in the same building but not in the same area?  What about when you have people from different departments and or divisions are involved or ... yeah you get the idea.

OK - He just scored major points with me - "I don't trust any survey that does not share its source data."  He goes on to explain that the data from his surveys, supporting his assertions, is available for review and free to download if you contact him.

Scott moves on to talking about scaling Agile and Whole Team and - yeah.  He observes that sometimes there are regulatory compliance situations where independent test teams are needed, and urges you to read and understand the regulations yourself - and NOT take the word of a consultant whose income is based on... interpreting regulations for you and then testing the stuff you need to do.

Scott moved on to questions of parallel testing for integration testing, then leaped back to his Agile Testing survey results.  He has an interesting slide on the adoption of "Agile Practices" like Code Analysis, TDD, etc.  One thing I found interesting was the observation on the number of groups that kind of push testing off - like maybe to a different group?  Is it hard?  (Well, yeah, maybe.)

Another infographic showed the number of developers not doing their own testing.  (Wait, isn't the point of "whole team" REALLY that we are all developers?)  I am not certain that some of the results Scott is presenting as "worrying" are really that bad.  Matt Heusser reports that his last client did Pairing & Reviews - so are informal reviews being done by "only" 68% a problem?  Don't know.  In my experience, Pairing is an organic review environment.

Overall, I quite enjoyed the keynote.  NOTE: You don't need to agree to learn something!  :)
 ----
Moving on to the first track session after a lovely round of hallway conversations.  Peter Varhol presenting a talk named "Moneyball and the Science of Building Great Agile Teams."

The opening is much what one might expect from a (fellow) American: explaining the essentials of Moneyball (anyone remember how bad Oakland was a few years ago?) and the impact of inappropriate measures.  Yeah, (American) baseball has relationships to Agile - people need to respond to a variety of changes and, frankly, many people do a poor job of it.  Both in software and in sports.  Essentially - to win games (or have successful implementations) you need to get on base, any way you can.

Thinking Fast and Slow (Kahneman) being referenced - yeah - good book.  

Peter is giving a reasonable summation of biases, impressions and how thinking and actions can be influenced by various stimuli.  For example "This is the best software we have ever produced" vs. "We are not sure how good this software is."  Both may be true statements about the same release!

One may cause people to only brush over testing.  One may lead you to aggressively test.  Both statements may impact your approach depending on the nature of your thinking model.

He's moved on to being careful with over-reliance on heuristics - which are, by their nature, fallible.  They are useful, and you need to be aware of when to abandon one given the situation you are in.

He also warns against Kahneman's "Anchoring Effect," where one thing influences your perception of another, even when they are totally unrelated.  In this case: roll dice to come up with a number, then answer the question "How many countries are on the continent of Africa?"  The study he cites showed that when people had no idea and were guessing, the higher the number rolled with the dice, the higher their guess at the number of countries.

OK - Really solid point: We want things to be better.  The result is that many people will tend to wish away problems that were in earlier releases.  Folks tend to HOPE that problems are gone, without confirming it.

--

Leaving Peter's presentation I was stopped by, well, Peter.  We had a nice chat on his presentation.  The ending was a bit rushed, and some of the "pulling together" ideas were a challenge for parts of the audience, but it was solid.

I then ran into Janet Gregory - very nice lady - who was kind enough to autograph my copy of Agile Testing.  THEN!  I ran into yet another great mind - Markus Gaertner - who was good enough to sign my copy of How to Reduce the Cost of Software Testing, to which he contributed a chapter.

Then I realized I was late for the next session, where I am now - Cecile Davis' "Agile Manifesto Dungeons: Let's go really deep this time!"  As I was late, I missed the introduction; one exercise was just wrapping up.  Bummer.

However, she was using children's blocks as an example of people working together, testing early, collaborating.  Cool idea.  I am going to need to remember this.

I like how this conversation is resolving.  I wish I had caught the beginning.

The Principles behind the Agile Manifesto boil down to fairly simple concepts that can still be a challenge to understand.  We need to communicate clearly so everyone understands what the goal is.  We need to have mutual understanding of what "frequent" means - how often do we meet/discuss?  How often is code delivered?

What do we mean by simplicity?  If we reduce documentation, or eliminate formal documentation, how do we ensure we all understand what we have agreed to?  These are thoughts we must consider for our own organization - no single solution will fit everyone, or every group.

"When individuals feel responsible, they will communicate."

Yeah, that is kind of important.  Possible problem - when people feel responsible for the team, instead of for themselves, they turn into Managers, who are likely no longer (directly) doing what they really like doing - or they burn out and fade away.

In the end, it is about the team.

To get people functioning as a team, one must help them feel responsible as a team - not as a collection of individuals.  Then, they can communicate.

--
LUNCH TIME!

--

After a lovely lunch and wonderful conversations, we are BACK!

Next up is Lisa Crispin and Janet Gregory - yeah, the authors of Agile Testing.  The book on... yeah.  Cool.  Their topic: Debunking Agile Myths.  And they start with a slide of a WEREWOLF!  Then they move to a slide of MEDUSA - and they put on "medusa headbands."

Let's see....

Testing is Dead - Maybe in some contexts.  When Whittaker said that, it addressed some contexts, not all.  The zombie testers (thanks, Michael Kelly), the unthinking drones, need to be buried once and for all.  The others?  Ummm, not so much.

ATDD/SBE tests only confirm behavior - yeah, unless you are looking to distinguish between Checks & Tests (as defined by Michael Bolton).  Knowing that difference is crucial.

Testers must be able to program (write production code). - And their UNICORN (#1) appears!  Do all testers need to write code?  Well, maybe at some companies.  Except in some circumstances, what is really needed is an understanding of code - even if what is needed is not really programming.  Maybe it is broad sets of skills.  Maybe the real need is understanding multiple crafts - testing and... other things.  One cannot be an expert in all things.  No matter how much you want to be.

T-Shaped people - the whole breadth/depth balance - is crucial.  Maybe "technical awareness" is a better description?

Agile teams are dazzled by tools - OOOOOoooohhh!!  Look!  Bright!  Shiny!  New!  Wow!  We need to have MORE TOOLS - or do we?  What is it that fascinates people with tools?  The best ones help us explore wider and deeper.  We can look into things we might otherwise not be able to look into.

As long as they can help foster communication, in a reasonable way (I keep using those two words), tools rock.  Just don't abuse them!

Agile teams always deliver software faster. - With a DRAGON!  Looks like a blue dragon, to be precise... yeah, the buzzwords are killer... sprint, iteration, stuff people really don't understand.  Let's be real though.  Sometimes - OK, pretty much always - when you change something, like a team's behavior, the team is likely going to slow down.  It takes a while to learn how to do these new things.

Alas, sometimes it takes longer to UNLEARN old behavior (like all of them) than it does to LEARN new behavior.

Allowing people to learn by experimentation is important, and can help them learn to perform well - you know in a flexible, responsive way.

The result of doing agile well is better quality software.  Speed is a byproduct!

---

Another break and I wandered into the Testing Lab where I found people diligently testing a Robot, which reacts to colo(u)rs - with the goal being to determine the rules behind how the robot responded.  There was a version of Michael Bolton's coin game going, and a couple of interesting apps being tested - one by none other than Matt Heusser!

Wandering back to the main reception area, I stumbled onto a handful of folks who were excitedly discussing, well, testing.  While trying to get caught up with email and my twitter feed, I realized that Lisa Crispin was in the "Consensus Talks" (think Lightning Talks).

Stephan Kamper gave a nice summary of how he uses Pry to get good work done.  This is a Ruby tool that simply seems to do everything he needs to do.  "Extract till you drop" is probably the tag line of the day (other than the Unicorn meme running thru things).  Pretty cool.

Uwe Tewes from Genalto gave a summary of how his organization engages in Regression, Integration and other tests, including UI testing.  It was pretty cool stuff.

---
And after a break - Sigge Birgisson's presentation, the last track session of the day, on Developer Exploratory Testing: Raising the Bar.  He starts out well, with the "Developers Can't Test" myth, into which he manages to work the now seemingly mandatory Unicorn image.  Yeah, Unicorns started showing up after the opening keynote.

His gist is that developers want to be proud of their work - to show that they are producing stuff that is not only good, but great.  The problem is, most developers have little or no training in testing methods, let alone Exploratory Testing.

The solution he used was to introduce paired work, with training on ET and Session Based Testing.  The people working together build understanding of what the others are good at, helping garner respect.  It was a huge gain for team cohesiveness ("togetherness" is the word Sigge used), along with encouraging developers to build a more holistic view of the product.  Good stuff, man.

Using a fairly straightforward Session (Time Box) Method, testers pairing with developers are able to do some really solid work.  It also helped developers understand the situation and be able to do good work.  Frankly, it is hard for people to keep their skills sharp if they are not engaged in an activity fairly frequently.  For Sigge, this meant there could be significant breaks between when developers were actually engaged in testing - they do testing work for a while, and then a sprint or two later they are diving into testing again.

So, with some simple mind maps (a Smoke Testing Description, for example) and a little guidance, they are able to be up and running quickly after each break.  Cool.

He's reminding us that we need to keep focused on the needs/concerns/value of the project.  How we do that will need to vary by the Context.

And in talking about Stakeholder Involvement, he flashes up a picture of a roller-coaster, talking about keeping people involved, in the loop, and taking them for a ride.  (Groan.)  But really, it's a pretty good idea.

He describes the involvement of stakeholders, their participation in "workshops" (and not calling them "test sessions"), and focusing on paths that are reasonably clean, then branching out from there.  Yeah, there may be a bit of a possibility of confirmation bias, BUT - this allows them to start from a safe place and move forward.

With testers working with developers and stakeholder/customers, there is less thrash, and the ability to communicate directly, and manage expectations.  Again - a cool thought (I wonder how many more cool thoughts he'll have before the end of this session.)

Yeah, whining stakeholders - the ones that latch onto one thing that is not quite right - can slow things down.  Sometimes keeping people on track is a great challenge.  (Pete's comment: That is not unlike any other team in my experience.)

So, Sigge comes away from this convinced that Developers really CAN test.  Business Stakeholders/customers can test very well.  He reports no experience with Unicorns testing his applications.  (Pete Comment:  To me this is an incomplete consideration.  I would expect this covered in future releases/versions now that Sigge has seen the significance of the Unicorn Factor.)

--

Closing keynote of the day is just starting with Lasse Koskela speaking on "Self Coaching."  Yeah.  Interesting idea. When a speaker states "This doesn't really exist.  If you Google it, you get one book that has nothing to do with what I am talking about."  OK.  I'm interested.

And he got in the obligatory "Finland is in Europe" joke.

After admitting he is a certified Scrum Master, he claimed to be a nice guy.  (Pete Comment: OK, we'll give him the benefit of the doubt.)

Lasse begins with a series of skills needed for self coaching.

Understand the Brain - He talks about basic brain function - like, if the brain detects a large carnivorous animal, a reasonable response might be the brain sending a message that says "RUN!"  There may also be a message that says "Be afraid."  At some point after running and being afraid, another portion kicks in with a cognitive response - maybe telling us to look and see whether the carnivorous animal is still chasing us, and whether we should still be afraid.

After verifying the animal is no longer chasing us, a retrospective might inform/influence our threat/reward models.  This in turn can be informed by several considerations:
Status (is it current RIGHT NOW)
Certainty (is it plausible that this is not as definite or maybe more definite than we think?)
Autonomy (are we in control?)
Relatedness (what is our relationship to this situation - have we been there before?  What about other people we know?)
Fairness (pretty much what you might think)

The issue is that perceived threats tend to outweigh rewards in this - so it takes many good experiences to outweigh a single bad one.  This may be a challenge.

Reduce the Noise

In looking to overcome obstacles, we need to reduce - if not eliminate - the noise emanating from our own head.

Encourage good behavior - the visualization thing - and frame it in terms of what you want to do, not what you are afraid of/do not want to have happen.  Funny thing - in golf, when folks tee up a shot, if that one goes bad and they need to tee up another, the result is rarely the same mistake.  It actually tends to be the opposite.

Part of the problem is that once a bad thing happens, we tend to focus on it - a combination of "I can't believe I did that" (damaged self-ego) and "DON'T DO THAT!" (undermining intent by focusing on what you do not want).

Ernie Els, the golfer, is a good example of how to sort this out.  He went from having a terrible year to rebounding back and leaving every other golfer in the dust.

Stopping the Brain - 

Cognitive Dissonance

That icky feeling when two pieces of information conflict.  For example "You did it all wrong!" when you believe that you completed the task perfectly. 

When our (Ladder of Inference) Reality & Facts / Selected Reality / Interpreted Reality / Assumptions / Conclusions / Beliefs & Actions are not based in fact, and are essentially false in nature, we are setting ourselves up for failure.  Getting out of this conflicting "box" reality is a huge problem - and is a significant portion of the problem set.

Changing that requires us to be aware of what is going on - when we have something to do that conflicts with what we really want to do, we are faced with a choice between one sense of "right" and what may be a negative reaction.

He is describing a "Frame" model similar to Michael Bolton's Frame model.

So - to summarize - Pause frequently and evaluate your own reasons; Check your own thinking; Obey the sense you have of right and wrong.

AND THAT WRAPS UP DAY 2!!!!

Sunday, November 11, 2012

What Makes Software Teams that Work, Work?

In pulling together some notes and reviewing some papers, I was struck by a seemingly simple question, and as I consider it, I pose it here.

Some software development teams are brilliantly successful.  Some teams are spectacular failures.  Most are somewhere in between.

Leaving aside the question of what constitutes a success or failure, I wonder what it is that leads to which outcome.

Some teams have strong process models in place.  They have rigorous rules guiding every step to be taken from the initial question of "What would it take for X?" through delivery of the software product.  These teams have strong control models and specific metrics in place that could be used to demonstrate the precise progress of the development effort.

Other teams have no such models.  They may have other models, perhaps "general guidelines" might be a better phrase.  Rather than hard-line metrics and measurement criteria, they have more general ideas.

Some teams schedule regular meetings, weekly, a few days a week or sometimes daily.  Some teams take copious notes to be distributed and reviewed.  Some teams have a shared model in place to track progress and others keep no records at all.

Some of each of these teams are successful - they deliver products on time that their customers want and use, happily.

Some of each of these teams are less successful.  They have products with problems that are delivered late and are not used, or used grudgingly because they have no option.

Do the models in use make a difference or is it something else?

Why do some teams deliver products on time and others do not?

I suspect that the answer does not lie in the pat, set-piece answers but somewhere else. 

I must think on this.

Wednesday, November 7, 2012

Weird in Portland, PNSQC Part II

For reference, I had never been to Portland before my trip out to PNSQC this past October.  I searched for hotels and eateries and transportation information and (being who I am) Irish Traditional Music Sessions and Pipe Bands.  I had been on the MAX - the light rail system - for 10 minutes when I realized I had failed to search for one key thing....

Weird.

Really.  Any city that has a website, bumper stickers, billboards and ... stuff dedicated to Keep X Weird gets huge bonus points in my book.  Massive points.

I met a whole passel of dedicated, creative, passionate, exuberant people who were excited to be in a place talking about software and quality and engineering and the like.  I've been to conferences before (yeah, the blog has a fair number of references to them) and have seen energy like this and have fed off it to get through a long week.  This was different.

Let's see, how was it different?  Well, I hung out a lot with a bunch of people I had met but did not really know.  I also met people I had never met in person before - but had read their writings, blogs, articles and the like.

This was folks like Michael Larsen, crazy smart and an all-around nice guy with way more energy than I can muster.  Ben Simo, yeah, Quality Frog - another crazy smart guy who has lots of cool ideas and thoughts and is also way nice.  The three of us went to lunch the Sunday before the conference started.  It was one of the most amazing conversations I can remember having in some time.

We covered the range of good quality Mexican food (we were eating at a place with fairly few non-Hispanics eating) to Tex-Mex to South-Western to - Oh yeah.  Software testing, software development, cool tech stuff to bands to music to ... what WAS that stuff Michael was drinking?  (a lightly carbonated fruit juice bevvy - pretty good actually.)

We experimented with artisanal chocolates (it's Portland, EVERYTHING is made by an artist) on the walk back to our hotel, while discussing the amazing variety of food wagons that were parked (some looking like they were more permanent than others).

Included in this was a discussion on the huge variety of opportunities for exquisite food and remarkably enjoyable people watching and meeting.  I know.  Weird, right?

The Conference

Instead of a blow-by-blow description of events and activities, I'd suggest checking out Michael Larsen's way-cool live blog posts.  The weird thing was that it seemed like every time I looked around in the session I was in - there he was typing away and way into the topic.

Michael - Dude - you are so inspirational it is crazy.  Here's what Michael had to say...

Day 1 
Day 2
Day 3

OK, so my personal remembrances of the conference - I had a nice coffee and walked to the conference site from my hotel - not the conference hotel, but nice and not too far.  I found myself focusing on nothing at all and simply drinking in how walkable the city is and how good the coffee was and ... why was I 4 blocks beyond the conference center?  Really.  Cool city with lots of things to see and small, comfortable parks.  Nice.

My Day 1.

So I scurried back to where I was supposed to be, grabbed another coffee, registered at the front desk, and promptly met Michael Dedolph and then Terri Moore and then - a bunch of very friendly people.

It was pretty exciting - the auditorium was packed for the opening keynote by friend, colleague and sometime partner-in-crime Matt Heusser.  The fact is, there was not a seat left in the room; a bunch of people (myself included) were sitting just outside, sipping wireless, power and coffee, and listening to Matt over speakers set up for that purpose.

I quite enjoyed Matt's presentation on Quality and manufacturing and software and... stuff.  I was astounded (and humbled) when he mentioned me toward the end.

I found myself in some great conversations, bailed over to the "birds of a feather" lunch session on testing in the cloud, hosted/moderated by Michael Dedolph, then wandered off to a couple of other presentations - and then was drawn into Doc Norton's presentation on "Growing into Excellence."  It was really solid, dealing with encouraging and growing people to do better and... yeah.  Good stuff.

This set up Linda Rising's presentation on Agile Mindsets and Quality.  It was...  Wow.  Consider the similarities between people who are willing and able to adopt multiple views, consider a variety of information and approaches, and select the appropriate solution based on the needs of the project.  Pretty agile, no?  How we communicate with people, starting in primary school and running through until finishing - high school or beyond - colors expectations based entirely on what we praise and reward.  Thus, the question is: what do we want to reward?  Yeah.  I quite liked that.

Day one ended with a session with Matt on metrics.  It was kind of a fun discussion on "Santa Claus, the Easter Bunny and Quality Metrics."  It also was a good way to wrap up the day.

My Day Two.

Day Two started out similarly to day one.  Had a nice breakfast with Matt Heusser and Michael Larsen and Ben Simo, grabbed a coffee to go and headed out for a very nice walk to the conference center.  It was quite nice - and remarkably short when one pays attention to where one is going.

The morning keynote was by Dale Emery on "Testing Quality In."  Yeah.  It's one of those topics we dance around at best, and generally reject as naive and simplistic.  Unless one considers another oft-cited testing truism - "Everything can be tested" - including requirements, design, everything.  Test stuff early and the product we get to "test" will be better.  In short, get testers involved early, participating as much, and as constructively and professionally, as possible - and things can be better.

Ben Simo gave a really solid talk on Software Investigation.  It was interesting in that I tweeted very little - instead, I simply listened and observed.  Ben has a style of presentation that I enjoy and appreciate.  Frankly, I suggest that you read his stuff.  It's good.  Find his blog here.

The "Birds of a Feather" session I went to was a roundtable discussion on a variety of topics.  Everything from Agile to Metrics to "How do we do things better."

I found myself in a conversation with a variety of bright people on software and perceptions and intent and goals.  Essentially: why do we do this, and how do we know we're done?  More on that in another post.

I had a listen to Venkat Moncompo talk on UX.  As we were speaking on similar topics I was curious to hear what he had to say.  Now, Michael Larsen gave a really nice summary in his live blog post, above.

I then got up and spoke on my take on UX and testing and stuff.  The gist of my presentation was: everything touches on UX.  Tests you can think of, interactions.  Most importantly, the reasonable expectations of a person who intends to use your software for a specific purpose - if they can't do that, the rest of the touchy-feely-kum-ba-ya stuff does not matter.  It was a good conversation and exchange of ideas - which is what I try to get going.

That is perhaps the greatest thing I ran into at this conference - folks were willing to discuss a point, debate an idea and examine concepts while asserting themselves, politely.

I know, weird.

My Day Three.

The third and final day for me in Portland was an all-day workshop on Software Testing.  Really simple, eh?

Software Testing Reloaded.  This is a session developed with Matt Heusser, that we have run several times now, that looks at what makes testing, well, testing.

We look at some of the truisms that get bounced around.  We look at the "things you must do to do good testing" assertions and, well, test them.  We come in with a set of goals and intentions, then add the topics that participants want to explore, discuss and generally spend some time diving into.

The first time we presented this, it was a leap of faith.  Now, it is just fun.  The weird thing is (yeah, there's that word again) that when we present it at conferences, we always end up with more people in the room at the end of the day than were there at the beginning.  It's kinda cool.  (Shameless plug: if you are not doing anything the week of November 19, Matt and I are doing this workshop in Potsdam, Germany at Agile Testing Days.  Check it out.)

That evening, Matt and I met up with the SQAUG group - a new, home-grown testing and QA discussion meetup type group in Portland.  We had a great time with them, talking about Complete Testing and sharing ideas and doing some exercises around that. Good times.

Home Again Home Again Jiggity Jig

Thursday I needed to fly home so I could be at meetings at my client site on Friday. 

What I found enjoyable about this conference was a couple of things. 

First, and I've already mentioned this, sessions, hallway conversations and round-tables were very enjoyable. People were happy to discuss ideas and share views and be nice about it.  There were very few people I met where I thought "What a prat." In fact, everyone was very nice and polite.  I kinda grooved on that. 

The other thing that I really liked was how relaxed everything was.  Now, that is not to say "not intense."  I came away each day with a feeling of "my brain is full."  I was mentally drained and exhilarated at the same time.  At many conferences I find myself physically exhausted and just wanting to curl up in a corner.  Here, at the end of each day, I felt I could go a bit longer, even when the conversations around the table with adult beverages went into the wee hours.

Yeah it was weird, in a really good way.