>> HENDRICKSON: Harry, my sponsor, says it's time to start. So, we were just debating whether Harry should get up and introduce me, and I decided he didn't have to because he is so comfortable there with his laptop. It seemed cruel to make him get up. So, instead I'll introduce Harry. This is Harry, in case you don't know Harry. Harry has been at Google for how long now? >> Eight months.
>> HENDRICKSON: Eight months. Harry is the model-based testing guy, among other amazing things, and collector of testing-related lyrics up on his website. And he's now blushing, I think, just a little bit. So, I'm here thanks to Harry. And I'm here to talk about Agile
testing, which of course makes me curious about what you guys are up to. So, out of
curiosity, how many of you are currently working on a project that you would describe as Agile, whatever flavor? Oh, well, one? Were you all just attracted by the title? Is this flexible testing? We're going to do yoga and testing combined? Is--okay. So, gee--seriously, one?
Wait a minute. I know you must be on a XP project, because I know you. This is Mark.
>> You know I've tried but [INDISTINCT]. >> HENDRICKSON: How many--how many of you
were trying but not there yet, on Agile? Oh, okay, okay. Not--okay, and we may have some
disagreements about how hard we're really trying on that. All right, well, I confess
I'm surprised because I do know a few people at Google and everybody I know at Google is an XPer--extreme programming. So, how many of you have ever heard of extreme programming?
Oh, good. We're getting better. A few of you still haven't, so that's okay. I'll explain
a little bit about what extreme programming is. And I need to explain a little bit about
how I came to be giving this talk. At this point, I have given a version of this talk
several times and it keeps evolving. As my beliefs and understanding and perspectives
on Agile testing are evolving. I first started hearing about this Agile stuff several years
ago, and initially I was extremely skeptical. It sounded like yet another buzzword, yet another flavor-of-the-month fad that was going to go away. Anybody remember computer-aided software
engineering? That had a real long life, right? So, I figured, I could ignore this safely.
But then I started reading and I became intrigued, because it seemed like these Agile people had answers for stuff that we've been struggling with in software for a really long time. Oh, I'd better find out more. So, then I attended a talk by Kent Beck. He's the guy who wrote Extreme Programming Explained: Embrace Change, the big industry-changing book that was published in, I believe, 1999. Well, in 2001, I heard him speak at Quality Week. And he basically said that QA people are a throwback to Tayloristic scientific time-and-motion
management kinds of things and that we are all irrelevant in the brave new world of extreme
programming. Now, I have to admire his courage because he said this to a standing room only
audience of quality professionals. He survived the experience. I have seen him since. It
was actually kind of funny because about 45 minutes in, he done with his--he was done
with his prepared talk. And he looked at his watch and he said, "Well, I'm done with my
prepared talk and I guess we're done because my time is up." And his track chair, the guy who's responsible for managing the time and the talks, stood up and said, "Well, actually
we scheduled you for a double track session. So, you've got 45 more minutes to answer questions."
Yeah. >> And he did.
>> HENDRICKSON: Yes, he did. Well, I am in fact test-obsessed. There are a few [INDISTINCT] left for those of you who might not have gotten yours; they're all in the
front row. This was in fact an enticement to get people to sit in the front row. I am, in fact, test-obsessed. I'm not just test-infected. I live, eat, breathe, and sleep testing.
I think it's fun. And I know that that makes me just a little bit odd. It is difficult
to explain to people who don't understand, you know, why anybody would think that software testing is fun, how I can find it so exciting. And so, as a test-obsessed person, I was surprised to hear somebody who I thought was test-infected say that testers were irrelevant. So I figured,
I got to go find out for myself what life is like for a tester on an XP project and
I got lucky enough to get a contract on an XP project where I was a designated tester
working with the development team on this XP project. And I had a blast. Have you ever gotten software--well, first of all, how many of you would identify yourselves as testers
of some flavor or another? Okay, so, those of you who are testers of some flavor or another.
Have you ever gotten software delivered to you to test, and it wouldn't launch at all? This has happened, right? That didn't happen once on this extreme programming project, not once.
Not only did I never get a DOA build. I didn't even have to wait for them to give me a build
because it was safe to check out whatever was the latest thing checked into source control, and it would work. Wow, that was a new experience for me. I was used to projects
where it could take weeks to integrate the software to get it to work at all. Much less,
basically, do what it's supposed to do. Yeah, there was stuff to find though, it wasn't
perfect. Reports of the demise of QA people were wildly optimistic. They needed me. I
felt loved and appreciated. Now, that's a weird feeling for a tester. Yeah. To feel loved and appreciated for doing what I loved to do--I was hooked. Never again was I going
to go back to a traditional project. So, since that time, that was about 18 months ago, I've
been working with extreme programming teams. Now, one of the reasons why my beliefs
and feelings and thinking about testing on extreme programming projects are morphing,
evolving, is because my role is starting to morph and evolve. The project that I'm
on right now, I've crossed over to what some might say is the dark side. I'm a programmer
on this project. I'm just another programmer with the development team. And let's face it, I'm probably the most junior programmer on this development team. But I'm the most
senior tester. And what that means, is that I have a perspective on this project that
none of the other developers do. I'm the one who's most likely to say, I think we ought
to write a test for this before we move forward. We're doing test-driven development. How many
of you have heard of test-driven development? Okay. And you all know that it's not a testing
methodology. It's a design methodology, right? So, I'm the one who's most likely to have
the little tester paranoia kick in and say, "I think, we ought to write a test, to drive
this little tiny subtle aspect of this functionality." And I'm the one who's going to say, "Yes,
we can too test that. Let's just figure out how." Right? So, I have different perspective
on that project and that's starting to shift my perspective about how testers can effectively
collaborate with Agile developers on Agile projects. But in the meantime, here, we've
got this talk that I'm going to give about, Agile testing that represents what I think
today. And what I think tomorrow might change. If you would like to get the slides, I didn't want to burn a bunch of paper to give to you. You can find them online, on my site, at qualitytree.com, AgileTesting-3.1.pdf. So, they're there. And we have to talk about traditional testing
before we can dig too deep into Agile testing. So, let's take a look at traditional testing.
With great optimism, the project manager releases the project plan. There's a big, daunting kick-off meeting. Several full-color printouts of the Gantt chart are distributed to every executive, and it looks like this, roughly. First, we're going to have an analysis phase. We're going to spend time analyzing the problem, because we know that if we don't think about it very hard, we can't get it right the first time. We've all been in this meeting, right? Where we're
having this discussion, so we're going to analyze it and then we're going to spend time
designing and about six months from now, we think we might understand enough about the
problem to begin coding. So then we're going to do a whole lot of coding and then we're
going to go into the testing, the bug fix phase, stabilize everything and then we're
going to release. Does this sound familiar? This never happens. If we lived in a perfect world
this might happen, we don't live in a perfect world. Instead, inevitably this happens. We
spend a whole lot of time analyzing and designing and coding our hearts out and then all of
a sudden at the very end they hand it off to somebody in an independent test group and say, "Test. Make sure that it's quality. It's your responsibility to assure the quality." And then we've got like three days to test and then we're going
to release. That looks kind of familiar, right? Okay. Now there's a whole body of knowledge
about how to do testing in this environment that has been developed over the last roughly three decades, maybe more. And these practices have worked in this context, and I honor these practices. We've done things like become the last defender of quality. Test departments
stand there and say, "It must get by me before you can release it to the poor hapless users.
I protect the user." Does that stance sound kind of familiar? Right? So--so we're the last
defender of quality. We want to employ strict change management. "You can't check in that
bug fix unless I okay it," right? So we're going to be very strict. We're trying to control
the chaos. We're going to make sure that we've done our homework. We've got detailed preparation,
planning, test design--excuse me. We're going to do a whole ton of documentation because
when it comes to that end to that release, we know that we're going to have to bring
in outside contractors who don't have a clue about what we've been doing for the last eight
months and they're going to have to get up to speed very quickly. So they have to have very
detailed test scripts. Sound familiar, right? We have to do these things. Strict entrance
and exit criteria: It can't come in to test until it passes the minimum acceptance test.
We will not accept it. It's not in the test phase until it has passed our gate. Heavyweight test automation, starting with commercial tools that do lots of record-and-playback kinds of things. We attempt to enforce the process in some cases. That's where the whole QA title
comes in. Because QA's more than testing, right? QA involves quality of the process
as well. So we attempt to enforce the process. There's only one problem with this: it works in a traditional context. It helps control the chaos, but it's anti-Agile. This is why the Agilists hate us: because we're trying to enforce process, we're trying to do strict change
management. But Agile is about embracing change. So, if we are accustomed to a traditional
test environment, how do we adjust our practices to contribute effectively in an Agile context?
Well, for starters the Agile context is very different from a traditional context. Instead
of releasing a big *** thing at the very end, we're releasing all throughout. Each
of these is a completed, releasable feature--or set of features--as we go through. We've got short iterations, lots of deliverables, and in fact that's what Agile means: delivering
a continuous stream of business value. The implication of that for us testers is that
we have to deliver a continuous stream of information. We don't have the opportunity
to prepare for six months before we start working on stuff. We have to be ready to start
testing day one of the project. Fortunately the code is ready for us to test as well.
It may be baby code. It doesn't do much, but what it does, it does, and we can test that.
Of course there is a warning here. Just because some manager somewhere--and you know who you are--says, "We're going Agile. Therefore, I want you to revise all of your test estimates to account for the fact that we're going Agile." Well, that's not enough to call it Agile,
right? Just because we dumped the documentation, compress the schedule and code up to the last
minute does not mean that we're releasing a continuous stream of value. So we have to
be careful about adopting Agile practices and make sure that it really does fit the
context. Typically, organizations achieve agility by adapting one or more of the
Agile methods that are out there. And here are just four examples. The most well known
is typically extreme programming followed by scrum. Extreme programming is a set of
very well defined practices, and I happen to know that Mark is an expert on XP. You
can raise your hand, this is Mark everybody and he's a--he's a Googler. You have other
experts on XP amongst you. I'm not an expert on XP; I'm merely a practitioner of XP. The
thing that has continued to amaze me is the misunderstandings in the industry in general
about what Extreme Programming is. There's this myth that it's the license to hack for
cowboys--anybody heard this? Extreme Programming must be this loosey-goosey kind of thing. It's
actually the most disciplined development methodology I have ever worked with. Developers
write automated unit tests before they write a line of code. So they've got a full suite of 100% automated unit tests that get run before they check in anything. Continuous
integration ensures that everything not only compiles all together, but actually functions. They automate their acceptance tests, and they take responsibility for working with the customer to automate acceptance tests. So as a result, the unit tests become executable specifications of what the code should do, and the automated acceptance tests become executable requirements.
That's pretty disciplined, right? Now they don't happen to love documentation a whole
lot, and that's because documentation--it's overhead. If stuff doesn't get to the end
user, if it never achieves--if it's never intended for a customer, if it's a byproduct
of the process rather than the thing that you sell or deliver, then in lean terms, it's
waste. Nobody likes waste, right? Now some waste is necessary. It's not possible to produce
software with zero documentation. Well, I take that back. I've seen groups try, it doesn't
work real well. You have to have some. But the question is, how do we minimize it so that we've got the maximum value for the minimum cost? And that's what Agile teams typically
look to do. Now, Scrum: where XP is really developer-centric--it's got some stuff about how you manage projects, but its real strength is in providing a set of practices that a development team can use to produce good software--Scrum is a project-management-focused set of practices that centers around big visible charts: how do you demonstrate progress, how do you show your customer that you're on track? So each of the different Agile methodologies
has its own strengths. It's got the audience that it speaks to, and a lot of organizations have found that they've gotten a lot of value out of wrapping extreme programming with Scrum, so that they get both the project management side and the development side. Other methodologies include Mary Poppendieck's [INDISTINCT] on Lean. She's written a whole ton about how you apply
the principles of Lean Manufacturing to software development. And what she wrote actually makes
sense. See, before I read her book, if you'd said you can apply Lean manufacturing to software
development, I would have said, "No, you can't. It's different; software is different." Well, she figured out how to do it because she's got such a rich background in both software development and in Lean Manufacturing. Crystal is Alistair Cockburn's [INDISTINCT] slightly less rigid
model for doing Agile software development, very developer centric. So there's a whole
bunch of stuff that's been written about how to do Agile and how to do it well. Now, when we testers are working with Agile teams, though, instead of being that last defender of quality,
instead of having this stance, we have to shift our stance. Now we focus on supporting
the team. Typically, from the perspective of either supporting the developers in the
developer facing side of things, moving the project forward or in supporting the customers
in figuring out whether the software was acceptable. Because ultimately we're working for one of
those two groups of people. And this is one of the areas where traditionalists don't really like the way that XP kind of oversimplifies the world and says, "Look, there are developers, and there are the customers, who are the customers for whatever the developers produce." These aren't necessarily the end-user customers. Customer, in the XP sense of the word, has a special meaning. It's capital-C Customer, and that person or set of people, they speak
with one voice to say, "This is what the software should do." And then they get to say, "And
I agree that you've done it. Thank you very much. Good job." So, as testers, we may be
supporting the developers in moving the project forward, we may be supporting the customers
in figuring out whether they got what they were trying to pay for. We want to focus on
shifting our practices with two key ideas in mind. One is, let's increase the speed of the feedback loop, and two is reducing waste. Let's reduce the number
of byproducts that we have to create in order to get the information that our stakeholders
want. And we want to remember that testing provides information so that we can reduce risk, because that's ultimately what we do, right? No matter how we do it. So let's compare and contrast,
Traditional versus Agile. The Traditional attitude towards change is "We've got to manage
it and control it." The Agile attitude is "Hey, change happens. We might as well be
ready for it." The Traditional attitude towards planning, "We'd better do comprehensive upfront
test design because if we don't do it at the beginning, we're just going to get absolutely
bulldozed at the end." Whereas the Agile approach is, "Yeah, we've got a plan. We plan a little, but we're going to plan as we go." Documentation tends to be verbose. It serves
a couple of purposes: one is communication, and two is "cover your assets" on a Traditional project. We want to make sure that we can't possibly be blamed when things go wrong. On
Agile, we want to do as much as necessary but no more. Handoffs, well, we've got formal
entrance and exit criteria on a--on traditional projects. On Agile projects, it's not a relay
race. We're going to collaborate because we're all one team. Test automation on traditional
projects tends to be very tool-centric focused on the commercial tools and it tends to be
built by specialists; there will be a separate test automation department. By contrast, on Agile projects, everybody is testing, and the developers are continually automating their tests; we can leverage that. So let's look at these in just a wee bit more detail. To embrace change, we have to be able to minimize duplication and plan for maintenance, right?
So at one point I was working with one of my clients to help them assess various testing
tools. And we were meeting with all of the major vendors and getting them to comment
on our project. Now, our project was a somewhat special context. It was not an extreme programming project, but it was a very early-stage project. This was a product that was expected to be released in about three to four years, in a medical device company. Now, for the stuff we were working on early on, we wanted to do automated tests that were end-to-end tests
of the whole system. And the system included stuff that a clinician would use that connected
to embedded stuff that was the guts of the thing and then that connected to proprietary
hardware that ultimately controlled treatment to a patient. So there's a whole lot of moving
parts in this thing. And we wanted to make sure at the earliest stage possible that all the stuff was working well together, and that you could run a whole lot of cycles through it. Makes sense--test everything early on, right? The way we were going to drive this was all
the way from the UI down to the hardware and then back to the UI. We felt that that was
a good way to find this information out. So one of the questions that we had for every
vendor that we talked to was: talk to us about maintenance, because we know that this user interface that's here right now is just a quickly tossed-up mock-up. And we know it's going to be changing over the course of the next three years. So what's it going to look like to maintain tests in your environment?
One of the vendors had the gall to say to us "Oh well, you should nail down the user
interface first." Can you imagine? A three-year project, and they want us to nail down what the user interface looks like first? Now, the user interface is important because it's
the thing that the clinician is going to use but don't you think it's a little bit more
important to nail down how that hardware is going to behave and how that controller software
is going to behave. And you know maybe the UI should wait until closer to the end and
we'll have a professional UI designer figure out what the UI should look like. But they
wanted us to nail down the UI first, and didn't think it was worth us using their tool to automate unless we were going to do that. We said, thanks for playing, we're done. And we went with another vendor, where we were able to more easily modify our tests as we went along because they had
better support for maintainability. So as you are working through the process of adopting
strategies for testing, you've got to bear in mind that change is going to happen; it's going to happen every month. How can we make sure that whatever it is that we're using is going to support that change? Speaking of change, we want to plan ahead, but not too far ahead, because anytime we plan further ahead than the project is ready to go, we risk wasting our time, because we may plan the stuff and then never have those stories be implemented. So on an Agile project, we don't want to plan further ahead than the next iteration, because we don't know what's going to be implemented after that. Right? Stuff that's on the list today might come off of the list tomorrow. This
is one of the reasons I tend to use very informal mechanisms to do planning and communication. I prefer using whiteboards, sticky notes, Wikis--is this the kind of stuff you guys use? Yeah. Largely? Okay. As opposed to databases or big Microsoft--sorry, I said the M word--Project charts and stuff, yeah. Okay. Because the more informal it is, the easier it is to change
when the universe changes around us. When I do my test documentation, I want to keep it simple and capture the essence. There are folks out there who say the best practice
is to detail every single step. Well, if you do that and you've got thousands upon thousands
of test cases, you end up producing thousands upon thousands of pages of documentation for
any given project--documentation that no customer ever sees the value of. The worst case I'd ever seen of over-documenting was one organization I went into where nobody was using the documentation. I mean, nobody--not the testers, not the people
who wrote it. They had a whole separate team set up for designing the tests, people who were tasked with documenting test cases. These people spent 90% of their time in Mercury TestDirector documenting tests. They would then sign the signoff form to send those tests
over to an entire group of people whose job was to execute the test cases. I had the opportunity
to talk at length with the people who were charged with executing the test cases. I said,
"How's this process working?" They kind of did the look around. You know the look around,
to make sure that nobody is listening in. And closed the door and they said, "We don't
use them." The people who were tasked with designing those test cases hadn't actually run the software in three years. They didn't know what it did anymore. This is an organization
that was spending millions every year on documentation that no one ever read. That's waste. So, I
prefer--first of all, I prefer it when my work gets used. It's just, you know, it's a quirk of mine. I like the stuff that I produce to have value. So, I want to make it as lightweight as possible. I want to reuse as much of it as possible. If I come up with a checklist, I'm going to reuse that checklist in as many places as I possibly can. And you all know that I don't mean copy and paste it everywhere, right? I mean, that's a whole other sin in
tech docu--in test documentation. Have you ever seen an organization, I know you guys
don't do this, so I'm not afraid of stepping on, you know, putting my foot in my mouth.
Have you ever seen a test organization that takes the requirement specification? You know
that big formal document that says, "The system shall blah, blah, blah, blah, blah. The system
will blah, blah, blah, blah, blah." They copy that into a new document and then they run
a macro to do a global search and replace on the "system shalls" to make it say "verify that the system does." This is a sin. We now have an unmaintainable document that nobody
knows what it means, right? Okay, so, I don't ever want to do that. I don't want to copy
and paste. I want to centralize. You guys are the knowledge repository of the internet.
You know about centralizing and distributing as well by the way. But, you know, centralize
that knowledge. Okay. Here is one example of lightweight test documentation that is
sanitized but essentially from a real project. Here, we've got a test number--that part is optional, but, you know, it's there if you want to be able to say "test number blah, blah, blah." A test number, a category, a test name: that was all the detail that I had for those tests. I didn't bother with any detailed steps. There is a cost to this. Very few people would have been able to run these tests unless they'd already been on the project. But it was enough information for us. Then "last executed"--notice all of a sudden this document is now serving two purposes. Oh, I love leverage. This is good, that we can make one document serve two purposes. The two purposes are test design and test results tracking: last executed, the result, and the configuration on which it was run. So, this is one example of very,
very lightweight documentation. Another example is a mind map. This is built in a program
called MindManager. There's a bunch of mind mapping software out there. Pick your favorite
flavor. The reason I like MindManager is that if you happen to be working on a hybrid project,
where on the one side you've got the Agilists and on the other side you've got the traditionalists, and the traditionalists keep banging on the Agilists for more documentation. This is great, because MindManager will spit out the most formal-looking Word documents you've ever seen. You give it a template, it takes something that looks kind of free-form like this and turns it into something that's good for military use--very handy. So, you can work in this, and it can spit out the formal stuff for you. So, this is another example of lightweight style. All right, the next thing we want to do is reduce those handoffs. This
is not a relay race. We're all one big team. That's one of the basic Agile concepts. We're
all--we're a team. We're not a series of independent departments. We're all responsible for the
outcome. And one of the biggest lessons I had to learn on XP projects is that it's way too easy, if there is a designated tester on an XP project, to kind of--almost do the traditional thing. It's sort of scary how easy it is for this to happen: to say, oh, that's a testing thing. Elizabeth's going to do that. I don't have to think about it. And the next thing
you know--even though this is an XP team and we're all test-infected--I'm sitting over here
hopelessly behind in my testing tasks because none of the stuff I'm doing is being tracked
with the rest of the project. So, we have to guard against this even with XP teams,
even with Agile teams. And make sure that the testing tasks are considered at the same
weight of priority as any other kind of infrastructure development task. Co-locating testers and
programmers is an excellent idea. It does not by itself guarantee communication. There's
this bizarre social thing that has happened on two projects now. I've been in a bullpen environment--now, this is typical of Agile projects, that you'll have this big room that's kind of like a war room or bullpen and everybody is all seated in there. And on two projects I was part--I was in the bullpen, I was with everybody else, except I had this table that was kind of off to the side. I mean, we're talking kind of--like, no more than three feet
off to the side. And entire days would go by without anybody from development ever talking
to me. Now, in this bizarre social dynamic, it's too easy to become isolated. So, I became
aggressive about sitting outside my designated area, in the rest of the bullpen. And this is what, as Agile testers, we end up having to do: just remind people that we're here, and we're here to help. And really, they saw me as there to help. They didn't see me as an impediment to progress. In fact--on one of the teams I had a couple of weeks where
I was out of town and then I came back. And the day that I came back, one of the developers
heard the door close. Very small office, and so everybody heard the door close. He did the prairie dog thing of popping his head up above the cubicle, saw me, and said,
"Elizabeth is back! Elizabeth is back!" So, I mean, I was--I was loved and appreciated
and felt all warm and fuzzy and part of the team. And yet, I still have to make sure that
I reminded the team that I was there. Have you ever been in the meeting where somebody says, "Okay, now everybody get out your razor blades because we're going to sign in blood on this
one." Yes, some of you have heard this. One of you--one of you back there has apparently
been on healthy projects your entire career, because you're back there going, "No, never seen that." [INDISTINCT] you've got a very sarcastic sense of humor. I've been in that
meeting too many times and it's painful, frankly. Because I mean, I understand the intent. The
intent is to say, really? Are we really sure we're ready to go? I understand the intent.
But there's a flip side to that intent, isn't there? Because "I'm going to hold you to it." Because somebody is going to come after me, because my signature is on the QA manager line, and if there is a failure in the field, I know that it will be my fault, because my signature is on that line. Well--this is a ritual with a purpose and a dark
side. So, let's take the purpose and find other ways to achieve the purpose without
all of that blame stuff that gets in the way. So, the purpose is to get agreement that we
are actually done. This is a conversation between the people who produce the software
and the people who are supposed to accept it. The people who produced the software are saying, "We're done. We can't think of anything else that we're supposed to do in order to meet the criteria that we think you have given us. We're done. We've met the criteria. We hit the bar; on we go." And the other side is supposed to say either, "Yeah, looks good.
Okay, I accept it. I agree with you. You're done with this; it's time to move on." Or they
say, "No, this does not meet what we thought we asked you to do." It's a very simple conversation
that results in either an accept or a reject. Well, so let's make it a simple conversation
that results in an accept or a reject. One of the things that you can do to do that is
have a ritual of exploratory testing all together as a team, a whole team--with the customers there and the developers there and everybody else who is involved and interested there--at the end of every iteration, to go through the features and see if you get agreement about whether or not the features are ready to be accepted, right? It's a lovely ritual
and it's one that makes people feel all warm and fuzzy because they get to see stuff working.
And by the way, if your iterations are a month or less in length, which they should be, you
get to have that warm fuzzy feeling on a regular basis. How cool is that? You could also have--I've
seen it be very successful to have the primary customer in the XP sense of that special capital
C word do a demo once a week for all the interested parties on the business stakeholder side of
things with all the engineers in the room, so everybody's one big happy family gathering
around the table all together and there's a demo and there's a projector and cool stuff
is happening. And they get that feedback--and some of the--most of the feedback is usually
positive because after all these Agile teams are employing good development practices so
you don't have embarrassing demos where the thing blue screens in the middle of the demo.
It basically works and there's some [INDISTINCT] and you get that feedback and you get it live,
how cool is that? So instead of signing in blood, let's find other rituals that accomplish
the same goal of having the conversation, "Do you accept it?" "Yes, I accept it" or
"No, I don't." Without all of the blame stuff thrown in. Now, test automation on Agile projects
is really interesting because they start automating before they even start coding. Coming from
a traditional background, this was a shock to me, but I now practice test-driven development--I
love test-driven development. I think it's the coolest thing since forever, I don't know.
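(A minimal sketch of that test-first rhythm, assuming JUnit 4 and a hypothetical DiscountCalculator class that is not from the talk: the test is written first, run to watch it fail, and only then is just enough production code written to make it pass.)

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class DiscountCalculatorTest {

        // Written before DiscountCalculator exists: run it, watch it fail (red),
        // then write just enough code to make it pass (green), then refactor.
        @Test
        public void appliesTenPercentDiscountForPreferredCustomers() {
            DiscountCalculator calc = new DiscountCalculator();
            assertEquals(90.00, calc.priceFor(100.00, true), 0.001);
        }

        @Test
        public void chargesFullPriceForRegularCustomers() {
            DiscountCalculator calc = new DiscountCalculator();
            assertEquals(100.00, calc.priceFor(100.00, false), 0.001);
        }
    }
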
Anyway, typically what you'll find is that developers on XP projects anyway--not all Agile, but XP projects--are using some flavor of xUnit. Whether it's JUnit, JSUnit, jWebUnit--pick your favorite unit--depending on the technology that you're working with, they are already doing automated tests in the language that they are already writing the code in. Typically Agile projects will then employ something like FIT/FitNesse, domain-specific languages--some kind of acceptance
level, system level test automation and this requires a collaboration between the people
who understand the system from a business standpoint who can specify what the test cases
should be in terms of business rules or in terms of behavior that they expect to see
and then the people who are able to write the fixtures that will support the automation
to make that happen. So typically that's what you see on Agile XP projects and this means
that we as testers have the opportunity to collaborate with the developers to write the
test automation. We don't have to put up with web pages that don't have IDs for anything, where we're trying to figure out, okay, I can get an array of all the input fields and then I can iterate down to the--you don't have to do that anymore, because the developers have already put in all the testability hooks to make their unit tests work. How cool, yes.
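(A sketch of what those hooks buy you, assuming Selenium WebDriver and a page whose developers gave every field a stable ID; the URL and IDs here are hypothetical. Instead of iterating over every input on the page, the test can address each element directly.)

    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.firefox.FirefoxDriver;

    public class LoginSmokeTest {
        public static void main(String[] args) {
            WebDriver driver = new FirefoxDriver();
            try {
                driver.get("http://localhost:8080/login");  // hypothetical app URL
                // Because the developers gave these fields IDs for their own tests,
                // we can target them directly instead of walking the whole DOM.
                driver.findElement(By.id("username")).sendKeys("tester");
                driver.findElement(By.id("password")).sendKeys("secret");
                driver.findElement(By.id("login-button")).click();
            } finally {
                driver.quit();
            }
        }
    }
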
Okay, you guys aren't as excited as I would expect about that. Maybe that's because you just never had to deal with completely untestable code. Has anybody ever had untestable code? Okay, well, those--you know, the five of you who have experienced untestable code--you should be, like, excited about the prospect of the testability hooks already being built in. Okay, the other thing that we can do is use that test automation to help support early
exploratory testing. So, common wisdom has it that you can't start testing until there's
an externally available interface against which to test, which means that if you're doing a GUI-based thing, you can't actually start testing until there's a GUI, right? Nah--yes, you can. Now, you can do manual--I mean manual testing. You can do manual testing against something that's completely not ready for an external interface--or for internal interaction--by using automation to help drive the app to a particular point. Then work with it manually through whatever interfaces you've got available to you, in whatever creative ways--and let's face it, we're all testers. Well, yes, actually: we're all testers. No matter what your title is, you are a tester. We are all testers, and we all occasionally have that moment of inspiration when we think, I'll bet if I do this, it will break. You can do that before the code is,
from a traditional standpoint, ready--but only because there's already automation built up that you can leverage to get the app to the point where you can do that thing, right? Then you've got other automation in place to help reset the app to another state so that you can do more malicious, horrible, evil, terrible things. By the way--I did tell you all that I'm test-obsessed; what I didn't tell you is that I am in fact an evil person, or at least so I've been told. Just never to people, always to software. All right, so this is my current view of Agile testing: that there are three
basic activities that are happening, and all of these are happening within the iteration, for the features that are being implemented now--whatever "now" means for you. One is automated acceptance or story testing. You may be wondering why that's all the way on the left, as though it's the first thing that's happening--because in some organizations it actually is. As weird as that may sound, some organizations do story-driven development, which means they define the acceptance criteria in terms of stories--I'm sorry, in terms of story tests. So they can run the story tests and watch them fail. So you watch this thing fail; you know that the story isn't implemented yet. Then you create a unit test, an automated unit test, that would implement some little itty-bitty tiny sliver of that story, and you watch it fail, because if it passes, you know the test is bad. If the test passes when there's nothing written whatsoever that should make it pass, it's probably not a good test. Just like if the test automation passes when the software under test is not installed--that's a bad sign. Okay, so now
you're doing automated unit testing; typically that's the developers. I've had some people say, "Well, we're doing Agile stuff because we're doing automated unit tests," but it's the testers who are doing the automated unit tests and the developers are developing the code. I will submit to you that, while that's better than no automated unit tests, it probably isn't the most Agile approach that they could be taking. So the developers are automating unit tests, and then we're augmenting that with manual exploratory testing--not manual scripted testing. See, if we know enough to script it manually so that we could hand it to anybody who knows how to operate a computer, I will suggest to you that's a test that should be automated, if it's that clear-cut. Which leaves us free to do the exciting stuff that
we haven't thought about yet until we put our hands on the keyboard and the malicious
side of our mind takes over and we think of the evil things we could do to this software.
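(One way to picture using that existing automation: a small driver that reuses the team's fixtures to park the app in a hard-to-reach state and then hands control to a human. This is a sketch; OrderFixture and its methods are hypothetical stand-ins for whatever automation the developers have already built.)

    // Hypothetical stand-in for the team's real automation fixtures.
    class OrderFixture {
        void login(String user) { /* drive the app's login via existing hooks */ }
        void createOrderWithLineItems(int n) { /* reuse developers' test APIs */ }
        void suspendOrderMidCheckout() { /* park the app in a deep state */ }
        void resetToCleanState() { /* recover so the next session starts clean */ }
    }

    public class ExploratorySessionDriver {
        public static void main(String[] args) throws Exception {
            // Reuse the team's existing automation to drive the app into a
            // deep, tedious-to-reach state...
            OrderFixture fixture = new OrderFixture();
            fixture.login("tester");
            fixture.createOrderWithLineItems(37);
            fixture.suspendOrderMidCheckout();

            // ...then hand the keyboard to a human for malicious, manual poking.
            System.out.println("App is parked mid-checkout. Explore away.");
            System.out.println("Press Enter when done to reset to a clean state.");
            System.in.read();

            fixture.resetToCleanState();
        }
    }
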
Okay, so that brings us to the end of that. That's my current thinking on this, and it's shared, by the way, amongst various people who are part of the Agile community. There is a huge Agile community--now, maybe you all are shy and you actually have all been very involved in the Agile community--but just in case, for those of you who didn't raise your hands, there's a huge community out there. There is an XP mailing list, there is a test-driven development mailing list, there is now one Agile conference in the US and other Agile conferences throughout the world. If this is an interesting thing to you, you have
internal resources--places that you could go internally to find out about the greater, broader world of people who believe that this is a better way to develop software than that slide I showed you at the beginning, with the testing all compressed to the very end. Okay, further reading: some books, and the list is in the slides, which again are on my site. Some thank-yous to some folks who reviewed early versions of this work, and then once again, you can find the slides on my site at that location. So, how far over my time slot did I go? We have time for questions--how marvelous. Any questions? Harry.
>> HARRY: So what kinds of bugs do you think make it past the testing that's done through the early process? What are you seeing in the field?
>> HENDRICKSON: Two kinds make it past--or it kind of falls into big categories. One is end-to-end or sequence related--okay, three big categories; see, I'm going to change my mind even as I'm answering the question. All right, I'll stick with three categories. End-to-end: so, things that involve multiple moving parts in the system that you aren't likely to be able to catch with a single unit test because they involve dependencies between multiple parts of the thing--so, like, integration tests or system-level end-to-end tests. Sequence
tests, you go through it once this way, you go through it again this other way and doing
it the first time sets something in some weird state that results in failing the second time
through--so that's a sequence-oriented test; the thing you're tweaking is the sequence. And the third thing is data related, where we didn't anticipate that the database could have nulls, for example, because we thought we were going to put "no nulls allowed" on that column, but we're dealing with legacy data and it turns out that historically nulls were allowed. So those are kind of the three big categories of stuff. Yes?
>> [INDISTINCT] do away with those [INDISTINCT] end to end?
>> HENDRICKSON: Because there's stuff that's much easier to catch with a unit test than with end-to-end tests. So if there is a method way buried down here--it lives down here in its little world, and it takes input from other stuff, and every now and again other stuff will throw in a null, and then it will barf because it doesn't know how to handle nulls--but from the user interface, which is about 17 layers of abstraction away from that one little method way down here, it's really hard to get that null down to that one little point down here from all the way up here. So what we
find is that each of the different layers of testing represent different questions that
we want to answer about the software. The unit tests answer the question, "Does it do
what I as a developer expect it to do?" and "Has it changed?" Because that's the other thing: the automated unit tests become change detectors. So let me give you an example of a bug that was caught by automated unit tests. It was testing regular expression stuff--have you ever done regular expression stuff? It's, like, gnarly to get that stuff just right. And so we had a series of tests that tested passing in these strings, to make sure that we get these responses back from this method that's doing regular expression transformations.
We had to make a change to implement a new feature. We made that change and all of a
sudden three of our automated unit tests broke, because we broke that regular expression.
So that was a change detector for us, if we hadn't had the automated unit tests we probably
wouldn't have caught that until it became a surprise in the field. Yes?
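(A sketch of what such a change detector might look like, assuming JUnit 4 and a hypothetical PhoneFormatter whose guts are regular expressions--none of this is the actual code from that project. Each assertion pins down one transformation; if a later change breaks the expression, these go red immediately instead of becoming a surprise in the field.)

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    // Hypothetical class under test, included so the example is self-contained.
    class PhoneFormatter {
        static String normalize(String raw) {
            String digits = raw.replaceAll("[^0-9]", "");
            if (digits.length() == 11 && digits.startsWith("1")) {
                digits = digits.substring(1);   // strip country code
            }
            return digits.replaceFirst("(\\d{3})(\\d{3})(\\d{4})", "($1) $2-$3");
        }
    }

    public class PhoneFormatterTest {

        @Test
        public void normalizesDottedNumbers() {
            assertEquals("(555) 123-4567", PhoneFormatter.normalize("555.123.4567"));
        }

        @Test
        public void normalizesBareDigits() {
            assertEquals("(555) 123-4567", PhoneFormatter.normalize("5551234567"));
        }

        @Test
        public void stripsLeadingCountryCode() {
            assertEquals("(555) 123-4567", PhoneFormatter.normalize("1-555-123-4567"));
        }
    }
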
>> You said that Agile teams [INDISTINCT] kind of behavior?
>> HENDRICKSON: Yeah. >> What about global teams?
>> HENDRICKSON: Okay. So the question was if Agile teams typically work in a bull pen
kind of area, what about global teams? I confess this is one of those areas where it is very
difficult to have that same level of camaraderie but some organizations have had success. One
of the ways that they get success is by doing a whole lot of face time, so that they build
up the camaraderie and the trust. Now if you look at the open source community although
not all open source projects practice Agile, you will find that there is a fair amount
of camaraderie and trust within open source projects and yet they are completely distributed.
So somehow they achieve that. The key, I think, is that camaraderie and trust, and you have to achieve that. One way of achieving it is by having everybody in the same room, but there are other ways of achieving it. Sorry, it's a very general answer, but video
conferencing helps. >> [INDISTINCT] has a bigger problem going
at [INDISTINCT] testers and developers, how do you get by and go through [INDISTINCT]
>> HENDRICKSON: Who has the biggest problem going Agile, testers or developers? Yes. It depends--it depends on your organization. Actually, my husband works for a company now; he's a product manager, so he's on the marketing side of things, and he's trying to push for Agile. So where the push for Agile comes from depends on your organization, and by
the way both the testers and the developers are resisting him in this case as well as
the manager. Because they don't--they don't see it yet. They don't--they don't quite get
the benefit to them yet whereas he sees it very clearly from a business standpoint. So
I think that you just have to take each individually and find out what their concerns are. In some cases--let's take XP--if you go into an organization and want to do XP, in some cases the developers are very comfy in their corner offices. They've gotten a whole--they worked really hard to get a door, and all of a sudden you're saying that you're going to relocate them into a bullpen environment and they're going to have to pair. That means that they're going to have to have somebody looking over their shoulder every minute that
they're coding. Well, for some developers that's a really scary thing. Now I can tell
you, as a developer who is in the middle of a project right now where I'm pairing, it's also the best professional experience that I've had, because as a team we're producing good code, and it feels good to write good stuff. So, anyway. Yes?
>> [INDISTINCT] advocates and [INDISTINCT] regression testing, who don't regression test, and I think [INDISTINCT] trying to help you whether [INDISTINCT] regression test. What I'm finding [INDISTINCT] other difference is that they have people with regression tests [INDISTINCT] as a result, break all the time, and so these people say, "Well, I've tried [INDISTINCT] tests and [INDISTINCT]." How did you [INDISTINCT] that? >> HENDRICKSON: Okay. So there's a principle
that somebody smarter than me put forth--I don't remember who it was, but on the XP mailing list I remember seeing this principle: if it hurts, do more of it. Now, that sounds counterintuitive, right? Because if it hurts, we want to do less of it, because we don't want the pain. But if you do more of it--the fact is that, out of self-preservation, you have to find ways to reduce the pain. Unless you
just become numb from hitting your head against the wall. But most people I find have enough
self interest that they're not just going to go numb, they're going to find ways to
address the pain. So where you've got these tests that are incredibly fragile--and, you know, I'm guilty of writing fragile tests, too; in fact, anybody who's ever done system-level tests quickly discovers that it's much easier to write fragile tests than to write robust tests, right?--run them more often, and every time they break, fix them. Don't let them languish in the unfixed state. The tests should always be green--well, the tests that you believe should pass should always be green--and commenting out doesn't count. No fair putting the X in front of the test keyword, right? So just the act of fixing them more often is going to yield--even if it's only in one person's head, but hopefully it's a shared team knowledge base--an understanding of what tends to make tests fragile for your context. At which point you start to learn what not to do because it's going to make things fragile, and you can start applying mocks better and using various techniques to break the dependencies, so that you do end up with tests where, when they fail, your first reaction is "Oh, the code broke" as opposed to "Oh, the test broke." Hey.
you mentioned that someday that the distinction between testers and developers actually goes
testers and developers go away? Maybe, I realized that's heresy. Okay. So let's go back to the
1980's when the big QA consultants--I was not one of them it's not my fault don't blame
me but they were all going around the quality consultant saying "Tests should be independent."
You all remember that mantra? Tests should be independent, test must report up to a vice
president who was completely independent from the vice president development because if
you don't do that you're being irresponsible and you must not be quality aware. You've
all heard that schpill, right? I bought into it for a long time, I don't buy into it anymore.
And the reason is because test is never independent no matter who they report to. Test reveals
information. Test yields information. If there is no audience for that information, they
don't need us, right? So, it doesn't matter that we're there. So presumably there's an
audience for that information. Now, if the audience for that information is development
then I don't care who my boss's boss's boss's boss's boss is. I work with development and
if the audience for that information is a product manager I don't care who my boss's
boss's boss's boss's boss is. I'm an agent for product management, right? So, there's
a distinction between developer and tester go away maybe and maybe that's not such a
bad thing for the developers and for testers. I tend to be one of those because I'm the
tester who codes and I confess I'm a geek. Now, there are also testers who specialize
in understanding the domain. They support the customers. They come up with good test
cases. Typically this is one of the hardest problems with XP: having the customer write good acceptance tests. Right, John? >> Got it.
>> HENDRICKSON: I'm picking on John. Hi, John! Because two years ago, I think it was, you invited me to speak at BXP on how we can get customers to write acceptance tests. So this has been
a point of pain for XP teams for a long time. And the answer is get somebody who knows something
about testing to help the customer, not be the substitute for the customer--and that's the key thing. We don't become a substitute. We testers don't become a substitute for anybody. We support the development effort, and/or we support the customer side in specifying and accepting the code. So, yes, I'm starting to think--and this is heresy--but I'm
starting to think that testing is a skill set, and some people love testing, like me. I love testing, and yet I'm finding that the lines are blurring a little bit, and it's becoming just another big skill set that's part of what I offer a team. Yes?
>> [INDISTINCT] >> HENDRICKSON: If the responsibility is blurring
between the tester and the developer, who's ultimately responsible? Some of you know what I'm going to say, right? Let's say it in unison: the team. See, this is one of the biggest mind shifts for Agile and one of the biggest areas that's difficult for management to accept, because management typically wants to know, who do I blame? At the end of the day, who do I come back to if it breaks? The buck stops here--where is "here"? Because I want to know whose desk that's on so that I can blame them, right?
>> Follow-up question: how do we make sure that the quality of the tests is sufficiently good? >> HENDRICKSON: Okay, so how do we make sure that the quality of the tests is sufficiently good? Well, that's a slightly more complex answer. You can use code coverage analysis tools to figure out what coverage you're getting.
But there is no one person who is responsible for the quality of the unit tests. The responsibility belongs to--all together in unison--the team. And this is one of the reasons why extreme programming teams believe in shared code ownership. So, it's not like this is my section of the code and that is your section of the code. We all work on the code, and the code starts to look like any one of us could have written it. Yes?
>> Yes, I have a couple of [INDISTINCT] to what it is--first of all the team...
>> HENDRICKSON: This does not surprise me that somebody would object to the heresy that
I have just committed. Yes? >> The team first: I think [INDISTINCT] responsibility of the team, it sounds very good, but in the end you're basically putting responsibility on [INDISTINCT] I think it's very easy to get away with [INDISTINCT] because it's the team, not just me. I think it's important to name one person. Another thing is that you say,
>> HENDRICKSON: That was supposed to be green. >> Well, okay. But in practice, I'm sure [INDISTINCT] when you said that, but do you practice it, perhaps, right? You know that [INDISTINCT] is red in the morning. It needs to be fixed, and you just don't have time, because you have other [INDISTINCT] tests that you need to fix first; you know these tests are going to stay red for a couple of weeks. [INDISTINCT] what is the reality [INDISTINCT] I think the solution is, where you say [INDISTINCT], what can the test framework do to help you keep track of those things, to make sure that they don't slip [INDISTINCT]
>> HENDRICKSON: Okay. So am I allowed to invoke the M word again? Microsoft. They shipped a product that, according to the popular press--and we all know how messed up the popular press gets reality--but according to the popular press, they shipped with 63,000 bugs, and they were vilified in the press for doing that. And they've gotten a whole lot of bad press for having shipped stuff that's got security flaws, and, you know, there's a whole debate that we could go into there, but that number, 63,000, just fascinated me, because I wonder what kind of enterprise-class bug tracking system you have to have to manage 63,000 bugs. At which point it dawned on me that if you have to put that much effort into tracking bugs, maybe you need fewer bugs instead of better bug tracking systems. So going back to what you were saying about failed tests: if you've got that many failed tests, you're incurring technical debt. Now, this is a term for something that all of us have experienced, whether or not we put a label to it: anytime you're working with code that's fragile, where you know if somebody changes that much, well, bad things are going to happen, so we don't touch that module unless it's absolutely necessary. And if it is absolutely necessary, we get out the full, you know, fire protection suits, because we're going to have to go into the breach to get into this code, that kind of bad, scary code. Well, let's not find special ways of managing that pain; let's fix it, right? So anytime I hear about tests having to stay red, I'll remind you that that's
a choice. Now, maybe a choice that you're making with a full perspective on all of the
business pressures that you're under, but I'll suggest it's a very expensive choice, because that means that you can't trust your test runs. They go red, and the first thing you're going to say to yourself is, "Oh, that's just the test breaking," or, "We know about that."
Once upon a time, on a project, I had a test suite that was kind of like that: I had a bunch of things that were failing for reasons that we didn't care about; this was feedback we didn't care about. And I went through and looked--we had 75 failures or something, and I looked at the first 38 of them, and they were all for this one reason that we didn't care about. I didn't look at the remaining ones, and it turns out I missed a horrible bug, because I got lulled into thinking that all of my failures were in that class. So the tests must continue to reveal interesting feedback and information; otherwise you'll lose the value of even the good tests, right? I suggest that if there's really one that you're not going to fix for a couple of weeks, move it into a different test suite with big red flags all over it saying, this is a known bug and we're going to ship with this issue. You asked one,
so I'm going to go over here and then I'll come...
>> [INDISTINCT] >> HENDRICKSON: Oh, okay. Yes?
>> [INDISTINCT] >> HENDRICKSON: Oh, thank you. Yes.
>> [INDISTINCT] slip, because if you stop testing at that point, well, then your code is going to start slipping too, because if you test, your design changes when you test, and if you start letting your design slip [INDISTINCT] and so I have to say, overall, it's a very bad thing [INDISTINCT] >> HENDRICKSON: Sir, I'm so glad that you
said "broken windows." Malcolm Gladwell, in The Tipping Point, wrote about the broken window phenomenon, where they found--it was in New York City--that neighborhoods that had broken windows tended to have worse violence and worse crime. And they did simple things like erasing graffiti and repairing windows, and the crime rate went down, just because it appeared that people cared more. The pragmatic programmers, Andy Hunt and David Thomas, wrote about this in their book, The Pragmatic Programmer, drawing the analogy with the little breakages in code where we kind of say, "Oh, well, that's okay. We'll deal with that one later," right? Well, it turns out it has the same effect, because then we say "that's okay" to more things, stuff that we didn't have to say "that's okay" to. Yes?
>> [INDISTINCT] >> HENDRICKSON: They look at it and say, it works, so why are you going to change it? So, refactoring--just in case you don't know the official definition of refactoring: refactoring is changing the implementation without changing the behavior, essentially cleaning up. Now, some managers believe that this should be unnecessary, because if it works, it just works; don't touch it. But the fact is that by refactoring you get more maintainable code, so the next change is easier to implement. You reduce your technical debt when you refactor. Yes?
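(A tiny sketch of behavior-preserving cleanup; the pricing code is hypothetical and not from the talk. Both versions compute the same answers, so the same tests stay green, but the refactored one is cheaper to change.)

    public class Pricing {
        private static final double PREFERRED_DISCOUNT = 0.10;
        private static final double BULK_DISCOUNT = 0.05;

        // Before refactoring: duplicated, harder-to-change logic.
        static double priceBefore(double base, boolean preferred, boolean bulk) {
            if (preferred && bulk) return base - base * 0.10 - base * 0.05;
            if (preferred) return base - base * 0.10;
            if (bulk) return base - base * 0.05;
            return base;
        }

        // After refactoring: identical behavior (the same tests pass),
        // but adding a third discount is now a one-line change.
        static double priceAfter(double base, boolean preferred, boolean bulk) {
            double discount = 0.0;
            if (preferred) discount += PREFERRED_DISCOUNT;
            if (bulk) discount += BULK_DISCOUNT;
            return base * (1.0 - discount);
        }
    }
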
>> [INDISTINCT] question about moving cycles, it sounded like [INDISTINCT] team should be
in sort of a longer process and what if changes were coming out every week or two weeks, how
does this solve [INDISTINCT] >> HENDRICKSON: Okay, the question is, what if changes are coming every week or two weeks, how does this work with Agile? Well, that is Agile. The changes happen that frequently, and I'm sorry if I left you with the impression that there were longer stretches of time. On the Agile projects that I've been on, iterations are typically two weeks. They're very quick-turnaround kinds of things, and they should be; it should be small, incremental changes so you get that continuous stream of value. So it works very well with Agile--which is why I was giving you the puzzled look, because that's exactly what you mean by Agile. Harry, it is now time. I thank you all so much. Wow, thank you.