STEVE MILLER: All right.
I'd like to welcome everybody to today's webinar entitled
Agile Testing Challenges--
Overcoming Quality, Process, And Team Issues.
We've got an all-star panel today including Lisa Crispin,
Bob Galen, Matt Heusser, and Steve Miller, yours truly.
What we're going to cover today is we're really going to
talk about agile testing.
We talk a lot about agile development, but not a lot of
attention gets paid to agile testing and the unique
challenges that we face as agile teams.
So we're going to first talk about misconceptions that
people have about agile testing.
We're going to look at the top agile testing challenges.
We're going to talk about maintaining quality as we do
sprints, and we know that sprints move at high speed.
We're going to talk about establishing
the whole team view.
We're going to talk about best ways to use testing tools.
And then any questions that you have during the session
today, we're going to try and get to as many as we can, so
make sure you use that Q&A panel to do that.
A few housekeeping notes.
Make sure that you use the Q&A panel that comes with
GoToWebinar to ask any questions.
We're going to keep our eye on that panel, and in fact, if we
start to get an overwhelming number of responses within the
Q&A panel, we may cut some of our pre-planned questions
short and go ahead and focus on the things that you really
care about so that we can have some very lively discussions
around that.
Note also that we're going to be sharing the event live.
We're using the hashtag #agiletesting for that.
We'll also be conducting a follow-up session on the same
hashtag directly following the presentation today.
Also another thing, we're going to be recording today's
session, and we'll be sending out the recorded session
within 24 hours following the event, so make sure you keep
an eye on your emails for instructions on how to access
the recording.
My name is Steve Miller, and I wanted to
welcome everybody today.
I am the Vice President of ALM Solutions here
at SmartBear Software.
I have a little over 26 years of experience in software
development, software architecture,
and software testing.
I am a graduate of the University of Alabama--
roll Tide--
and I have a Bachelor of Science degree majoring in
corporate finance and minoring in computer science.
Now, one of the things I'm asked a lot is, Steve, what's
one of your favorite misconceptions about agile and
agile testing?
And a lot of people think that because agile is so
lightweight, there's really no need to trace your tests back to your user stories and requirements.
And that's a big misconception, because no
matter what type of testing you do, you definitely want to
be able to go to a place where you can see that you have enough test coverage to fully test all of the requirements and user stories that are out there.
So traceability, in my mind, really breeds transparency,
and we all know that agile is all about transparency.
Now I'd like to introduce Lisa Crispin.
Lisa is the co-author, with Janet Gregory, of Agile
Testing, and we're really fortunate to have Lisa on
board today.
I also wanted to let you know that we're also going to be
giving 10 signed copies of her book away today for people who
attend today's webinar, so we're very fortunate that
she's been gracious to provide that.
Lisa also enjoys sharing her experiences via writing.
She presents a lot, she teaches a lot, and she
participates in agile testing communities around the world.
If you need to get in touch with Lisa, make sure you visit
her website at www.lisacrispin.com.
So without further ado, I'd like to ask Lisa what her
favorite misconception about agile is.
Go ahead, Lisa.
LISA CRISPIN: Thanks, Steve.
There are so many misconceptions, but my
favorite is that people think "agile" means "faster," possibly because we use terms such as "sprint," which implies speed.
But I love Elisabeth Hendrickson's
definition of agile.
It means we deliver business value frequently at a
sustainable pace, and in order to learn how to do that at a
sustainable pace and keep our technical debt under control,
we need a huge investment of time up front to learn all the
practices and principles that go into allowing a team to
work at frequent iterations, and keep their
technical debt low so that they eventually can work fast.
But then, if you focus on speed, you're
going to be in trouble.
Focusing on quality does allow us to go fast, but it's a very
long term thing.
STEVE MILLER: All right.
Thank you, Lisa.
I also wanted to introduce Bob Galen. Bob is an agile coach. He's both a Scrum Master and a Scrum Product Owner,
and he's been very active in the Agile Alliance as well as
the Scrum Alliance.
He also has a book entitled Scrum Product Ownership--
Balancing Value From The Inside Out.
You can also reach Bob at bob@rgalen.com.
Now Bob, I'll ask you the same question.
What are some of the misconceptions that you've
heard about that you'd like to bring light to
regarding agile testing?
BOB GALEN: Well, there's a lot of them.
And thank you, Steve, for inviting me in.
As Lisa alluded to, there's quite a few of them, so
picking a favorite--
I don't even know if "favorite misconception" is the right
terminology, but one that comes to my mind is this
notion that agile testing is 100% a technology play, a technologist or programming play, an automation play.
And that you simply throw things together early on, and
you automate everything, and you go and you push a button
before you leave at night, and you come in in the morning and
magic happens, and information flow happens.
And you use things like the DDs--
ATDD and BDD.
I almost feel like I'm stuttering sometimes.
But you put all of this alphabet soup together, and
that's really important.
Automation is, I think, an incredibly important part of
agile testing, but it's not the only part.
Tooling is not the only part.
Automation's not the only part.
It's factoring in the people, the testers, the brains.
We were talking in our pre-meeting yesterday about
what the most important tool is that a tester can have, and
I think Matt said the brain, and I want to amplify that.
It's much more complex than that.
Automation plays a part, but I think good agile testing
requires testing professionals to come in there and really be
passionate about the craft of testing.
So that's my favorite, or lack of favorite, misconception.
STEVE MILLER: Very good.
Thank you, Bob.
OK.
I also wanted to introduce Matt Heusser, and he is a
consultant and software tester, and he
writes a lot as well.
He writes for Software Test And
Quality Assurance magazine.
He's also the lead editor on "How To Reduce The Cost Of
Software Testing." He also was the first master of ceremonies
for the Great Lakes Software Excellence Conference back in
2006, and he's also currently a principal consultant at
Excelon Development.
Now, Matt, I'd also like to present that
same question to you.
What's your favorite
misconception about agile testing?
MATT HEUSSER: If I can build a little bit on what Bob was
saying earlier, on the one hand, we have this idea that's
kind of dogmatic.
We're going to boil the ocean, and we're going to implement
agile, change the way we do business and exactly do it
according to this book that we read.
And we're going to do it right.
Which is this non-flexible, non-responding-to-change way to approach an agile conversion.
And the second, I guess I'd say, bad implementation is
we're going to tailor our approach, so
we're going to do things--
we're not quite going to do this, and we're not quite
going to do that.
You call it tailoring, and what you end up with is
exactly what you did before, only you've inserted the word
"agile." So you get agile Gantt charts, and you get
agile business requirements, and agile testings.
It's pretty much what you did before, except now you have
maybe iterations or maybe cards going across the wall.
And both those end up with comments like, "We tried agile
and it didn't work." And that just makes me sad.
STEVE MILLER: All right.
We don't want you to be sad.
OK.
So now what I'd like to do is I'd like to go ahead and have
some frequently asked questions that we get a lot of
times regarding agile and agile testing.
And just present these questions over to the
panelists, and then have them provide some input and some
feedback based on their years of experience.
Because we have a really rich set of our panelists today.
They've had a lot of really great experiences that we want
to make sure that we're able to tap their brain for what
they think about this.
Now, all of us have probably, at some time, participated in
both agile and waterfall development, and we are very
sure that the testing needs of those two different
methodologies are very different.
The reason for it is agile is much speedier.
You have much lighter-weight requirements.
You're doing continuous integration, so you're doing
builds constantly.
And that really does present a lot of challenges for the
agile testing team, because the whole way that you
approach testing really is under fire, and it's much
different than you can do it over in waterfall, when you
have much longer time to plan things out, and you do it in
more of a staged environment.
So I'd first like to ask Matt, what are some of the top agile
testing challenges that you've come across, and how have you
dealt with those?
MATT HEUSSER: Well, I'll try to stick to testing.
But to start with, this applies to all types of agile
software development.
I mentioned that double edged sword earlier.
And when you want to start doing agile development,
you've got to go somewhere between this dogmatic
implementation, boil the ocean, we're going to do
everything according to the book.
If you try that, usually in my experience, as soon as
anything goes wrong--
and it will, because we're all human beings--
the people that are opposed to the initiative will come out
and say, ah ha, it didn't work.
And then the second problem is not changing enough.
So the first question is, what do we do first?
And typically for testing, where I see that is how do we
shrink the duration of time it takes us to test the software
to make sure that it's good enough to release?
Of course, we can never be sure, but to have some confidence in that regression test window-- I think the Poppendiecks called this the cadence-- from when we want to start figuring out whether we can release this thing, to let's put it out there.
In a traditional shop, that might be weeks, and we might
want to get that down to hours.
That's a significant challenge.
And as soon as we say, OK, we're going to shrink that
window, we can't stop developing software in order
to do that.
So now we have to figure out how we're going
to invest our time.
Do I just get this done and released, or do I build
some automation and some hooks and some investments to allow
me to test faster?
Those are a couple that come to mind.
STEVE MILLER: OK, great.
Thank you.
And Lisa, did you have anything to add to that?
Because we all know that it's kind of like drinking from a
fire hose sometimes whenever you do agile testing,
because builds are coming continuously, you're always
having to test with code that has changed.
How do you address that?
LISA CRISPIN: Well, building on what Matt said, I think one
thing that's very hard to learn is how to slice features
into small enough pieces so that you can work
incrementally as well as iteratively, and start with
some simple, vertical slice through a new piece of
functionality or feature.
Get that coded, tested, automated, everything
explored, everything that's needed, and then start
building on that.
It takes time to learn that, and so another key is building
time to learn into your first several months, or even your
first couple of years of work.
Another challenge for larger companies
is specialized testing.
Perhaps only a few people out of 50 Scrum teams know how to
do performance and load testing, or know how to do
security testing, or some specialty that's just not
widely known.
And how do you spread those people around?
So lots of challenges that require a lot of experiments
to try to solve.
STEVE MILLER: Thank you, Lisa.
And then Bob, to follow back up on this too, we had
somebody graciously put into the Q&A panel that doing agile
is like modifying the plane during your flight.
Because basically, you're coming out with these small
user stories, you're tweaking them as they go, and that has
an impact on testing because you have to get yourself to a
point where you can actually test those new changing
requirements.
So, what are your thoughts on that, Bob?
BOB GALEN: I mean, it's fair.
I think that question is valid, and both Lisa and Matt
were talking about it.
I want to bump us up a level a bit and
talk about agile testing.
One of the biggest challenges is it's not
just about agile testing.
It's not just a test group problem or a tester problem.
If a company or an organization is going agile,
then they're going agile organizationally.
They're going there as a whole team.
So for example, given that question, well, architectural
decomposition is different.
Product composition, UX, is different in agile.
So from a whole team perspective, it's not just
dump it on the testers and let them figure it out.
Not that anyone was implying that, but it's that the UX folks maybe need to incrementally change, and the architecture
folks, and the development folks, and the BAs need to
incrementally change how they do work, how they flow, how
they collaborate with test in order for it to be effective.
It's not in a vacuum.
It's not an across the walls sort of play.
It's a whole team play.
And I want to emphasize that I think the key challenge is
that the organizational leadership needs to play a
part in that.
You don't just let the teams flounder and figure
out how to be agile.
I think leadership needs to understand that agile's not a speed play, to Lisa's point.
That it can be a speed play, but that's not the intent.
So, you need leadership awareness.
You need leadership support.
You need organizational leadership, guidance and
understanding, I think, in order to be effective.
And then, they need to influence the entire
organization, the development organization, the product
development, the project managers, et cetera.
Otherwise, you have these little groups that are
struggling in and of themselves to deliver value.
STEVE MILLER: Very good.
Thank you very much.
One of the other things that I was wondering about, based on
your years and years of experience in managing these
agile testing processes, have you identified any specific
best practices or processes that help you when it comes to
quality and making sure you can keep up with the speed of
delivery of the sprints?
Is there anything specifically that stands out in your mind?
Lisa, if you wouldn't mind taking that
first, that'd be great.
LISA CRISPIN: Well, I think one key is not so much an
agile testing process as just a team process in general.
One of the most common problems I see with new agile
teams is that they over commit.
They want to make the business people happy.
They think they can get a lot of work done in a week or two
weeks, or however long, and they bring in too many
stories, work on too many stories at the same time, and
they get to the end of the iteration, and testing's not
finished, or maybe the coding is not finished.
And so, it's important to under commit, take only a few
stories, focus on finishing one story at a time, including
all the testing activities, and limit
your work in process.
And so, I think that is really key.
And I think once people get good at some of the agile
practices, they still have trouble understanding the
customer's requirements and specifications.
And that's another big problem.
You may not have any, technically, bugs, but you
didn't deliver exactly what the customers wanted.
So I think things like the Three Amigos approach, where
you get the developers, product owner, and tester
together to discuss issues.
The whole team approach in general, working together and
doing some kind of specification by example or
acceptance test-driven development, where we actually
start by thinking about what we're going to test, and then
we start writing the code to make those tests pass, and to
get examples of desired behavior and undesired
behavior from the customers, and translate those into tests
that drive development.
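To make that concrete, here is a minimal sketch in Python of the workflow Lisa describes-- customer examples of desired and undesired behavior turned into executable tests that drive the code. The feature, the bulk_discount function, and the example values are all hypothetical, invented for illustration rather than taken from the panel:

    # Examples gathered from the customer during a Three Amigos session.
    # Each row: (order total, expected discount). Desired behavior first,
    # then "undesired behavior" examples (no discount below the threshold).
    EXAMPLES = [
        (1000.00, 100.00),  # orders of $1000 or more get 10% off
        (1500.00, 150.00),
        (999.99, 0.00),     # just under the threshold: no discount
        (0.00, 0.00),
    ]

    def bulk_discount(order_total):
        """Written *after* the examples above, to make them pass."""
        return round(order_total * 0.10, 2) if order_total >= 1000.00 else 0.00

    if __name__ == "__main__":
        for total, expected in EXAMPLES:
            actual = bulk_discount(total)
            assert actual == expected, f"{total}: expected {expected}, got {actual}"
        print("All customer examples pass.")

The point is the order of events: the examples exist and fail first, and the code is then written to make them pass.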
STEVE MILLER: OK.
So it sounds like prioritizing your user stories is really
something that we should wrap our head around, because as
you said, no matter what type of development you're doing,
whether it's waterfall, or RAD, or agile, there's still
just a certain number of hours in a day, and we have to
prioritize things.
So I can definitely see that.
And I really like your idea about using UAT to determine
whether we're delivering the things that are really
necessary for the clients to be successful, so I think
that's very good feedback.
So Bob, what do you have to add to this, the best
practices and processes, here?
BOB GALEN: I was yelling, "Amen, Lisa," when she was
talking about taking too much on.
I think test groups and agile teams themselves have different levels of maturity.
And I think one anti-pattern or one pattern I see in young
or immature agile teams is they take way
too much stuff on.
Their hearts are in the right place, so they're not
malicious, but they're just taking way too much on, and
then they struggle.
The second thing that Lisa didn't say is they struggle
asking for help, in the team and then outside of the team,
saying, whoa, we just took way too much on.
And they may not trust that people can handle requests for
help in the team.
So team trust, team buy in, team cohesion comes into play.
But I do see that as a problem.
And then you're going for speed, and you're not going
for quality.
And you want to flip that bit around.
You want to go for the Three Amigos.
You want to go for collaboration.
I would rather a team delivered half of their
commitment from a sprint and did it well than didn't admit
that they were struggling, or tried to over commit and slapdash it together at the end.
Now, you do see that in a lot of teams, because they get the
value proposition wrong, they get the
intent proposition wrong.
And then just quickly, they have to transition that.
They have to flip that bit.
As a coach, I often ask for half of what they think.
So I take all of that pressure for speed away, and I'm like,
I'll take half your commitment if you do it well.
Now, by reducing width, figure out how you can do it most effectively and efficiently as a team.
And the team starts getting that muscle memory around
throughput and quality, and then they can take more on, if
they choose to.
STEVE MILLER: Absolutely.
I think that's really relevant, because I think a
lot of people have the idea that you just start doing
agile, and all the problems of software development go away, and all of the problems of testing go away.
And we all know that it's like anything else that you do.
It takes maturity, it takes people really honing their
skills and the processes to really get the most out of it.
So Matt, we'll follow up with you as well.
I'd like to get your input on this as well.
Have you used any best practices or processes in the
past that have been more effective than others?
MATT HEUSSER: So, I belong to a community called the
Context-Driven School Of Software Testing, and we
actually censor the term "best practice." We aren't allowed
to use it in public.
People have been removed from the community for using that
term, in my community.
Not everybody belongs to it, and that's OK.
This is agile, not context-driven.
But I do think that practices can be better or worse in a
given context, and there are a number of things that are
generally good which are generally helpful, especially
in an agile context.
So for example, is the team actually, really doing
[INAUDIBLE]?
Are we really having kick off meetings before anybody writes
any code to build a shared mental model of what the
thing's going to do?
Do we have real agreement over what the thing is going to do,
to the point that when I do find a bug, maybe it's 3/4 of
the way into development of that story, the developer
says, yep, yep, yep.
That's right, that's a bug, we've got to fix it.
Or do we fall into this trap of, well, there's no story
test for that, so we've got to have a new story?
And the product manager's going, no, it
needs to go live Friday.
It really needs to go live.
We don't want to have this explosion of stories that were
created because we weren't on the same page.
I think those are all really good things, and those are the
foundational aspects of the agile context that I think
we're sharing here today.
One more thing I'll mention is this mentality, especially for
a team that is new to this way of developing software, that
velocity is king, and we're going to make shortcuts to our
master, King Velocity.
And of course, it doesn't take long, usually within the same
iteration, within the same month, certainly.
You're working overtime, you're making mistakes, and
you're actually slowing yourself down, and you can
feel the pain.
Even in the short term, that approach doesn't work.
And getting that culture change to the point where
people recognize that can take a little bit of work.
At an extended client engagement a while back, the technical team said, we don't have time to develop this automation-- I think it was adopting unit testing-- within this sprint,
we're not going to be able to do it.
We're sorry.
And the product owner said something like, I believe in
the value of this practice.
Write a story up for it, and I'll fund it.
Right?
I mean, it's OK, because I know that's going
to go faster later.
And the team didn't believe it.
It was like they couldn't see it, and they didn't do it.
And I think that mentality of cutting corners certainly
doesn't help you maintain the quality, and could even hurt
your speed of iterations.
STEVE MILLER: Great.
And I promise I won't say "best practices" again within
your organization.
[LAUGHTER]
STEVE MILLER: Thanks for that.
So, a couple people have asked along the way if we're going
to be [INAUDIBLE] session, and we are going to be recording
today's session, and it will be distributed to everybody.
So I just wanted to make sure, for those of you that joined a
little bit late, that you are aware of that.
So we're having some really spirited discussions today
about agile testing, and the panelists are kind enough to walk us through it based on their experiences.
One of the things that I think people struggle with is that
the agile team makeup is very different than more of the
waterfall type team makeup, because it's a very
collaborative event where everybody is working towards
the same set of goals with smaller feature sets, and
starting to do as much testing as they can
as the product develops.
So it always brings up the question that I hear a lot,
and that is how does the whole team work together so that
everybody's on the same page when it comes to quality and
the testing that needs to be done?
And I've even had the question, should we be
breaking it out so that we manage the development sprint
separate from the testing sprint, so that everybody's
working on their own velocities, on their own burn downs, that kind of thing.
So I wanted to get feedback from the panel about how
they've approached that in the past.
And we'll start with Bob.
BOB GALEN: No, don't break out development
and testing, please.
So, two mistakes, Steve.
One is, I knew you were in trouble when you said "best
practice." My heart went out to you.
I'm like, that's probably the wrong thing to suggest.
And no, we don't want to break things up.
I think agile is about teams, not about functional silos,
and hand offs, and things like that, and I think you need to
get that in the DNA of the team.
I'll start just here, some things come to mind.
What signs do you look for to tell if you have a team versus not?
So one thing, like an anti-pattern, is in an
inexperienced team, if you hear the term "why didn't test
find that," meaning the testers on an agile team,
you're not operating as a whole team.
You're operating in silos.
If you're in a sprint planning meeting, and developers are
throwing, let's say, 2 point estimates for a story, and
testers are throwing 20 points for testing a story, and the 2 points wins-- not that winning is the point--
then where is the right direction?
Because development is driving over tests.
We're not listening to each other, we're not digesting the
complexity of the story, then you're not operating as a
whole team.
And if developers don't willingly pitch in to test--
because it's the right thing to do from a throughput
perspective, from a team cohesion perspective, from a
getting things done, getting stuff out the
door point of view--
when it makes sense, without grumbling or whatnot, then you
don't have the whole team view.
Now, the counterpoint is look for those things, try to
establish them.
I think they're incredibly crucial for the team getting
throughput.
Velocity is not the point.
It's working together cohesively as a team,
delivering high value, and doing it well.
Delivering high quality as well.
And the best way to that is to operate as a team.
I think leadership plays a really strong part in that,
the conversations.
Every day, every minute, every conversation, you have an
opportunity to move from test to team, to move from testing
or development or BAs or project management to team
activity towards done.
And I think you get there in baby steps, and then really
amplify the team in sprint reviews, just in every
activity in the agile team.
And if you're dogged about it, I think you get there.
STEVE MILLER: Yeah, absolutely.
And that was more rhetorical than anything about
breaking that out.
But I think a lot of people have the view that they should
be breaking that out, and I wanted to make sure that we
illuminated that today, that that's not the best approach.
Matt, I wanted to follow up with you as well, and find out
what types of things that you've done in the past when
it comes to a team concept, how your teams work together
to try and uphold the quality and the testing that's
necessary within the agile team?
MATT HEUSSER: There's a couple of things that occur to me.
One is to focus on throughput.
By that I mean, where are the bottlenecks, and how can we
make the system flow faster?
As soon as you have a bottleneck, then every team's
going to be slowed down to the speed of that bottleneck.
Now, the classic response to that is to complain about
the bottleneck, right?
Those testers are slow!
[INAUDIBLE]
story, because those testers are slow.
Or those [INAUDIBLE]
are slow.
We've got [INAUDIBLE] years of business requirements, and the
stupid developers can't even get the darn thing.
We just need to get them to work on the software.
And when you think about that from a whole system approach,
complaining about it doesn't speed anything up.
And asking them to multitask by throwing more stuff in
their inbox doesn't speed anything up, either.
So what we need to do when we find the bottleneck is isolate
it, and lift it up, and figure out how we can make that, for lack of a better term, "resource" go faster.
So if test is slower, then we need to re-shift
responsibilities.
For example, we might have the developers-- because this is what developers do, they like to code-- writing the system-level test automation.
So the testers can define it, but the developers can write
it, or contribute to it, or developers can write hooks
into the code to make scripting easier.
Whatever we can do.
Shifting responsibilities around to even out those
bottlenecks will provide the maximum
throughput for the system.
And I mean, that's straight out of lean manufacturing
theory, and I think it's well grounded in
what we actually do.
It's not borrowed from some metaphor from some other way
of building stuff.
I find that when we do that, then when someone is a
bottleneck, we have extra bandwidth because we're
waiting for them.
We ask how can we help?
And culturally, that can get reinforced from
management on down.
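As one concrete, hypothetical shape for the "hooks" Matt mentions-- a seam a developer adds so testers can script hard-to-reach states-- here is a minimal Python sketch. The Session class and every name in it are invented for illustration:

    from datetime import datetime, timedelta

    class Session:
        # 'clock' is the hook: production code uses the real time,
        # while a test script injects whatever "now" it needs.
        def __init__(self, started_at, clock=datetime.now):
            self.started_at = started_at
            self._clock = clock

        def is_expired(self, ttl_minutes=30):
            return self._clock() - self.started_at > timedelta(minutes=ttl_minutes)

    if __name__ == "__main__":
        start = datetime(2012, 6, 1, 12, 0)
        late = lambda: datetime(2012, 6, 1, 12, 45)   # 45 minutes later
        early = lambda: datetime(2012, 6, 1, 12, 10)  # 10 minutes later
        # The tester scripts expiry instead of waiting 45 real minutes.
        assert Session(start, clock=late).is_expired()
        assert not Session(start, clock=early).is_expired()

A small seam like this is cheap for a developer to add and removes a whole class of slow, flaky waiting from the tester's scripts.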
STEVE MILLER: Thank you very much.
It's funny, somebody just chimed in that finding
bottlenecks is not about blaming.
And that's the whole thing about transparency in agile,
is that it's not a blame game.
It really is a team concept to make sure that everybody's on
the same page and moving to the same deliverables.
It's kind of funny, you guys can't see this.
For those of us organizing the webinar, Lisa's in the background, chatting over to me, going,
let me try this one.
Let me say something about this topic.
She is absolutely chomping at the bit, so
Lisa, it's your turn.
I want to hear your thoughts on this.
LISA CRISPIN: Well, I definitely agree with what
Matt and Bob said, but I think that the whole team approach
begins at the beginning.
And as Bob pointed out, you need to be a good leader.
And it's important to get the team together to say what is
our commitment to quality?
And that has to be meaningful.
We don't want to make excuses later on, to say, well, we had
this commitment, but we've run up against an obstacle, so
we're going to not meet our commitment.
In the case of my current team, we decided that our
commitment was to write code that we would be proud to take
home and put on our refrigerators or give to our
moms, to deliver the best quality software that we
possibly could.
And when we run into problems, something might be hard to
test, or we don't understand something, or we have an
automation issue, or whatever that problem is, we can't just
throw up our hands and say, well, that's too hard, and so
we're going to let that slide.
And also, because we have everybody on board the team--
Bob mentioned earlier, user experience, designers,
business analysts, DBAs, everybody involved in
delivering software, and all the domain knowledge,
investing time in learning our domain--
this allows us to push back on the business to help simplify
the stories.
And there's been some talk about velocity on the Twitter
trend for this, and I think there's way too
much focus on velocity.
But one way to improve your velocity--
the way my team does it is that we think we understand
the purpose of what the business people want, and of
the features that they want, and the business problems
they're trying to solve, and then we
think of simpler solutions.
And we can do that because we have such a variety of skills
on our team, on our whole team, and we can push back and
say, well, how about if we did this?
This does 90% of what you want, but it's half the cost.
Would this be OK?
STEVE MILLER: All right, thank you very much.
Now, one of the common questions that I'm seeing
coming through the Q&A panel right now really has to do with automation, and what are the best ways that you can use
automated tools, especially because you're doing some of
this continuous integration, that kind of thing?
So, it would be good to get the panel's ideas on what
they've done in the past from an automated perspective.
And Bob, if you wouldn't mind leading that discussion, it
would be great, because it's a common thing that I'm seeing
popping up in the Q&A panel.
BOB GALEN: So, I'll use a current team that I'm
interacting with as an example.
One, they're going all in on automation.
They're using a variety of tools,
mostly open source tools.
And what I find is they're in the middle tier, and they're
just writing automated tests, and when I looked under the
covers, they're writing automated tests over user
stories that are incredibly ill-defined, like they don't
have acceptance tests on the user stories, and they're just
very ambiguous.
So, they're not focusing on the Three Amigos, as Lisa
mentioned earlier.
They're focusing on writing test code as quickly as
humanly possible.
I want to segue into the other panelists, but I would like to
make the point, don't lead with automation.
Even though automation is vibrant and rich, and I always
invested in it, and it's always saved my butt at the
end of the day, et cetera, et cetera, et cetera, I don't
think leading with automation is the right thing.
I think leading with collaboration, leading with
well-defined, let's call them, user stories, leading with
discussion over simplification, a focus on
writing good acceptance tests and good user stories, and
getting that under your belt, particularly
if you're a new team.
And making sure that you have that not as a best practice,
but you're good at it, and it works for you and your team in
your cross-functional organization.
I would rather a team focus on that before they
start writing code.
And I just want to throw that out there.
And then start writing code based on those user stories,
based on that experience, based on the fact that you're
collaborating well as a team.
STEVE MILLER: All right, Matt.
Just for the record, I think I heard Bob say
"best practice." [LAUGHTER]
[INTERPOSING VOICES]
BOB GALEN: Oh, no, I'm ejected.
All right.
I'll hang up now, everyone.
I'll see you later.
STEVE MILLER: OK.
So, I want to turn this, then, over to--
MATT HEUSSER: I think you're allowed [INAUDIBLE].
STEVE MILLER: What's that?
Go ahead.
MATT HEUSSER: I think you're allowed to use the term
rhetorically the way Bob did.
I also think that Bob isn't on the Excel spreadsheet of
listed people who are publicly associated with
context-driven.
STEVE MILLER: OK, sounds good.
So Lisa, based on your experience, Bob was mentioning
that you don't want to lead with automation.
I mean, it's not the end all, be all that's going to solve
your testing problems when it comes to agile, but as Bob
mentioned, there's definitely a place for it.
It'll save your butt, as he said, in the end a lot of times.
I wanted to see if you could follow on with your thoughts
on automated testing tools, and also, we were talking a
little bit about user stories and defining those well, and
making sure that we have enough meat in there that
people can actually do the testing they need to do
without a lot of undue rework.
So, I'd like to get your comments on that.
LISA CRISPIN: I have a lot of observations to share on that,
but I wanted to check.
I think Matt might have wanted to chime in on this last point
we were talking about.
Did you have something, Matt?
OK, he's not coming on.
So, I think the key for success with automated test tools-- luckily, nowadays we have so many choices.
There's so many great drivers and frameworks for testing at
all levels of the application.
And I think the key, though, is for the whole team to
choose the tools together.
And even more key, the mistake my team made was in first
focusing on choosing the tool rather than focusing on what
do we want our tests to look like?
What's going to work for us in terms of the
design of the tests?
Are we going to use some kind of [INAUDIBLE] format?
Some kind of natural language?
Do we need a tabular format?
Things like that.
And then pick a tool that does that for us, or pick a
framework that does that for us.
So too many people start the wrong way around.
We're dazzled by all the tools out there, and we
want to pick one.
So first decide on your requirements for the tool, and
then pick the tool.
And then the whole team needs to research that together, and
do small experiments to pick the tool.
We recently needed a new framework and driver, because
we had some code that our existing GUI test tool could
not test, and we didn't want to release code to production
that doesn't have regression tests.
So we actually did some bake offs for different tools or
different frameworks.
The first framework, we invested several weeks into
it, and then decided it really wasn't right for us.
Some teams would be tempted to say, look, we did all this
work, let's just push ahead.
It's kind of the Vietnam syndrome.
But we just said, OK, that didn't work, we're going to
try something else.
And if that one doesn't work, we're going to try something
else, because there are other alternatives.
So, I think experimentation is really key, and getting the
whole team, all those skills involved, that you need to
make the automation successful.
STEVE MILLER: That's really interesting.
So what you're really suggesting is that you have an
iterative approach to tool selection.
I haven't heard anybody really mention it that way, but I
thought that was very interesting.
Matt, it looks like you've been chomping at the bit to
get into this conversation, as well.
What do you have about [INAUDIBLE]?
MATT HEUSSER: Sure, yeah.
A couple of things.
If you're a programmer, I think test-driven development with automated unit tests is the bee's knees.
It's a blast.
I think you end up going--
I think, and I don't have the empirical research behind
this-- but you get higher quality, and your time to market gets sped up once you know what you're doing.
Of course, you have to actually know a little bit
about testing theory beyond overly simplistic ideas.
But I think TDD's great at the developer level.
That's usually, at that point, not really a testing tool.
It's really more of a software engineering tool.
It helps you do things like drive out your design by using
your code as a client first and isolate your component.
When I talk about testing, I want to know
if it works or not.
Unit testing doesn't really tell me that at the high level, the customer level.
So that's a testing tool I almost always recommend if it
fits with the kind of
technologies the team is using.
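For readers who haven't seen the test-first loop Matt is endorsing, here is a minimal, hypothetical Python sketch; the slugify function and its cases are invented for illustration. In TDD the tests are written first and fail ("red"), and the code is then written to make them pass ("green"):

    import unittest

    class SlugifyTest(unittest.TestCase):
        # These tests exist before slugify does; they also drive its
        # design by using the code as a client first.
        def test_lowercases_and_hyphenates(self):
            self.assertEqual(slugify("Agile Testing"), "agile-testing")

        def test_strips_punctuation(self):
            self.assertEqual(slugify("Q&A, live!"), "qa-live")

    def slugify(title):
        # Written second, to make the tests above pass.
        cleaned = "".join(c for c in title.lower() if c.isalnum() or c == " ")
        return "-".join(cleaned.split())

    if __name__ == "__main__":
        unittest.main()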
When it comes to actually customer-facing tests, tests
that the customer would understand and want to look
at, I mentioned the kick off earlier.
We get the Three Amigos in the room, and we figure out what this thing is going to do, and we write down examples that the customer can understand about when an order comes in and it's more than 30 days late. We'll set it up so the order came in 33 days ago, and we can do that in Excel.
And you can hand that to a tester, and when he gets the
build, he can go through those examples and
say, yep, it passes.
And you can be done at that point, and you
could call that a win.
If you want to take that and put it into a tool, like
FitNesse or like Cucumber--
there are entire families of these tools--
and automate it, that's cool, too.
Right now, one of the teams I'm working with is using FitNesse, which, for a .NET or a Java framework end-to-end, lets you get underneath the GUI, go into the business logic, and expose those business logic tests.
And then your GUI test is just, if you don't have a
really complex GUI, OK, when I put these values in, do they
go across the wire and go to this
underlying business logic?
I already tested it. And when the values come back, are they
the right values that come back?
And it can speed up GUI testing, and then we've got
this regression test that we can run that
runs under the GUI.
I think that's a great tool.
So I totally agree with Bob in that it can be a mistake to
overly focus on tools, but you asked the question,
so there's my answer.
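Here is a rough sketch of that under-the-GUI idea, using Matt's order-lateness example, in plain Python rather than FitNesse; the days_late function and its 30-day window are assumptions for illustration:

    from datetime import date, timedelta

    # Business-logic layer, tested directly -- no GUI involved.
    def days_late(placed_on, today, allowed_days=30):
        """Days past the allowed window; 0 or negative means on time."""
        return (today - placed_on).days - allowed_days

    if __name__ == "__main__":
        today = date(2012, 6, 1)
        # Customer's example: the order came in 33 days ago, so it is
        # 3 days past the 30-day window.
        assert days_late(today - timedelta(days=33), today) == 3
        # On-time example: 30 days old is still within the window.
        assert days_late(today - timedelta(days=30), today) == 0
        print("Business-logic examples pass.")

With this layer covered, the GUI test only has to confirm that values travel across the wire to and from it, which is exactly the split Matt describes.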
STEVE MILLER: OK.
And Matt, I'm going to jump to the people's questions that
are part of the webinar today, but it'd be really cool if
you'd share with the team what you told us yesterday when we
were preparing for this, when somebody at one of the
conferences that you were attending asked you what your
favorite tool was.
I just thought that was a really cool story that you
told yesterday.
If you wouldn't mind sharing that with the webinar
participants, that would be great.
MATT HEUSSER: I wasn't prepared, but I'll try.
That was at a conference, I think, in March this year,
and this company was hiring contractors and consultants
part time, remote kind of stuff.
The kind of work that I would be really interested in.
And I went over to them between sessions and
introduced myself.
And the first question they asked was what tools do you use?
And I kind of knew the answers that they were looking for.
They were Selenium, Cucumber, QuickTest Pro, TestComplete.
And in context, I can see how that's a reasonable question.
But as the first question to ask me, I think that they
actually said, "We're a best practices shop," as I was
trying to figure out my answer.
So I said something like, well, my brain, pencil, piece
of paper, just kind of looked at them.
And I think that was like the quickest litmus-test answer that I could give.
And maybe a litmus test for them, because they said,
"Thank you," and kind of walked away.
STEVE MILLER: All right.
Appreciate that.
That was a funny story that we were talking about yesterday.
And now, what I want to do is I want to go
ahead and turn it over.
We have about 15 minutes left, so rather than go through some
of these questions that we had ahead of time, I think it
would be a really good thing to go ahead and turn it over
to the webinar participants.
So definitely make sure that you utilize the Q&A panel over
in GoToWebinar, because we're going to answer a few of the
questions that come through.
I'll just spot some that I think would be interesting to
the group, and I'll go ahead and present that.
Anybody on the panel that would like to take these
questions, you could either chat to me to let me know that
you want to take it, or you can just speak up.
However you want to do that, because that is a best
practice here.
All right.
So, what we're going to do now is Mary had asked the question
a little earlier as we were going through some of this.
She said that her team works on providing complex business
objects reports, and the biggest challenge that they're facing today is data validation.
And rather than functional, security, or performance
testing, the bane of their existence right now has been
data validation, and Mary wanted to know if there's any
suggestions on how to approach that type of testing.
We're not going to ask for best practices.
We're just going to ask for approaches.
And Matt, it looks like you said that you've got an
answer, so go ahead.
MATT HEUSSER: Yeah, so I want to make sure I understand the
question correctly, because I'm assuming that the database
has the right data in it, but we are applying some business
rules to that data.
We're giving some sums, and some averages, and some
transformations in the business objects platform,
which is really code, right?
It's really complex SQL-like stuff, or maybe we're using
some dragging and dropping to create this query report thing.
But it's code, and we want to know that the answers that it
produces are correct.
I think that's the question, right?
STEVE MILLER: Yep.
MATT HEUSSER: And if that's the question, the first piece
of advice I usually give is code it twice.
So you give the same requirements to two different
people, and you have them both implement this report, and
then you run the reports side by side, and you make sure
that they're the same.
Anywhere they're different is interesting.
Doesn't mean it's wrong, but it's interesting.
One of those is obviously not right.
And sometimes when these reports get very, very, very
complex, you can do a little bit of simplification, like
just give me a count of the number of rows that come back,
or give me some checksums, to make the second person's job a little bit easier.
Harry Robinson has done a lot of writing on model-based
testing approaches, and inside of a database, where all you're trying to do is simulate a query, is, I think, one of the cheapest places you can do a model-based approach.
So that's where I'd start.
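As a minimal sketch of that "code it twice" idea in Python: two independent implementations of the same report requirement run side by side, with a cheap row-count cross-check before the full comparison. The sample data and the total-per-customer rule are invented stand-ins for a real report:

    from collections import defaultdict

    ORDERS = [("alice", 120.0), ("bob", 80.0), ("alice", 50.0), ("carol", 99.5)]

    def report_v1(orders):
        """First implementation: total spend per customer."""
        totals = defaultdict(float)
        for customer, amount in orders:
            totals[customer] += amount
        return dict(totals)

    def report_v2(orders):
        """Independent second implementation of the same requirement."""
        return {c: sum(a for cust, a in orders if cust == c)
                for c, _ in orders}

    if __name__ == "__main__":
        r1, r2 = report_v1(ORDERS), report_v2(ORDERS)
        assert len(r1) == len(r2), "row counts differ -- interesting"
        for customer in sorted(set(r1) | set(r2)):
            if r1.get(customer) != r2.get(customer):
                print("interesting:", customer, r1.get(customer), r2.get(customer))
        print("reports compared")

Any difference the comparison flags is, as Matt says, interesting rather than automatically wrong, but one of the two implementations must be off.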
STEVE MILLER: OK.
And are there any templates, anything like that, that might
help with data validation?
I mean, we all know that if you're dealing with dates
where we have to worry about the bounds of the dates--
whether there's 30, 31 days in the month, whether it's a leap
year, that kind of thing--
when we're dealing with numerics, there are always standard tests that we want to make sure that we execute on
numeric fields, depending on whether they're currency or
integers, or whatever.
Anybody else on the panel who may have had experience with that kind of data validation and can follow through on some of those things that might have been helpful?
BOB GALEN: I don't have an answer there, but I'd like to
flip the question around a little bit.
We're talking about testing techniques, and those are all
fair, but I want to quickly add that I would turn this
around and make this the team's problem to solve.
I would make it part of the [INAUDIBLE] criteria for the
stories that we have a data validation
challenge or a problem.
It has different levels of complexity depending on the
features and the functionality the team
is trying to implement.
I would make the business aware that data validation is
hard, and it's going to cost the team more time,
potentially, to test that well.
And on a per story or per use case basis, or whatever, the
team has [INAUDIBLE]
the testers.
The question almost sounds like it's a test problem, so
we need test techniques.
And it is a testing problem, but make
it the team's problem.
And testers can contribute to that, but there's a heck of a
lot of design for test that the developers can do,
probably, to make data validation easier.
So I just wanted to throw that out there, of flipping it into
the team would be one of my reactions, as well.
MATT HEUSSER: If I could just add a little bit
more on top of that.
So, one of the ways I've seen that manifest over time-- for
instance, with something like business objects--
is that you start to get reusable libraries, where you
can say, give me the people that had eligibility between these dates, in the insurance domain, for example. Or tell me if this particular user ID is eligible on this date.
And then instead of trying to recreate that in
SQL, which is, ahh!
Right?
You get these reusable functions over time.
And you can get that when you have the developers and the
business people in the room when you're planning how
you're going to test it.
If you do testing in isolation, the tester's
probably not going to say, hey, maybe I could write a reusable framework [INAUDIBLE]
Testers don't say that, developers do.
So I totally agree that the whole team can help you get a
better long term solution.
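One hypothetical shape such a reusable library could take, sketched in Python; the record layout and every name here are invented stand-ins for the insurance-domain example:

    from datetime import date

    # Hypothetical eligibility records: (user_id, start, end).
    ELIGIBILITY = [
        ("u1", date(2012, 1, 1), date(2012, 6, 30)),
        ("u2", date(2012, 3, 1), date(2012, 12, 31)),
    ]

    def is_eligible(user_id, on_date, records=ELIGIBILITY):
        """Reusable helper, so tests never re-create the raw SQL."""
        return any(uid == user_id and start <= on_date <= end
                   for uid, start, end in records)

    def eligible_users(window_start, window_end, records=ELIGIBILITY):
        """Everyone eligible at some point within the window."""
        return sorted({uid for uid, start, end in records
                       if start <= window_end and end >= window_start})

    if __name__ == "__main__":
        assert is_eligible("u1", date(2012, 5, 1))
        assert not is_eligible("u1", date(2012, 7, 1))
        assert eligible_users(date(2012, 7, 1), date(2012, 8, 1)) == ["u2"]

Once helpers like these exist, each new report test calls them instead of rebuilding the query logic, which is the long-term payoff of making it a team problem.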
STEVE MILLER: Absolutely.
And I was alluding to that earlier on that follow up,
too, so thanks for bringing that up.
And that really dovetails into another question
that Danny has, here.
He wanted to know what the role of agile testing is from the developer role versus the QA role, because it seems like there's some overlap there.
And how do you make sure that the handoff is good?
Because with automated testing, it's almost like a
lot of the automated testers are almost programmers, and if
you do test-driven development, a lot of that
takes those development skills.
So I guess Danny's question is, how do we differentiate
between the developer and the QA role, what the overlap is,
and how do we manage the handoff?
So, Lisa is absolutely chomping at the bit for this,
so go ahead, Lisa.
LISA CRISPIN: Thanks.
I heard Jeff Patton speak recently at the Mile High
Agile Conference, and he likes the term "positions" rather
than "roles," and he likened it to a sports team.
For example, if you're playing football,
and you're the kicker.
And you just kicked off, and the runner from the other team
is about to go by you, because everybody else missed him, you
don't stop and say, well, I'm the kicker, it's not my role
to tackle this person.
So, we all have to wear different hats at different
times, and so, we shouldn't get too hung up on whose role
is it to do what.
We all have to jump in and make that a team
responsibility.
And when I hear the word "handoff," I cringe a little
bit, because hopefully we're collaborating.
The testers are helping the developers, right?
The acceptance test up front and the [INAUDIBLE]
development, and working in small iterations of writing
tests, writing code, doing testing, doing the exploratory
testing, writing more tests.
And that should be something that just goes on in tiny,
little iterations until the story's done and all the
activities are finished.
And on my team, because it's such a testing-intensive
application that we have four developers and three testers,
the developers often will take on testing task cards, because
we've got to get that story done.
Just last sprint, we were very busy with testing, and one of
the developers did all the manual testing
for one of our stories.
And we had a mind map of test cases to carry out,
which he followed, and he also did a really great job of
exploratory testing, because he's had a lot of experience
doing that in the past eight years.
So, it's again, like we've been saying, making it a team
problem, and we've got this testing to do.
And in terms of overlap with unit tests, when people are talking about automated tests, they say, well, we don't want to repeat the testing that the unit tests did.
And I don't really find that to be much of a problem.
Unit testing is so different from testing at higher levels.
If there's a tiny bit of overlap, it's not necessarily
a bad thing.
STEVE MILLER: Great.
I'm laughing here because those of you that are
attending the webinar can't really see the behind the
scenes things, but Matt had mentioned that, most likely,
Lisa would say that you don't handoff, you collaborate.
And no sooner had he typed that than it was coming out of her mouth.
So everybody over here was dying laughing about that.
So, I think we have about six more minutes left
in the webinar today.
We had a question, I think, that's going to be asked by a
lot of different people, so I wanted to make sure we got to
this question before we closed out today.
And the question comes from Ross.
And he says, do you have specific recommendations for
remote teams?
Because a lot of people either have teams that are overseas,
or they may even be in the US, but they are geographically
dispersed from where you're at.
So do we have any recommendations from the panel as to how we can better collaborate between the team
members and how we can work better
geographically dispersed?
So anybody that wants to take that, feel free to jump in.
It sounds like Matt wants to take that.
MATT HEUSSER: Yes, thank you.
So, the last time I had one of those day job things, I worked for a company called Socialtext.
We made software to enable massive collaboration and
communication, and the entire engineering staff was
completely, massively, physically distributed, by
which I mean everybody worked wherever there was
internet and power.
So we had to figure out how to make remote work work, because
that's how we were structured.
Now, the company mostly hired out of people who were
contributors to open source software, and that's how a lot
of open source software is built, because there are no
physical offices.
So there were a few techniques we used from there, like
continuous IRC, and we Skyped a lot, and we used GNU Screen
as a screen sharing tool.
So if you have headphones on and you're in Skype, and you
have something like GNU Screen, which is ASCII-based,
there's no GUI, it really feels like you're pair programming, like you're right with the person.
But everybody was remote, right?
So it wasn't like there was one guy off in a corner who
lived two hours away and only came in every couple of weeks,
and everybody else was on site.
Everybody was remote.
Everybody remote, I think, works.
Everybody on site can work.
It's when you try to just take a couple of people and you put
them in a corner that you run into a problem, because they
don't see what's on the whiteboard.
They don't pick up the back chatter in the hallway.
They don't pick up the back chatter by the water cooler.
And that's a problem.
So what most people say when they ask the question is, our
company's growing by acquisition, and we have an
office in Belarus, Russia, and we have an office in Ireland,
we have an office in Boston, we have an office in Portland.
And for that, I highly recommend trying to have
integrated product development teams in each space that
actually [INAUDIBLE]. They know that they own it.
They can push to production by themselves.
And then you manage the seams of the components and how they
interact, not the devs are in one place, and the testers are
somewhere else, and the business analysts are
somewhere else.
I haven't particularly, really, genuinely seen that
work well yet.
Also, one more thing I'd add is core office hours where
everybody can talk together.
Four hours a day, maybe 10:00 to 2:00 in your [INAUDIBLE]
time zone, everybody's there.
You've got to have a decent time when people can chat, and
some high-bandwidth channel where they can do it.
Some kind of instant feedback mechanism, not I'm going to
slide a business requirement under the door and see you in
six months.
STEVE MILLER: All right.
Thank you very much, Matt.
So, we're nearing the top of the hour here.
I wanted to just remind everybody that you can join us
after the event on Twitter.
#agiletesting is the hashtag that you'll need.
Also, make sure you keep an eye on your inbox
for upcoming webinars.
This is just a continuing series of webinars that we're
doing here at SmartBear, and we're really doing all of
these to help empower you and help you do your job better.
And we certainly appreciate you joining us today.
We know that everybody's time is valuable, and we do
appreciate everybody joining today.
Now, we got tons of questions today, a lot of them that I
couldn't even address here, because they're so numerous.
We do plan to package these questions up and provide
answers to these over the next month, so keep an eye out for that by going out to smartbear.com.
We'll be providing that.
Now, also remember that we also recorded today's session,
so we'll also be sending to you the recorded session.
Feel free to pass that along to others that weren't able to
attend today, and we certainly appreciate you joining today.
And a special thanks to the panelists here: Bob, Lisa, and Matt.
We had some very spirited discussion today, and you guys
are great presenters, and we do appreciate your time.