Alan and Brent talk testing…

Brent Jensen and I found some time together recently to talk about testing. For some reason, we decided to record it. Worse yet, we decided to share it! I suppose we’ll keep this up until we either run out of things to say (unlikely), or until the internet tells us to shut up (much more likely).

But for now, here’s Alan and Brent talk about Testing.

Subscribe to the ABTesting Podcast!

Subscribe via RSS
Subscribe via iTunes

Swiss Testing Day 2014

As you may have noticed, my blogging has slowed. I’ve been navigating and ramping up on a new team and helping the team shift roles (all fun stuff). I’ve also been figuring out how to work in a big org (our team is “small”, but it’s part of one of Microsoft’s “huge” orgs – and there are some learning pains that go with that).

But – I did take a few days off this week to head to Zurich and give a talk at Swiss Testing Day. I attended the conference three years ago (where I was impressed with the turnout, passion, and conference organization), and I’m happy to say that it’s still a great conference. I met some great people (some of whom I had previously only met virtually on Twitter), and had a great time catching up with Adrian and the rest of the Swiss-Q gang (the company that puts on Swiss Testing Day).

I talked (sort of) about testing Xbox. I talked a bit about Xbox, and mostly about what it takes to build and test the Xbox. I stole a third or so from my STAR West keynote, and was able to use a few more examples now that the Xbox One has shipped (ironically, the Xbox One is available in every country surrounding Switzerland, but not in Switzerland). Of note is that now that I’ve retired from the Xbox team (or moved on, at least), I expect that I’m done giving talks about Xbox (although I expect I’ll use examples from the Xbox team in at least one of my sessions at STAR East).

I owe some follow ups on my Principles post, and I promise to get to those soon. I’ll define “soon” later.

Some Principles

I’ve been thinking a lot less about testing activities lately, and much, much more about how we make higher quality software in general. The theme is evident from my last several blog posts, but I’m still figuring out exactly what that means for me. What it boils down to is a few principles that reflect how I approach making great software.

  1. I believe in teams made up of generalizing specialists – I dislike the notion of strong walls between software disciplines (and in a perfect world, I wouldn’t have separate engineering disciplines).
  2. It is inefficient (and wasteful) to have a separate team to conduct confirmatory testing (e.g. “checks” as many like to call them). This responsibility should lie with the author of the functionality.
  3. The (largely untapped) key to improving software quality comes from analysis and investigation of data (usage patterns, reliability trends, error path execution, etc.).

I haven’t written much about point #3 – that will come soon. Software teams have wasted huge amounts of time and money equating test cases and test pass rates to software quality, and have ignored trying to figure out if the software is actually useful.
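To make #3 a little more concrete, here’s a minimal sketch of the kind of analysis I have in mind (the telemetry, numbers, and function name are all invented for illustration – this isn’t from any real pipeline): given per-session usage data, compute a crash-free session rate per week – a simple reliability trend that says far more about whether the software actually works for people than a test pass rate does.

```python
from collections import defaultdict

def crash_free_rate_by_week(sessions):
    """Given (week, crashed) pairs, return the crash-free session
    rate per week -- a simple reliability trend."""
    totals = defaultdict(int)
    crashes = defaultdict(int)
    for week, crashed in sessions:
        totals[week] += 1
        if crashed:
            crashes[week] += 1
    return {week: 1 - crashes[week] / totals[week] for week in totals}

# Hypothetical telemetry: (week number, did the session crash?)
sessions = [(1, False), (1, True), (1, False), (1, False),
            (2, False), (2, False), (2, False), (2, True),
            (2, False), (2, False), (2, False), (2, False)]

print(crash_free_rate_by_week(sessions))  # {1: 0.75, 2: 0.875}
```

Trivial on purpose – the interesting work is in deciding which signals to collect and what the trend means, not in the arithmetic.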

We can do better.


Riffing on the Quadrants

In 2003, Brian Marick introduced the concept of “Agile Quadrants” to describe the scope of testing[1] (later expanded on by Lisa Crispin and Janet Gregory[2]). Several people (including me) have expanded and elaborated on the quadrants to describe the scope and activities of testing.

Here’s a version of the testing quadrants.


One challenge I’ve seen called out before is that there are varying definitions of what a unit actually is. Many unit tests in existence today are actually functional tests, and some may even be better defined as acceptance tests.
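As a (contrived, names-invented) illustration of that blur: both of the tests below could easily live in a “unit test” suite, but only the first is a unit test in the strict sense – the second touches the file system, which by most quadrant definitions makes it a small functional test.

```python
import os
import tempfile

def parse_score(line):
    """Pure logic: turn a 'name: score' line into (name, score)."""
    name, _, score = line.partition(":")
    return name.strip(), int(score)

def load_scores(path):
    """Logic plus I/O: read and parse a whole score file."""
    with open(path) as f:
        return [parse_score(line) for line in f if line.strip()]

def test_parse_score():
    # A unit test in the strict sense: one function, no I/O, no collaborators.
    assert parse_score("alan: 42") == ("alan", 42)

def test_load_scores():
    # Frequently labeled a "unit test", but it exercises the file system --
    # by the quadrant definitions it's really a (small) functional test.
    fd, path = tempfile.mkstemp()
    try:
        with os.fdopen(fd, "w") as f:
            f.write("alan: 42\nbrent: 7\n")
        assert load_scores(path) == [("alan", 42), ("brent", 7)]
    finally:
        os.remove(path)

test_parse_score()
test_load_scores()
```

Neither test is wrong – the point is only that what we call them (and where we draw the quadrant lines) is slippery.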

I’ve noticed that we see the same behavior on the right side of the quadrants – items in Q4 sometimes slide up to Q3, and items in Q3 may slide down to Q4, depending on how the data (and definitions) are used. But note that while items may move up or down depending on definition or application, we don’t see activities move from right to left or left to right.

If we look a little closer, there’s a clear equivalence class between activities in Q1 and Q2 vs. activities in Q3 and Q4. The activities on the left are confirmatory tests, and the results (whether they’re pass/fail, or an analysis failure) tell you exactly what to do – while the activities on the right require analysis or investigation (noted indirectly here). Furthermore, almost everything on the right side is about (or can be about) data! As test organizations move towards more data analysis and investigation, this is important. Also note that in many contexts (especially, but not limited to, web services), the activities in the right half of the quadrants can all be done in production.


Now, this is just a riff, so I’m not entirely sure where I’m going (or if there is a huge hole in my logic), but I’m beginning to see both a strong correlation among the activities on each side of the quadrants, and a distinction in skill sets between the two sides. Deep data analysis is quite a bit different from feature development (and from unit and functional testing skills as well).

I remain a fan and supporter of generalizing specialists, and I’m seeing a much bigger need for data specialists and people who can investigate through ambiguity to help build great teams (and yes, we’ll need people that can smoke jump into activities in any of these quadrants as well).

More riffing later as I continue to explore.



Australia 2013-2014

No testing related content in this post – just a quick trip report of our family vacation to Australia that I can point people to as necessary.

Sydney (part 1)

After a long flight, we landed in Sydney, where we immediately set out to explore a bit (I’ve found that walking in the sun is a great way to beat jet lag). We ate dinner at Darling Harbour, and got to bed at a reasonable time so we could get up for our flight the next day.

The Outback

Our first (extended) stop was at Ayers Rock (Uluru). Thanks to the time change, we had no trouble getting up early to watch the sun rise and light up Uluru. After taking a few pictures, we drove a bit closer (we rented a car for this portion of the trip) so we could do the hike around the base.

We got an early start, but the heat was well over 100°F by the time we finished around 10am. Still, we were rewarded with some once-in-a-lifetime views and scenes.

More Uluru

The next day we went on a similar exploration of the area known as Kata Tjuta. Here, we saw much more wildlife (including our first views of wild kangaroos). Again, we went out early to avoid the heat, and spent the hottest part of the day indoors. The next day (or maybe the day after), we checked out and drove (sort of) towards Alice Springs.

Alice Springs

We took a detour on the way to Alice Springs to visit Kings Canyon. With the kids (and given that we had a long day of driving ahead of us), we just hiked to the first lookout – but if I ever, for some reason, end up at Kings Canyon again, I would love to do the rim walk – the picture at the right shows the “entrance” to the rim walk – a massive carved staircase that takes hikers up to “rim level”.

On our way out of the park, I took a little grief from my travel companions for an unavoidable murder of a large (~2 foot) lizard who happened to be basking in the middle of the road as I drove. Fortunately, I redeemed myself a few hours later when I managed to miss a kangaroo who bounced across the road in front of me.

We arrived safe and sound in Alice Springs, and spent a few days relaxing and exploring a bit. We found our way to Desert Park, a zoo of sorts, filled with animals and plants native to the central Australian desert. It was there that we were able to meet the guy on the left. We drove directly from Desert Park to the Alice Springs airport.

Cairns


We landed in Cairns early evening on December 24th, and had a small in-room Christmas celebration before grabbing some food and spending most of the rest of the day in the pool.

We spent the next two days snorkeling the Great Barrier Reef, and had an amazing time. The second day was only the boy and I, but we were able to track down both a ray and a shark (a two-foot long non-man-eating shark, but it was exciting anyway). After another day (or two – who remembers?), we were off to Brisbane.

Brisbane


Brisbane was essentially our gateway city to the beach, but we made sure we had a full day to soak up what we could. The Queensland Museum had an exhibit by Cai Guo-Qiang that was quite interesting. Pictures to the right and below (and more info here). We also spent a big chunk of time at the South Bank park swimming and lounging – and enjoying some great restaurants.

Museum 2

Gold Coast / Sunshine Coast

Next up was five or so days each on the Gold Coast and Sunshine Coast (both fairly short train rides from Brisbane). The kids and I spent three days at the Dreamworld and WhiteWater World amusement parks (where I somehow was talked into riding this monstrosity).

Kill me now

Other than that, we mostly played on the beach (or in the pool). I finished up reading a few books I’d been working through, and downloaded and read half a dozen novels as well.

Melbourne


I went to Melbourne last year as a quick side trip after giving a few presentations in Sydney. This time, with family in tow, I didn’t hit quite as many restaurants as last time, but we were able to tack on the incredible Melbourne Zoo, and the aquarium. I especially enjoyed hanging out with the lemurs, but I was impressed with just about every animal exhibit (note the lack of basketball in any of these pictures – that’s an inside joke for PK and KK).

The aquarium was equally impressive – especially for the shark lover in our family. I’ll let the picture tell the story here.

Ba-dum   Ba-dum...

Sydney (part 2)

We spent the final four days of our trip back in Sydney. This time we stayed at the Holiday Inn (located right between the Opera House and the Sydney Harbour Bridge – with a rooftop pool that views both!). We forced the kids to walk across the bridge (and back), and explored the aquarium (almost déjà vu, but there were quite a few different fish and sharks in the Sydney aquarium).

More bridge shots

We saw The Illusionists at the Opera House (in the concert hall), and the kids and I went to see a play in the downstairs playhouse the next day. We tried to see as much as we could before we left, but I’m sure we missed some must-see places (both in Sydney, and in our other adventures).

At this point, our body clocks are all back in the correct time zone, and we’re all slowly getting back into the rhythm of our west coast lives. It was a fantastic trip, and one we will remember for a long, long time.


I’m back at the job after a long break (including a month vacationing in Australia – trip report coming). I spent a chunk of time after the Xbox One ship figuring out what the next step in my software career was going to be.

In the days up to the Xbox One launch, I hinted on twitter that after One was out the door, I was on to something else. At the time, I had no idea what I’d be doing – while development and improvement on Xbox One will continue for years, there will never be another opportunity to ship the initial One hardware and operating systems. I still have a ton of passion for the consumer space, and I fell in love with the Xbox as a living room entertainment device over the past two years, but I knew I was ready to do something new.

I’ve been at Microsoft for nineteen years now (yes, I’m old), and despite that, I try to keep an eye on what’s going on in the industry and consider non-MS opportunities if they’re the right choice at the right time. The career advice I frequently give (and adamantly follow) is to follow the Three P’s – the Person, the People, and the Product. This means that you should work for a person you respect and will enjoy working with; with people (a team) who work the way you like; and on a product (or technology) you are passionate about. Surprisingly, I found all three in a company outside of Microsoft – along with all of the goodies that would make the move a good one for my family…but in the end, the relocation wasn’t the best move for me right now, so I ended up passing (and hopefully didn’t burn bridges in the process).

In the end, I followed my Three P’s advice and chose to follow my Xbox manager to a new team, working with people I know and enjoy working with, and on a really cool (and unannounced) technology. I’m really happy with the choice I made and I’m excited to get back in the groove with a new team and new project.

Oh – and Happy New Year.

Year End Clearance

I’m on vacation, and this post is auto-generated. See, you can trust automation sometimes…

Another year gone by, and another few dozen posts. Here are the top viewed posts of the last year (note – not all of these were written last year – this is just what people read the most last year).


Thanks to everyone who reads, comments, or reacts on Twitter to my rants and ramblings. Plenty of big announcements coming up, as well as more thoughts on what I see happening with testing in the future.

Happy New Year.

Death and Testing

I’m heading off for a long vacation today, so this is likely my last post of the year. It’s been a crazy year, and I thought I’d end it with something that a lot of my recent posts have been leading to (e.g. this post on tearing down the walls).

Some of you will hate this. I’m not sure if it’s because you’re afraid for your job, or because it’s so far out of your world that it doesn’t make sense. For you, let me tell you about the audience who will read this and say, “ho-hum – tell me something I don’t already know”.

Testing as we know it is changing – and the changes are big (for some folks, at least). Testing as an activity will always exist. But I see test as a role going away – and it can’t disappear fast enough.

Test Is Dead (or is it)?

A few years ago, my friend James Whittaker angered testers worldwide with his talks on Test is Dead. James’s premise was that for a whole lot of software out there, it’s more cost-efficient to get data and issues from customers actually using the product than from a test team pounding on the product. Unfortunately, way too many testers failed to see the context and big picture of the main point, and screamed far and wide about how quality would suffer without testing.

But James was far more correct than people give him credit for.

Hiring testers to pound quality into a product after it’s been developed is a waste of money. The more I think about it, the more I realize that Test Teams are an artifact of older, predictive (staged / waterfall-ish) software development models. The notion that one team (or part of a team) is responsible for writing code, and another part of the team is responsible for making sure it works as expected seems wasteful to me (even after spending the majority of my career doing just this).

Test teams are dying. You may disagree, but I encourage you to take a critical look at whether you believe a non-integrated test team is the most cost efficient way to evaluate a product. Yes, I know there are companies formed entirely around evaluating already-developed products – but I believe that the days of testing-quality-in (the equivalent of the people who clean up after animals in a parade) are ending (or ending, at least, for the quality software products of the future).

Testing is NOT Dead

Before you skewer me with your protests (and some of you will, despite my best efforts to explain), let me reiterate. Testing is not going away. The testing activity will thrive and improve and evolve. It’s just time to think about getting rid of the old ways of thinking about who does the testing and where it happens.

Plain and simple, we (the software industry) have been wasting money for years letting testers play safety-net for lazy developers. It’s much more cost effective and efficient for developers to write unit tests and some or all functional level tests – and maybe more. Other developers on the team (many of them former testers) should focus entirely on “ilities” such as reliability, performance, security, etc. as well as evaluating end-to-end scenarios (epics) spanning multiple features. (related paper from me here – it’s a few years old, but I think it foreshadowed this change nicely).
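To sketch the “ilities” half of that (everything here – the names, the budget, the feature itself – is invented for illustration, not anyone’s real process): a performance specialist embedded on the team might own a lightweight latency check that runs right alongside the developers’ confirmatory tests, rather than a separate team re-testing the feature after the fact.

```python
import time

def percentile(values, p):
    """The 'feature': a naive percentile (sorted-index method)."""
    ordered = sorted(values)
    idx = min(len(ordered) - 1, int(p / 100 * len(ordered)))
    return ordered[idx]

def _time_once(fn, args):
    """Time a single call to fn(*args) in seconds."""
    start = time.perf_counter()
    fn(*args)
    return time.perf_counter() - start

def within_latency_budget(fn, args, budget_seconds, runs=5):
    """A tiny 'ility' check: does fn stay within a latency budget?
    Takes the best of several runs to dampen machine noise."""
    return min(_time_once(fn, args) for _ in range(runs)) <= budget_seconds

data = list(range(100_000))
assert percentile(data, 95) == 95_000                 # confirmatory check
assert within_latency_budget(percentile, (data, 95),  # 'ility' check
                             budget_seconds=1.0)
```

Real performance work is far more involved (and timing checks are notoriously noisy, hence the generous budget and best-of-N runs) – the point is only that these checks live with the team, not on a separate one.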

There are other activities – like data collection and analysis (feel free to insert the buzzword “Data Science” here if you wish) and other tasks we need software teams to do – but we don’t need separate teams for those tasks. We need people to do those things, but those specialists can work anywhere. Why is it that I’ve seen just as many performance or reliability teams reporting to development managers as I have reporting to test managers? Some of my colleagues talk about the need for testers to “transition” to data science or other roles. In reality, it may be a transition, or it may be different people. It doesn’t matter (what remains true is that we will have an increasing need for people who can analyze data). I think one of the only reasons we don’t talk about having a specific “Data Science Team” is that we already have enough disciplines in software engineering, so we default to shoving analysis roles into whichever existing discipline seems the best fit.

Whole Team Approach?

We need integrated teams – not separate teams. We need teams of generalizing specialists to make great software. The disciplines and teams get in the way of efficient creation of quality software. Some say that the programmer/test relationship provides checks and balances – but I see no reason why those discussions and positive conflicts can’t happen on a team where everyone works tightly together.

I’ve seen a few mentions of the “whole team approach” to software creation, but that term (to me) doesn’t adequately describe how great teams (can) work. In a team full of generalists, everybody can do everything, but nobody does anything particularly well. That doesn’t work if you’re trying to make quality software. In a team of generalizing specialists, everyone does something (or a few things) really well – but the team members are adaptable when necessary. Teams need specialists in design, user interface, performance, reliability, deployment, internal infrastructure, data analysis, and scads of other areas. Teams also need people who specialize in testing – specialties that include end-to-end evaluation and risk analysis, as well as coaching developers writing unit and functional tests.

But the separate team for test is dead. It’s a waste, and we need to stop thinking about putting our specialists on separate teams.

Testing is a Specialty

I’m looking at a twitter feed full of comments from testers worrying about what this means for their careers. A good tester, as part of a team of generalizing specialists, is worth every bit as much salary and value as any other specialist on the team. Test “specialists” who can’t provide this level of value, however, probably should worry about their careers. Test may be dead, but good testing will live and be valued for a long, long time. But we need to stop thinking that we need test teams to pound quality into a product, and figure out how to fully integrate great test minds into great software teams.

There’s a subtlety here that goes beyond semantics. Jobs (most of them, at least) aren’t at stake here. The great testers of today will be great test specialists on software teams tomorrow. It doesn’t matter if they code or not, as long as they provide value and contribute. I’ll repeat that, because I can almost guarantee a comment about how stupid I am for proposing a world where testers need to code. Test specialists won’t (necessarily) need to code. Many of the “test automators” today will write features, as well as functional tests (using their mad automation skillz) – and they’ll coach the programmers on how to do the same – leveraging the specialties of each other.

A glance through any testing community will show dozens of definitions of what testing is, and what testers do. It’s too much. It doesn’t matter. We need to stop building organizations where one team writes code and another team tests it. Testing needs to happen all of the time, and at the same time as development. For this to happen, we need to get testers out of their test teams, and fully integrated into software teams.

A Better Future

Today, testing articles, posts, and tweets are filled with whining (or mere complaints) about the sorry plight of testing, the lack of respect for testing, the difficulty of measuring testing, and other worries of the testing community I’ve been hearing for the past 20 years.

But a new world where testing is merely a specialty within the software team eliminates these worries. There’s no more us versus them. We have a team, and we each supply different specialties that contribute to the team’s success.

Preemptive post script

I’m leaving (on a jet plane), and I half-rushed this post out. I half (or full) rush most of my posts, but this one is important. I’ll try to respond to comments (and hate mail) when I land, but I expect I’ll have follow ups and re-writes coming. I’m positive there will be much speculation on what I didn’t mention, so I’ll deal with that soon as well.

More will come.

HWTSAM–Five Years Later

Five years ago this week, How We Test Software at Microsoft hit the shelves. The book has done well – both in sales, and in telling the story of how Microsoft approaches software testing.

An unfortunate aspect of writing a book like this is that after five years, most of the book is obsolete. Sure, there are some nuggets that stand the test of time, but so much has changed – and is changing, that it’s hard for me to recommend much of it as practicum. I don’t know if a new version is warranted – or necessary, but you never know what will happen.

To be clear, I’m still proud of the book, and think it has a lot of great ideas that will help many companies – it’s just not an accurate reflection of how Microsoft tests software. Here’s a section by section recap / overview of what’s changed….

The New World

Section 1 – About the Company

Now that we’re a Devices and Services company, the description of our product groups is flat out wrong. Beyond that, our organizations are much different – our test dev ratios are changing (from nearly 1:1 at the time of writing to nearly 2:1 dev:test today).

Five years ago, a few teams used adaptive (agile) engineering approaches, whereas now, nearly every team in the company is using or experimenting with some flavor of adaptive development. We ship almost everything faster. Testers are moving to analysis of end-to-end scenarios and data analysis, while “traditional” functional testing is moving (where it belongs), to the programmers. The career path examples we showed in the book have all changed – and will likely change again as the company continues to adapt and grow.

Section 2 – About Testing

The techniques described in the About Testing section are all still relevant (although these days, I could easily expand the chapter on test design by about 50 pages and keep it full of good ideas). What’s changed most (and what will continue to change) is how much of the activity in this section is carried out by developers (and how much more of it will be carried out by developers in the coming years).

I’m still particularly happy with the chapter on Model-Based Testing (and thanks to Michael Corning and Harry Robinson for their help with the ideas for this chapter).

Section 3 – Test Tools and Systems

The section on bug reports is good – as long as you have the need to track bugs formally. Many teams can (and do) get by with fixing bugs as they find them – with no need to document the bug. I’ve also become even less of a fan of tracking test cases, but the examples are probably still valuable for those who need to do this.

It’s hard to believe that five years ago, I failed to recognize the critical relationship between test automation and test design (they’re in different chapters!). Although the technical information is correct, I would like to take The A Word and shove it into chapter 10 as an addendum.

The customer feedback systems I wrote about have expanded massively – and are much more of a focal point these days – both in the data we gather, and how we use it.

And finally (for this section), what blows me away is how different our approach to testing web services is now. I don’t think Ken will be mad if I reveal that this chapter is embarrassingly obsolete. This isn’t because what we were doing then was bad…it’s just that we’ve grown so much in our approaches here that our five-year-old processes seem obsolete in comparison.

Section 4 – About the Future

I suppose this section is still relevant – and most of the future as I saw it in 2008 is reality today (although much of this section, in hindsight, could have been better written).


In a perfect world, I’d write another HWTSAM. Maybe someday, I will, but it’s not on the roadmap for today. There are too many big changes coming (both planned by MS, and spinning in my head) for me to put a stake in the ground. Or maybe, software is changing so much that trying to capture how a company does it at a point in time is impossible. I’ll need to think about it.

For those of you who bought (and read) HWTSAM – thank you. There’s still some good stuff there, but read it with a grain of salt (or at least through a lens that acknowledges it’s a half-decade-old book).

Book Stuff on Leadership

I intended for my last two posts to cover the main points of my STAR West keynote, but I neglected to mention a few of the books I referenced in my talk (which, not coincidentally, are some of my most recommended books for those studying leadership).


I first read Leadership on the Line as part of a class I took several years ago, and many of the concepts have stuck with me. My favorite quote (slightly paraphrased) is, “Leadership is disappointing people at a level they can tolerate [absorb]”. Too many people try to lead by pleasing everyone. Good leaders are comfortable making decisions that not everyone agrees with, and know that teams are more productive when there is a slight level of discomfort or apprehension (as opposed to the extremes of being complacent or overwhelmed).


I am a huge Patrick Lencioni fan. Lencioni writes business novels – stories about business that explain some of the principles and ideas he believes in. His Getting Naked is a peek into how he runs his consulting business, and mirrors the way I approach leadership and influence. Humility and learning in context are great approaches for anyone diving in to a new project, and I have found every one of Lencioni’s novels to be a page-turner. Lencioni (in his one non-business-novel publication) was the first person to clue me in on the fact that product quality is directly related to team health – something I’ve been reading more about over the past few years, and something I believe in whole-heartedly.

In my experience, there are a lot of parallels between good leadership and good consulting (bearing in mind that the definition of a consultant has changed remarkably over the past several years).

I could write a whole post on how this book relates to leadership. And, in fact, I did (and if you’ve read this far in this post, go read that post now for more info).



Several years ago, I had an opportunity to hear Peter Spiro speak, and to also have dinner with Peter and his daughter while on a business trip (we were both speaking at the same internal conference). In his talk, and at dinner, he recommended The Feiner Points of Leadership. I read the book (and have re-read it several times), and I always seem to find ideas I can use (and ideas that are weirdly relevant to whatever I’m going through). The writing is conversational, and the advice is practical (as opposed to the theoretical hand-waving in most leadership books).

There are at least a dozen other leadership books on my bookshelf that I like, but if you were to see my beat-up ragged copies of these books, you would know they’re the most used (and most loaned) books in my personal library.