Pretty early in your requirements-writing days, you get handed a foundational rule: requirements are the what, not the how. There’s this bright red line drawn in the sand (whether it gets crossed all the time depends on the organization). Okay, I get it. Don’t tell development how to implement, or how to solve the what.
My question for this Wednesday: What’s the line for quality assurance then?
Maybe it’s because I come from an editing background, but I could totally geek out on quality assurance stuff. Your quality assurance teams are your last line of defense. Really, these are the folks who catch would-be embarrassing mistakes before they get shipped client-side. For many organizations, you’re only as good as the last thing you did. So buy these guys a drink or lunch, because when they’re rocking, they make the company and its teams look pretty darn good.
How do we as product managers give them the ammo they need to expertly test our lovable products and services, though? I know there’s a lot of tension in organizations walking the line between requirement and specification. I find it interesting that, from what I’ve seen and read at least, there isn’t such a line for working with quality assurance.
We want to make sure products and services are thoroughly tested, but we don’t want to tell quality assurance teammates how to do their jobs either. So, what’s enough and what’s too much? When do you go beyond what is genuinely needed and into unnecessary hand-holding? Pragmatic Marketing’s instructors will say that you don’t want to create “Development Factories”: heads-down development shops where no creativity is to be found.
I want quality assurance teams that know the products, teams that understand their range and complexities well enough to test the full scope of impacts a single requirement may create. Those are the resources that have a voice, understand the end users and can identify those pesky holes hiding in plain sight.
- You have to draw a line somewhere. Just as development can be turned into a factory, I genuinely believe quality assurance can be turned into a reading-comprehension exercise. If you spell everything out rigidly, where is the room to really learn the product and bring in the creativity and skills that only quality assurance can bring? Some books say product management should write the test plans, and some go as far as writing the test cases, too. Really? I’d never write a design document for a developer. Are we treating these teams equally?
- Requirements must be testable. As a baseline expectation, whatever the requirement is, quality assurance must have enough information to discern whether the requirement has actually been met. Providing some expectation of quality goals and obvious constraints is a great start. For example, if I’m writing requirements about the ability to upload files, providing a maximum file size (constraint) and how quickly I’d expect the system to upload files (quality goal) would be helpful. (There’s a sketch after this list of what checking those two things could look like.)
- Choose your words wisely. Not to freak you out, but consider your requirement to be The Grail in “Indiana Jones and the Last Crusade”: you can give life and purpose with the words you choose, or you can create frustrations of epic proportions. Too many requirements, if you give them what I will now call “The Grail Test,” fail it: user-friendly, fast, functional, clean, etc. are words that don’t really have meaning on their own. Words, and how people understand and define them, are relative.
Give a number. Draw a picture. Do whatever it takes for quality assurance and your other team members to know exactly what you mean. I have eight years of graphic design experience, so if you tell me you want something to be user-friendly, I’m going to have a whole different set of expectations than someone who doesn’t. As a product manager, you might have to work with other teammates to pin down what some of these constraints and quality attributes are, but you should make sure you’re adequately defining the expectations rather than letting others do so based on individual opinions and assumptions.
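To make the file-upload example concrete, here’s a minimal sketch of how a requirement with a numeric constraint and a numeric quality goal becomes something quality assurance can actually verify. This is purely illustrative: the `UploadClient` class, the 10 MB cap and the 5-second goal are stand-ins I made up for this post, not anyone’s real interface or numbers.

```python
import time

class UploadClient:
    """Stand-in for whatever upload interface the real system exposes."""

    MAX_FILE_BYTES = 10 * 1024 * 1024  # constraint: "files up to 10 MB" (illustrative number)

    def upload(self, data: bytes) -> bool:
        # Reject anything over the stated maximum; accept the rest.
        return len(data) <= self.MAX_FILE_BYTES


def test_rejects_files_over_the_maximum():
    # Testable because the requirement names a number, not "reasonable sizes."
    client = UploadClient()
    too_big = b"x" * (UploadClient.MAX_FILE_BYTES + 1)
    assert client.upload(too_big) is False


def test_upload_meets_quality_goal():
    # Testable because the requirement says "a 5 MB file in under 5 seconds,"
    # not just "fast."
    client = UploadClient()
    payload = b"x" * (5 * 1024 * 1024)
    start = time.perf_counter()
    assert client.upload(payload) is True
    assert time.perf_counter() - start < 5.0
```

Notice that neither test could be written against “uploads should be user-friendly and fast.” The moment the requirement carries numbers, quality assurance can design checks like these without anyone telling them how to do their job.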
The questions come instantly once requirements are released into the wild. One thing I really struggle with: some of the questions I get about how something needs to be tested depend on things I don’t know at that point. Sometimes, how I’d suggest a requirement be tested depends on how it was actually implemented. If that’s the case, when are these pieces communicated to QA, and who owns defining those testing parameters?
It comes down to communication and collaboration, because getting high-quality products out the door is, at its best, a team sport. Too often, quality assurance gets shut out of opportunities where being involved early could shake out questions, get testing parameters identified proactively and build the contextual knowledge about how things are moving forward that is critical for success.
Personally, after requirements reviews I keep a notebook of the types of questions quality assurance asks. Over time, I’ve found some commonalities that I now try to address proactively.
Get all members of the team involved early, often and always together. The next time you have a meeting with a developer to go over prototypes or any requirements-related questions, send an invite to an assigned quality assurance member.