Test Driven Requirements

{Disclaimer – this is a cross-posted article appearing both on the Triton-Tek blog and my personal blog Budding BA}

My job at Triton-Tek is not an easy one to define, but predominantly I am a Business Analyst and Project Manager on Agile web development projects, see my previous articles here and here.  However, there is one major project on which I have been working since day one that has required a lot of Quality Assurance (QA) support from me so that the developers get quick feedback on their work.  This work as a QA has been incredibly beneficial to me as a BA in crafting the requirements for the project.  In an Agile environment the approach is very much “just-in-time”, and the same applies to requirements gathering.  We will have a set of stories for the release and a plan for the current iteration, but the specific ins-and-outs of the stories are still very foggy.  So as a BA I will add details to the stories as they come up, but often it is an iterative feedback loop that defines the requirement.

Agile User Story completion cycle

Because of this emphasis on QA in my role I started to read about how to do this in an Agile fashion, and the results are fascinating.  The connection between the QA role and the BA/requirements role in Agile projects is very strong, and the Agile testing techniques I have learnt have helped my requirements analysis and definition as a result.

This mindset change starts with the “Agile Testing” book by Lisa Crispin and Janet Gregory.  It is quite a large book, but very practical, and it describes in detail the daily hands-on role a tester must play on an Agile project.  I will not review the book here; that is another post on its own.  But I am adding the Agile Testing Quadrant below, as it gives a good overview of the different areas of testing.

Agile Testing Quadrant from "Agile Testing" by Lisa Crispin & Janet Gregory

Essentially the book drills home the many different angles from which the product must be tested so that you can be as sure as possible of a solid end result.  Critically, the developers must practice Test-Driven Development (TDD) for this all to work.  This is accomplished by starting with failing unit tests (because the functionality does not yet exist) and then coding the solution so that the tests pass.  So, for example, they would start with a test titled “User can enter a username and password to login”, and then code the user login functionality so that the test passes.  I learned of TDD upon starting at Triton-Tek and understood what its purpose was, but since reading this book my eyes have been opened to how the same approach can be used when gathering requirements, and how that can feed directly into the code, simultaneously creating automated tests and setting the acceptance criteria for when a story can be considered “done”.
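To make the TDD cycle concrete, here is a minimal sketch of the “User can enter a username and password to login” example. It is in Python rather than the .NET stack our team works in, and the `LoginService` class and its methods are entirely hypothetical names for illustration; in true TDD the test class would be written first and seen to fail before the service existed.

```python
import unittest

# Hypothetical login service. Under TDD this class would be written only
# AFTER the tests below had been run and seen to fail.
class LoginService:
    def __init__(self):
        self._users = {}  # username -> password

    def register(self, username, password):
        self._users[username] = password

    def login(self, username, password):
        # Login succeeds only for a registered user with a matching password.
        return self._users.get(username) == password

class TestLogin(unittest.TestCase):
    def test_user_can_enter_username_and_password_to_login(self):
        service = LoginService()
        service.register("alice", "s3cret")
        self.assertTrue(service.login("alice", "s3cret"))

    def test_wrong_password_is_rejected(self):
        service = LoginService()
        service.register("alice", "s3cret")
        self.assertFalse(service.login("alice", "wrong"))

if __name__ == "__main__":
    unittest.main()
```

The test names double as a readable statement of the requirement, which is exactly the property that makes TDD interesting from a BA's point of view.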

Now, before I continue, I want to stress that this is not a new discovery on my part; there is a community around Acceptance Test-Driven Development (ATDD) and Behavior-Driven Development (BDD) that talks about exactly this process.  However, as a BA with no coding background, the personal progression in understanding how this all ties together is one from which I think a lot of BAs can benefit.

Going back to the quadrant above, the area we are focused on is Q2, the functional and story tests that can be automated.  The book specifically talks about using the FitNesse tool to automate functional testing.  This tool allows you to create understandable “business” phrases, using a certain format, that the developers can then take and use a fixture to directly test their code to make sure the functionality works as the business expected.  The great thing about this approach is that it expands on the quickly jotted down acceptance criteria on a story card so that the developer is immediately presented with the list of tasks he or she needs to complete in order to call the story done.

From a BA perspective, this approach is hugely helpful.  Since reading about FitNesse, I spoke with the team at Triton-Tek, and Matt Hidinger recommended SpecFlow, as it uses the more business- and domain-friendly BDD approach, builds on the Cucumber syntax, and integrates easily with TFS and the .NET world in which we work.  In BDD, the tests are created using the following syntax:

Given [condition]

When [action]

Then [result]

So an example would be:

Given the user has already registered

When the user enters the correct username and password on the login screen

Then the user will be logged in to the system
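Tools like SpecFlow and Cucumber bind each of these phrases to a small step method in code. As a rough illustration of that mechanic only, here is the scenario above hand-rolled in plain Python; the context class and step function names are hypothetical and are not the SpecFlow API, which belongs to the .NET world.

```python
# A hand-rolled sketch of how a Given/When/Then scenario drives code.
# Real BDD tools match each phrase to a step binding automatically;
# here the wiring is done by hand to show the idea.

class LoginContext:
    """Shared state the steps build up as the scenario runs."""
    def __init__(self):
        self.registered = {}   # username -> password
        self.logged_in = False

# Given the user has already registered
def given_the_user_has_already_registered(ctx):
    ctx.registered["alice"] = "s3cret"

# When the user enters the correct username and password on the login screen
def when_the_user_enters_correct_credentials(ctx):
    ctx.logged_in = ctx.registered.get("alice") == "s3cret"

# Then the user will be logged in to the system
def then_the_user_is_logged_in(ctx):
    assert ctx.logged_in, "expected the user to be logged in"

def run_scenario():
    ctx = LoginContext()
    given_the_user_has_already_registered(ctx)
    when_the_user_enters_correct_credentials(ctx)
    then_the_user_is_logged_in(ctx)
    return ctx

if __name__ == "__main__":
    run_scenario()
    print("Scenario passed")
```

The point is that each business-readable phrase maps one-to-one onto a piece of executable test code, so the acceptance criteria and the automated test are the same artifact.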

These “Given .. When .. Then” statements can then be used by the developer as the basis for their unit tests, ensuring the functionality of the story meets the business/user requirements.  This is a very simple but powerful tool for a number of reasons critical to a BA’s work:

  1. The language is easy to understand, so it can form the basis of supporting requirements documentation.
  2. Anyone can write them, so you can pull in stakeholders and product owners to directly contribute.
  3. They encourage analytical and practical thinking about the requirement.
  4. They reduce the “lost in translation” problem of transferring business requirements to the developer.

So now, with my BA hat on, when I am expanding on the requirements for a User Story I can start walking through the many business rules, user scenarios and use cases using “Given .. When .. Then” statements.  This always ends up uncovering hidden problems or complexities that may not otherwise have been discovered until development had begun.  It is also a fantastic way to keep the product owner or client representative involved: the statements are easy to understand, so any business user can create them, meaning that they are setting their own acceptance criteria.  This is very important in a consulting environment such as ours, where any misunderstanding of what functionality a user story should actually achieve can have large monetary and client/contractor relationship ramifications.

Now, with my QA hat on, the biggest bonus to this approach is that I have simultaneously created automated functional tests and encouraged TDD behavior from the developers.  This should allow me to focus on the fuzzier (and, in my opinion, more interesting) exploratory and usability testing, which is where I can add the most value to the product and the team.  Overall, everyone wins, and most importantly the requirements are delivered seamlessly from the business to the development team.


When User Stories are not enough

I recently attended an interesting webinar by MKS on why User Stories are sometimes an inadequate representation of requirements.  I thought it raised some excellent points, and certainly got me thinking.

Starting with a refresher on what User Stories are, the presenter Colin Doyle emphasized that they are NOT a requirements document.  Instead they are a placeholder for continued conversation on that particular piece of functionality, used especially for prioritization.  They are the tool through which requirements are communicated and discussed rather than delivered in the form of hefty requirements documents (see this excellent article from Mike Cohn on User Stories).  As Agile prescribes, in teams where face-to-face, daily communication is possible and the Product Owner and key stakeholders are readily available, the User Story format works very well.  Each story follows the INVEST criteria and is first and foremost independent of other stories, and the non-functional requirements or constraints do not dominate.  With each iteration the product develops incrementally, and the inevitable changes that arise as the product comes to life are gradually folded back into the project.  If there is a need to review the requirements and see the “big picture”, then the User Stories, in combination with Test-Driven Development (TDD) or Behavior-Driven Development (BDD) practices, should provide all the details you need.

Colin Doyle argued, however, that there are 3 key issues with this method, all centered on scalability:

  1. Too many customers for the product, resulting in conflicting requirements and a significant effort to understand customer environments and needs.
  2. Complex products with significant interdependencies, lots of existing requirements and no easy way to see the big picture.
  3. Large organizations where “scaling of face-to-face conversations is an N² function (Brooks)” and which may include offshore or off-site teams.
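Brooks's point in item 3 is that the number of distinct pairwise conversations among n people is n(n-1)/2, which grows quadratically. A quick illustration of how fast the channels multiply as a team grows:

```python
def communication_channels(n):
    # Number of distinct pairwise conversations among n people:
    # n choose 2 = n * (n - 1) / 2 (Brooks's N^2 scaling argument).
    return n * (n - 1) // 2

for team_size in (5, 10, 20, 40):
    print(team_size, "people:", communication_channels(team_size), "channels")
```

A 5-person team has 10 possible conversations; a 40-person organization has 780, which is why face-to-face conversation alone stops scaling.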

Therefore he recommended the careful introduction of some traditional requirements engineering practices to help solve these issues, in particular using modeling techniques such as use cases, decision trees or event-response tables to get a handle on the big picture.

Use Case Model example

This will help the project in a number of ways.  Firstly, the developers will understand the full scope more clearly, which will guide their TDD/BDD test cases and hence their code.  It will also help the Product Owner, who may be unable to remain fully informed on large, sprawling products.  These requirements artifacts will also form the basis for a high-level dependency analysis from which relatively independent User Stories and backlog items can be derived.  This should ensure that you arrive at a nicely traceable project, from the high-level product owner view down to the actual code.  The presenter was not arguing for a bastardization of the Agile philosophy by doing this requirements analysis upfront; he stressed it should be performed in a just-in-time, iterative manner, with the requirements artifacts changed and adjusted as the product develops.

I am receptive to this approach.  I don’t think the Agile community should unconditionally shun traditional requirements engineering practices as “old-school” and no longer relevant.  My concern with the approach is how easy it would be to slip into a gather-requirements-up-front process.  The models and artifacts could become weights around the neck of the project, requiring constant revision while at the same time being the excuse for why something was done a certain way.  I believe that the Agile methodology, with its primary emphasis on conversation over documentation, promotes an environment of “ask before you act”, ensuring that the correct path is chosen.  If there is too much documentation available, it can be viewed as the definitive answer, reducing the amount of communication within the team and clarification with the product owner.

But that is not to say I am ruling this out; in fact, I think I will try to incorporate this approach into the next complex project I work on.  I think that so long as the culture of the team remains inquisitive and conversation-intensive, these models can be good aids to discussion, just like user stories.


The BA Role, Testing and Support

So this has been a very slow start to my blogging life, I realize this, but essentially since I first got my new job and subsequently started, really not a whole lot has been happening to blog about. I have just completed my 11 weeks of training with the 15 other new hires, which has been a mix of methodology and industry training (of which there was a lot to cover).

What is clear, however, is that this company has a very structured and well documented (oh my god the amount of documentation, the intranet is huge!!) approach to the implementation of their software. My role, as it turns out, is actually a combined role of Business Analyst and Customer Support (CS). So I do go in and gather all of the requirements for the software implementation and create a Business Requirements Document from which to work, but instead of passing this client relationship and knowledge on to the CS as was historically done, I will now also see the client through the test and audit phase after the software has been loaded with the data, help them understand the reports the software generates, and be on-site when the software goes live.

So I understand that this is not a 100% business analyst role, not that there is really a clear standard for that role anyway as every company seems to use the title differently, but it did get me thinking about what percentage of typical Business Analysts actually work on their projects through testing and go-live in the same way?

It seems to me that if you are in a consultancy specializing in Agile software development or as a freelance business requirements gatherer then the answer is probably very little. You have a specific job that you are called in for and any additional involvement is not so explicit. But if you are working in the belly of the corporate beast as I am, then your role probably does have a more holistic aspect where you are seeing projects through to conclusion.

At this point, what this actually means for my job is completely unclear as I haven’t even started working on an account. However, at this juncture I am looking forward to being able to see my clients through to go-live, as I feel it will give me a good sense of achievement to actually see the results of the earlier requirements gathering work. I also foresee an excellent learning opportunity, because only when the software is up and running and being used by the client’s users can you really see how it is used in real life, and where you may have made some errors in the BRD and set-up.

There is an article on the Business Analysis Times that discusses this area – http://www.batimes.com/index.php?option=com_content&task=view&id=248&Itemid=1. I also stumbled across a posting on this topic on the Requirements Networking Group website (http://www.requirementsnetwork.com/node/1199); however, it is in direct relation to Agile and Scrum development, of which I have no knowledge at this point.

If anyone has any input on what happens in your role in relation to testing I would be very interested to hear.
