Thursday, August 30, 2012

A few Cucumber tricks

I've been a fan of TDD since I discovered it a few years ago. Thanks to my friend and colleague Augusto, a few months ago I discovered the beauty of BDD and ATDD: I got so fascinated by this practice that I started an open source project for an Eclipse plugin featuring a rich editor for Cucumber feature files. It's called Natural, and I think it's pretty cool if you want to check it out.

After having used Cucumber for a while, and after many, many mistakes, I now have some experience I wish to share with you.

One of the key benefits of feature files is that they represent living project documentation: something developers write and maintain and everyone can understand. We believe so much in this concept that we wanted to give our product owner and business partners a nice view of those files, which is why we used Relish to publish them in a pretty, colored format. At the same time, we had to face some problems while putting these files to practical use for automation testing.

After a few development cycles we were so excited by Cucumber and ATDD that our automation test suite counted hundreds of tests: it was taking too long for our CI system to run the whole suite on each commit.
My suggestion is to track in your feature files the development cycle in which each test was written: this can easily be achieved with a tag, something like @sprint-7.
This way you can instruct your CI system to execute only the current sprint's automation on each commit and schedule a full run overnight (and possibly at lunchtime), as sketched below.
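
A minimal sketch of what I mean (the feature and its steps are invented for illustration):

  @sprint-7
  Feature: Company search
    As a business user
    I want to search companies by name
    So that I can quickly find the one I am interested in

    Scenario: Search by exact name
      Given a company named "Acme Ltd" exists
      When I search companies for "Acme Ltd"
      Then I should see "Acme Ltd" in the results

The per-commit CI job then runs Cucumber with a tag filter such as --tags @sprint-7 (or the equivalent option your runner exposes), while the nightly job runs with no filter at all.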

Another issue was that we were developing automation tests before the functionality was implemented, following TDD principles. By doing this, our CI was reporting a build failure until the functionality was completely implemented. This is not a problem in itself, as failing tests are expected during development, but the continuous failure messages received by the team members brought us to the point where we were starting to ignore them, defeating the purpose of the automation suite and of the CI system itself. The solution I suggest is to temporarily annotate the tests that are expected to fail with @future and instruct your CI to skip any test annotated as such: developers can still execute those tests, but the CI will just ignore them. Once a feature is completed, it is the developer's responsibility to remove the annotation.
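
For example (again an invented scenario, just to show where the tag goes):

  Feature: Company advanced search

    @future
    Scenario: Filter companies by country
      Given companies registered in "Italy" and in "France" exist
      When I filter the results by country "Italy"
      Then I should only see companies registered in "Italy"

The CI job can then exclude those scenarios with a tag filter along the lines of --tags ~@future (or, in more recent Cucumber versions, --tags "not @future").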

With hundreds of tests we had trouble keeping them properly organized, and this is where folder structure and naming conventions come in. Please consider those files not as tests, but as your application documentation: as such, you want to organize them in a way that makes them easily accessible to a business person. My suggestion is to organize the tests in files describing different features, grouping the features into folders that represent functional areas. Considering company search and report search capabilities, I can imagine a search.feature and an advanced-search.feature, both inside a company folder, and the same file names inside a report folder, as in the layout below.
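
Something along these lines (names are purely illustrative):

  features/
    company/
      search.feature
      advanced-search.feature
    report/
      search.feature
      advanced-search.feature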

Defect and user story tracking, if desirable, can once again be achieved with tags like @US1234 and @DE9876, but I would rather avoid cluttering the feature files with such information, as it tends to distract a business person from the real value of those files: application documentation.
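
If you do want them, such tags simply sit next to the others on a scenario (the numbers here are made up):

  @sprint-7 @US1234
  Scenario: Search reports by title
    Given a report titled "Quarterly revenue" exists
    When I search reports for "Quarterly revenue"
    Then I should see "Quarterly revenue" in the results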

Automation test coverage is a report business users find quite valuable, but it was tricky to produce considering ours was a web application. Nevertheless we managed to have it running on our CI thanks to Maven and its amazing set of plugins.

I've prepared a template Maven project for the readers of this post to use as a starting point and guideline, if you like the solutions we adopted: it will definitely help me avoid redoing all these steps from scratch in the future! Please read the README file before asking for clarifications :-)