I recently spoke with a colleague from another department in my organisation who wanted to know what problems we were currently experiencing with our test automation. It was something I hadn't had to articulate before. As I answered, I grouped my thoughts into four distinct contexts.
I'd like to share what we are currently struggling with to illustrate the variety of challenges in test automation, even within a single department of a single organisation.
Maintenance at Maturity
We have an automation suite that's over four years old. It has grown alongside the product under development, a single-page JavaScript web application. More than 50 people, both testers and developers, have contributed to the test suite.

This suite is embedded in the development lifecycle of the product. It runs every time code is merged into the master branch of the application. Testers and developers are in the habit of contributing code as part of their day-to-day activities and examining the test results several times daily.
In the past four months we have made a concerted effort to improve our execution stability and speed. We undertook a large refactoring exercise to get the tests executing in parallel; they now take approximately 30 minutes to run.
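For illustration, here's a minimal sketch of the parallelisation idea: shard spec files across a fixed pool of worker processes. Everything here is an assumption for the example (the `specs` directory, the `mocha` runner, the pool size), not a description of our actual suite.

```typescript
// Minimal sketch: shard spec files across parallel worker processes.
// SPEC_DIR, the mocha runner, and MAX_PARALLEL are illustrative assumptions.
import { execFile } from "node:child_process";
import { readdirSync } from "node:fs";
import * as path from "node:path";

const SPEC_DIR = "./specs";  // hypothetical location of test specs
const MAX_PARALLEL = 4;      // tune to available browser sessions

const specs = readdirSync(SPEC_DIR).filter((f) => f.endsWith(".spec.js"));

function runSpec(spec: string): Promise<boolean> {
  return new Promise((resolve) => {
    execFile("npx", ["mocha", path.join(SPEC_DIR, spec)], (err) => {
      resolve(!err); // pass/fail only; detailed logs are captured elsewhere
    });
  });
}

async function main(): Promise<void> {
  const queue = [...specs];
  // Each worker pulls the next spec off the shared queue until it's empty.
  const workers = Array.from({ length: MAX_PARALLEL }, async () => {
    while (queue.length > 0) {
      const spec = queue.shift()!;
      const passed = await runSpec(spec);
      console.log(`${passed ? "PASS" : "FAIL"} ${spec}`);
    }
  });
  await Promise.all(workers);
}

main();
```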
We want to keep this state while continuing to adapt our coverage to the growing application. We want to continue to be sensible about what we use the tool to check, to continue to use robust coding practices that succeed when tests execute in parallel, and to continue to keep good logging messages and failure screenshots that help us accurately identify the cause of each failure.
There's no disagreement on these points. The challenge is in continued collective ownership of this work. It can be hard to keep the bigger picture of our automation strategy in sight when working day-to-day on stories. And it's easy to think that you can be lazy just once.
To help, we try to keep our maintenance needs visible. Every build failure creates a message in the testing team chat. All changes to the test code go through the same code review mechanism as changes to the application code, but the focus is on sharing between testers rather than between developers.
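As a sketch of the failure notification, assuming a Slack-style incoming webhook; the `CHAT_WEBHOOK_URL` variable and the payload shape are assumptions rather than our real setup:

```typescript
// Minimal sketch: post a build-failure notice to the team chat.
// Assumes a Slack-style incoming webhook and Node 18+ for the global fetch.
const WEBHOOK_URL = process.env.CHAT_WEBHOOK_URL ?? "";

async function notifyFailure(buildId: string, failedTests: string[]): Promise<void> {
  await fetch(WEBHOOK_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      text: `Build ${buildId} failed (${failedTests.length} test(s)):\n` +
        failedTests.map((t) => `- ${t}`).join("\n"),
    }),
  });
}

// Example: called from the CI pipeline after a failed run (values illustrative).
notifyFailure("master#1042", ["login.spec.js", "checkout.spec.js"]);
```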
Keeping shared ownership of maintenance requires ongoing commitment from the whole team.
Targeted Tools
Another team is working with a dynamic website driven by a content management system. They have three separate tools that each provide a specific type of checking:
- Scenario-based tests that examine user flows through specific functions of the site
- A scanner that checks different pages for specific technical problems, e.g. JavaScript errors
- A visual regression tool that performs image comparisons on page layout
The information provided by each tool is very different, which means that each will detect different types of potential problems. Together they provide useful coverage for the site.
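To make the scanner idea concrete, here is a minimal sketch assuming Puppeteer as a headless browser; the page list is illustrative and this is not the team's actual tool.

```typescript
// Minimal sketch: visit each page and record uncaught JavaScript errors.
import puppeteer from "puppeteer";

// Hypothetical pages to scan; in practice the list would come from the CMS.
const PAGES = [
  "https://site.example.test/",
  "https://site.example.test/products",
];

async function scan(): Promise<void> {
  const browser = await puppeteer.launch();
  for (const url of PAGES) {
    const page = await browser.newPage();
    const errors: string[] = [];
    // "pageerror" fires for uncaught exceptions thrown by the page's scripts.
    page.on("pageerror", (err) => errors.push(String(err)));
    await page.goto(url, { waitUntil: "networkidle0" });
    console.log(`${url}: ${errors.length === 0 ? "clean" : errors.join("; ")}`);
    await page.close();
  }
  await browser.close();
}

scan();
```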
The scanner and the visual regression tool are relatively quick to adapt to changes in the site itself. The scenario-based tests target very specific areas that rarely change. As a result, the suite doesn't require a lot of maintenance.
Because the test code isn't touched often, it can be challenging when it does need to be updated. It's difficult to remember how the code is structured, how to run tests locally, and the idiosyncrasies in each of the three tools.
All of the tests run frequently and are generally stable. When they do fail, it's often due to issues in the test environments, so when something really does go wrong it takes time to work out what it is.
It sounds strange, but part of the challenge is debugging unfamiliar code and interpreting unfamiliar log output. It's our code, but we are hands-on with it so infrequently that there's a bit of a learning curve every time.
Moving to Mock
In a third area of my department we've previously done a lot of full-stack automation. We tested through the browser-based front-end, but then went through the middleware, down to our mainframe applications, out to databases, etc.

To see a successful execution in this full-stack approach, we needed everything in our test environment to be working and stable, not just the application being tested. This was sometimes a difficult thing to achieve.
In addition to occasionally flaky environments, there were challenges with test data. The information in every part of the environment had to be provisioned and kept aligned. Each year all of the test environments go through a mandatory data refresh, which means starting from scratch.
We're moving to a suite that runs against mocked data. Now when we test the browser-based front-end, that's all we're testing. This has been a big change in both mindset and implementation. Over the past six months we've slowly turned a prototype into a suite that's becoming more widely adopted.
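A minimal sketch of the mocking idea, assuming Puppeteer-style request interception; the endpoint and canned payload are invented for illustration:

```typescript
// Minimal sketch: serve mocked API responses so a browser test exercises
// only the front-end. The URL and payload are hypothetical.
import puppeteer from "puppeteer";

async function run(): Promise<void> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.setRequestInterception(true);

  page.on("request", (req) => {
    if (req.url().includes("/api/accounts")) {
      // Canned data replaces the middleware/mainframe round trip.
      req.respond({
        status: 200,
        contentType: "application/json",
        body: JSON.stringify([{ id: 1, name: "Test Account" }]),
      });
    } else {
      req.continue();
    }
  });

  await page.goto("https://app.example.test/accounts"); // hypothetical URL
  // ...assert on the rendered account list here...
  await browser.close();
}

run();
```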
The biggest challenge has been educating the teams so that they feel comfortable with the new suite. How to install it, how to configure it, how to write tests, how to capture test data, how to troubleshoot problems, etc. It's been difficult to capture all of this information in a way that's useful, then propagate it through the teams who work with this particular product.
Getting people comfortable isn't just about providing information. It's been challenging to persuade key people of the benefits of switching tack, offer one-on-one support to people as they learn, and embed this change in multiple development teams.
Smokin'
The final area where we're using automation is our mobile testing. We develop four native mobile applications: two on iOS and two on Android. In the mobile space the pace of change is astonishing; the platforms shift underneath our product on a regular basis due to both device and operating system upgrades.

We've had various suites in our mobile teams, but their shelf life seems to be very short. Rather than pour effort into maintenance, we've decided on more than one occasion to start again. Now our strategy in this space is driven by quick wins.
We're working to automate simple smoke tests that cover at least a "Top 10" of the actions our users complete in each of the applications according to our analytics. These tests will then run against a set of devices, e.g. four different Android devices for tests of an Android application.
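As a sketch of what this might look like, with hypothetical device names and actions, and a stand-in `run` step for whatever driver (e.g. Appium) actually executes them:

```typescript
// Minimal sketch: run a "Top 10" smoke list against an Android device matrix.
// Device names and actions are illustrative; each test's `run` is a stand-in
// for real driver steps.
type SmokeTest = { name: string; run: (device: string) => Promise<void> };

const DEVICES = ["Pixel-emulator", "Galaxy-S7", "Nexus-5X", "Moto-G4"]; // hypothetical

const TOP_ACTIONS: SmokeTest[] = [
  { name: "log in", run: async (device) => { /* driver steps */ } },
  { name: "view home screen", run: async (device) => { /* driver steps */ } },
  // ...eight more actions, ranked by analytics...
];

async function smoke(): Promise<void> {
  for (const device of DEVICES) {
    for (const test of TOP_ACTIONS) {
      try {
        await test.run(device);
        console.log(`PASS ${device}: ${test.name}`);
      } catch {
        console.log(`FAIL ${device}: ${test.name}`);
      }
    }
  }
}

smoke();
```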
Our challenge is alignment. We have four native mobile applications. At the moment the associated automation is in different stages of this boom-and-bust cycle. We have useful and fast feedback, but the coverage is inconsistent.
To achieve alignment, we need to be better about investing equal time in developing and maintaining these suites. Given the rate of change, this is an ongoing challenge.
*****
That's where we're at right now. I hasten to add that there are a lot of excellent things happening with our automation too, but that wasn't the question I was asked!
I'm curious as to whether any of these problems resonate with others, how the challenges you face differ, or if you're trying solutions that differ to what we're attempting.