Test Automation Facilitates Scale

Helping a Non-profit Client Move Away From Manual Testing

Despite widespread awareness of automated testing, a survey by Sauce Labs showed that only 26% of respondents have more automated than manual testing. Currently, 67% of companies are deploying at least weekly, and nearly half (46%) want to deploy faster.

Our client is no different. Company Y is a well-known builder of software for non-profits. As of 2016 they were the #1 online and mobile fundraising platform, trusted by 2,500+ of the world’s top nonprofits and social enterprises. Their software allows non-profits to create fundraising opportunities for year-round initiatives as well as very specific, real-time disasters and crises. The platform provides crowdfunding, peer-to-peer fundraising, event registration, and website donations, all under one roof.

With hundreds of millions of dollars raised across more than 300,000 campaigns, the momentum was only increasing. Company Y needed to automate its testing to continue providing the quality platform its growing user base counted on for time and cost efficiencies in fundraising.

Basics: What Is Test Automation?

Test automation uses special software to control the execution of tests, report the execution status, and compare actual outcomes with predicted outcomes. The potential benefits of test automation for Company Y included:

  • Reduced time
  • Reduced cost
  • Increased speed
  • Increased coverage
  • Repeatability
  • Reusability
  • Maintenance of test suite
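As a minimal sketch of this definition, the following Python snippet uses the standard unittest framework; `donation_total` is a hypothetical function standing in for any platform feature. The predicted outcome is encoded once, and the framework compares it with the actual outcome and reports execution status on every run:

```python
import unittest

# Hypothetical function under test -- a stand-in for any platform feature.
def donation_total(amounts):
    """Sum individual donation amounts, rounded to cents."""
    return round(sum(amounts), 2)

class DonationTotalTest(unittest.TestCase):
    # The predicted outcome is written down once; the framework then
    # compares it with the actual outcome on every run and reports pass/fail.
    def test_total_matches_expected(self):
        self.assertEqual(donation_total([10.00, 25.50, 4.50]), 40.00)

# Execute the suite and capture the execution status.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(DonationTotalTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Once a check like this exists, re-running it after every build costs essentially nothing, which is what makes the benefits below possible.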

Reduced Time: This is one of the biggest benefits, especially when it comes to regression testing. Regression testing is the retesting of the application when new features have been introduced or a change is made to an existing feature that has been previously tested (the change can be the result of a change request, defect fix, or refactoring). The aim of regression testing is to ensure the application still works as expected, and in order to verify this we need to run all test scripts associated with the change. There is a risk here that, due to time constraints, we may not run all tests associated with the change, which may result in undiscovered defects.

These issues can be overcome by setting up our automated tests to run overnight or after each deployment. This frees the tester to perform exploratory testing, concentrate on areas that cannot be automated, and take on other tasks.

Reduced cost: The number of resources required for regression testing is reduced.

Increased speed: Automated tests run much faster than a human user can execute the same steps.

Increased coverage: Testers can create a test suite that covers every feature of the application.

Reusability: Automated tests can be reused on different versions of the software, even if the interface changes.

Repeatability: The same tests can be re-run in exactly the same manner, eliminating the risk of human errors such as testers forgetting their exact actions or omitting steps from the test script, either of which can result in defects going unidentified or in the reporting of invalid bugs (which can be time-consuming for both developers and testers to reproduce).

Maintenance of the test suite: We have found ourselves in situations where the test suite has become out of date. New functionality has been introduced or existing features have been changed in the way they work and the test cases are no longer up to date as the tester(s) has had no time to go back and update the test scripts. When tests are automated and run after each build, those that are out of date will fail, forcing the tester to go back and fix the test script. This process ensures the test scripts are kept up to date and the quality of the software is maintained.

Automation saves time by reducing the time taken to run tests; it increases the quality of both the software and the testing process through the reliability, repeatability, and comprehensiveness of the test suite; and it uses manpower more effectively by applying skills and time where they are needed most while increasing test coverage.

Automating testing requires a human tester within a project team, as not every test associated with a feature can be automated and not every project is suitable for automated testing. As a tester, automation is about making lives easier, using time and resources more efficiently to ensure quality is maintained not just within the application being developed but also within the testing process.

As part of the analysis to estimate impact, we modeled the following scenario, representing six test cases and five iterations:

Manual Testing:

  • Number of test cases: 6
  • Number of testing iterations: 5
  • Effort required to run all 6 test cases manually once: 10 min
  • Total manual testing time: 50 min

Automated Testing:

  • Number of test cases: 6
  • Number of testing iterations: 5
  • Effort required to write automation tests for all 6 cases: 10 min
  • Effort required to run automated tests for all 6 cases: < 1 min
  • Total automated testing time: ~10 min (the five runs add less than 5 minutes to the one-time scripting effort)
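Treating the "< 1 min" automated run as a worst-case full minute, the break-even arithmetic above can be sketched in Python (the function names are illustrative):

```python
# Model the scenario above: 6 test cases, 10 minutes per manual pass,
# 10 minutes of one-time scripting, and at most 1 minute per automated run.
def manual_minutes(iterations, minutes_per_pass=10):
    """Total manual effort: every iteration repeats the full pass."""
    return iterations * minutes_per_pass

def automated_minutes(iterations, scripting=10, minutes_per_run=1):
    """Total automated effort: one-time scripting plus cheap re-runs."""
    return scripting + iterations * minutes_per_run

# The gap widens with every additional iteration.
for n in (1, 5, 20):
    print(f"{n:>2} iterations: manual {manual_minutes(n):>3} min, "
          f"automated {automated_minutes(n):>3} min")
```

At one iteration, automation costs slightly more than a manual pass; by the fifth iteration it has already paid for itself, and the savings compound from there.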

The Technical Task

Company Y met with Integrant to determine the elements of the project. They were defined as:

  • Build and maintain separate automation frameworks for regression testing on two layers of the application: the GUI layer and the API layer.
  • Automate the regression testing that should run on each new build (to save time and effort).
  • Increase test coverage and test accuracy.
  • Establish safety net for continuous deliveries and fast releases.
  • Improve code quality by giving confidence to development team to refactor and enhance their code without being concerned with the cost of re-test.
  • Automate at different levels (API and GUI) to shorten and accelerate the feedback cycle, which would reduce rework and increase the ROI of automation. For example, API-layer tests run in seconds and deliver precise feedback on changes or failures.
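The API-layer point can be sketched with the project's Python unittest approach. Everything here is illustrative: `CampaignClient` is a stub standing in for a real HTTP client, and the field names are hypothetical, but the shape of the test is what makes API-level feedback fast and precise:

```python
import unittest

class CampaignClient:
    """Hypothetical client for a fundraising-campaign endpoint (stubbed)."""
    def get_campaign(self, campaign_id):
        # A real implementation would issue an HTTP GET; the stub returns
        # a canned payload so the sketch is self-contained.
        return {"id": campaign_id, "status": "active", "raised": 1200}

class CampaignApiTest(unittest.TestCase):
    # API-layer tests skip the browser entirely, so they run in seconds
    # and pinpoint exactly which contract field changed or broke.
    def test_campaign_payload_contract(self):
        payload = CampaignClient().get_campaign(42)
        self.assertEqual(payload["id"], 42)
        self.assertIn(payload["status"], {"active", "closed"})
        self.assertGreaterEqual(payload["raised"], 0)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(CampaignApiTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Because no browser is involved, a failure here points directly at the contract between client and server rather than at rendering or timing issues.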

The Team

Integrant’s cap is filled with feathers in the form of team members. This is where we excel. We put together a team for Company Y that would complement its internal resources and advance the innovation and end results it wanted to achieve. The two teams were as follows:

Company Y Team

  • Initial team:
    • Senior QA engineer
    • QA automation engineer
  • Ongoing team:
    • QA manager
    • Lead QA engineer
    • QA automation engineer

Integrant Team

  • Java GUI:
    • 1 Technical project lead
    • 1 Technical lead
    • 2 Senior test analysts
  • Python API:
    • 1 Technical project lead
    • 1 Technical lead
    • 1 Senior test analyst

The client positioned a senior QA engineer on its internal team for the duration of the test automation implementation phase. Upon implementation they transitioned the senior QA engineer off the team and replaced that role with a QA manager and a lead QA engineer. The role of the QA manager and lead QA engineer is to assess, delegate, and communicate regarding QA on this and other projects.

Project Phases

Working with Company Y, we jointly defined phases and associated approximate timelines for the project. They were as follows:

  • Initiation/Planning:
    • Know the business. Perhaps the most important aspect of initiation, this step ensures that the Integrant team understands what Company Y offers its customers, how it differentiates itself from the competition, and what its key performance indicators are. The result is that the Integrant team is in a position to make technical decisions that fit with and facilitate the company’s overarching goals.
    • Define what needs to be automated (client requirements).
    • Define technologies and infrastructure (development languages, server OS).
    • Develop a POC.
    • Demo the POC and share it; then obtain approval to execute.
  • Execution
    • Iterative work writing new test cases and enhancing the code base.
    • Set up infrastructure to execute the tests on a daily basis, integrated with the normal build procedure (including sharing of execution reports).
    • Continuous deliveries (check-ins into source control).
  • Timeline
    • The Java GUI project was completed within 6 months; the API project is ongoing.

The Software Tools

We did not limit ourselves to specific technologies or stacks, but rather we selected the best solution with the most pros and least cons to fulfill the needs of each area. Below are some of the tools we used:

  • Java framework:
    • Language: Java using Eclipse
    • Testing framework: Selenium WebDriver (Page Object pattern)
    • Build and dependency management: Apache Maven (Project Object Model, POM)
    • Source control: Bitbucket (Git)
    • Continuous integration server: Jenkins
    • Infrastructure for development code: AWS (Amazon Web Services)
    • Deployment automation: AWS CodeDeploy
  • Python framework for the API:
    • Scripting language: Python using PyCharm
    • Virtual environments for isolated test execution
    • Testing framework: Python unittest
    • Source control: Bitbucket (Git)
    • Continuous integration server: Jenkins
  • Scripting:
    • Eclipse (Mars) for Java
    • PyCharm or Eclipse for Python
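The Page Object pattern named above can be sketched in Python. Everything here is a stand-in: `LoginPage`, its locators, and `FakeDriver` are illustrative, and in a real suite the driver would be a Selenium WebDriver instance rather than the fake used to keep the sketch self-contained:

```python
class FakeElement:
    """Stand-in for a DOM element; records what the test did to it."""
    def __init__(self):
        self.value = None
        self.clicked = False
    def send_keys(self, text):
        self.value = text
    def click(self):
        self.clicked = True

class FakeDriver:
    """Stand-in for a Selenium WebDriver; creates elements on demand."""
    def __init__(self):
        self.elements = {}
    def find_element(self, locator):
        return self.elements.setdefault(locator, FakeElement())

class LoginPage:
    """Page object: one class owns the page's locators and user actions,
    so a UI change is fixed in one place instead of in every test."""
    USERNAME = "#username"
    PASSWORD = "#password"
    SUBMIT = "#submit"

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.find_element(self.USERNAME).send_keys(user)
        self.driver.find_element(self.PASSWORD).send_keys(password)
        self.driver.find_element(self.SUBMIT).click()

driver = FakeDriver()
LoginPage(driver).login("alice", "s3cret")
```

The value of the pattern is maintainability: when a locator changes, only the page object is edited, and every test that logs in keeps working unchanged.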

Testing Tools: Why Selenium?

Among automation testing tools we determined that Selenium was the best choice for the following reasons:

  • Operates on almost every OS.
  • Cost-effective open-source testing tool.
  • Supports multiple languages such as Python, Perl, Ruby, PHP, .NET (C#), and Java.
  • The language used for building the program is independent of the language that the web application or website is using.
  • Supports a range of browsers like Opera, Safari, Chrome, IE, and Mozilla Firefox.
  • Community support.
  • Robust methods for locating elements, such as CSS selectors, XPath, and DOM.
  • With Selenium it is convenient to implement frameworks that follow object-oriented design, such as keyword-driven, data-driven, and hybrid frameworks.
  • Supports integration of open-source frameworks like TestNG, JUnit, NUnit, and so on.
  • Parallel execution: simultaneous tests can be run across various browsers on various machines, reducing test execution time when a large project is in progress.
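The parallel-execution point can be sketched with Python's standard `concurrent.futures`. Here `run_suite` is a stub that simulates running the full suite against one browser; with Selenium Grid, its body would drive a remote WebDriver per browser instead of sleeping:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def run_suite(browser):
    """Stub for running the whole test suite against one browser."""
    time.sleep(0.1)  # stand-in for real test execution time
    return (browser, "passed")

browsers = ["chrome", "firefox", "safari", "edge"]

start = time.perf_counter()
# One worker per browser: the four suites run simultaneously, so the
# wall-clock time is roughly that of a single suite, not four.
with ThreadPoolExecutor(max_workers=len(browsers)) as pool:
    results = dict(pool.map(run_suite, browsers))
elapsed = time.perf_counter() - start
```

Run sequentially, the four stub suites would take at least 0.4 seconds; in parallel the wall-clock time stays near 0.1 seconds, which is the same effect the case study reports at the scale of hours.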

Issues and Decisions

Integrant approaches each project and each task within a project from an agile, values-based point of view. In particular, we bring to the table, and usually find in our clients, technical courage, clarity, candor, tenacity, and above all, respect-respect-respect. This means that when faced with challenges or problems, we see opportunities to make good choices. Here are a few of the issues Integrant and Company Y dealt with:

  • Working with different environments (e.g., MEAN stack and LAMP stack; Linux and Windows OS).
  • Working on different branches (e.g., staging development branch, QA shared development database, production database).
  • Dealing with complex business scenarios with insufficient documentation.
  • Test data that is very hard to prepare due to the nature of the application.
  • Overcoming the degraded performance of the old code by utilizing different automation coding techniques.
  • Verifying external resources (PDFs, CSVs, mail).
  • Parallelizing execution and optimizing code performance (e.g., run time, memory consumption, processor consumption).
  • Working against challenging automation code standards.
  • Integrating with a test case management tool, which in turn integrates with release and project management tools for automatic metrics generation.
  • Rapid changes in the code base of the application under test (overcome through faster feedback cycles, modular design, and reusable components).
  • Cross-browser handling.
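Verifying external resources, one of the challenges above, can be sketched for the CSV case with Python's standard `csv` module. The export content, header, and helper name are hypothetical; a real test would read the file the application actually produced:

```python
import csv
import io

# Hypothetical donations export, inlined so the sketch is self-contained.
EXPORT = """donor,campaign,amount
alice,flood-relief,25.00
bob,flood-relief,10.00
"""

def verify_donations_csv(text, expected_header=("donor", "campaign", "amount")):
    """Check the export's header and return (row count, amount total)."""
    rows = list(csv.reader(io.StringIO(text)))
    header, body = rows[0], rows[1:]
    assert tuple(header) == expected_header, "unexpected header"
    total = sum(float(row[2]) for row in body)
    return len(body), total

count, total = verify_donations_csv(EXPORT)
```

The same shape applies to PDFs and outgoing mail: parse the artifact with a suitable library, then assert on the recovered structure rather than on raw bytes.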

The Result

The implementation of test automation for Company Y produced the benefits predicted, including reduced time, increased speed, lower cost, and repeatability. A snapshot of “before” and “after” follows:

  • Before:
    • Regression effort by an experienced test engineer was estimated to consume 12 working hours for only basic scenarios on only one browser.
  • After:
    • The automated framework executes, collects, compares, and reports on 1 browser (or more across multiple machines) in 1 hour, and we are still optimizing.
    • Running the same regression manually on 4 browsers would consume 12 × 4 = 48 hours.
    • Automation without parallelization takes 4 hours; with the correct parallelization model it can be 1 hour.

Company Y is a client near and dear to our hearts both because of the service they provide and their commitment to their customers. The results we achieved together could not have been realized without an internal and external team that were working toward common goals in a like-minded technical and cultural environment. We look forward to the next chance to progress in lockstep with this industry leader.

About Integrant

We provide highly skilled teams of engineers that integrate seamlessly with your team and share the same passion and commitment that you have to deliver a quality product to your valued clients.

Full stack development: We specialize in .NET and JavaScript full stack development. While each of our developers specializes in the front end, the business logic, or the data model, we expect all our team members to be familiar with all three layers. In our experience this works better in an Agile environment: it supports flexibility, creates highly efficient teams, and allows for a higher level of collaboration throughout project lifecycles.

Testing automation: We have significant expertise working with clients to transition to and implement scripting. While we primarily use test scripts for regression testing, we’ve successfully designed and implemented reusable test scripts for functional and integration testing as well.

Software quality assurance: While it’s important for developers to understand why they’re building what they’re building, business and domain knowledge are the foundation for software quality assurance. We ensure our team members understand U.S. standards. We become an extension of your internal team—learning and adhering to your unique processes, protocols, regulations, and standards.