Testing Automation for Interdependent eCommerce Gifting Sites

While Supporting Legacy to Responsive Design Rebuild


Our client asked us to help automate testing for the checkout functionality of their platform, which supports six different eCommerce brands in the floral and gifting space. The client’s dev and test teams were top-notch, and we knew the project would be a rewarding challenge.

Automation testing objectives included:

  • A framework for handling different application flows with many integration points and multiple solutions, e.g., front end, services, and database (our focus would be the front end)
  • Handling multiple environments and browsers
  • Ease of running from a test suite/daily automatic runs
  • Ease of maintenance when changing HTML/DOM page structure
  • Enhanced separation of data from testing logic

Methodologies included:

  • Page object model
  • Data driven model
  • Generic in-page navigation method
  • Framework over Selenium handling the following:
    • Test runs, exception handling, and snapshots
    • Parsing data source template to be fed into test
    • Helpers, e.g., implicit and explicit waits, wait for page load/Ajax operations
    • Creating drivers depending on test configurations (Firefox, Chrome, IE, Presentation LFF, Presentation SFF)
    • Tests name spaced and categorized according to functionality
    • Support runs on different environments
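The page object and data-driven ideas above can be sketched in miniature. The project itself used Selenium with C#; the Python below is only an illustrative analogue, and the `CheckoutPage` class, its locators, and the `FakeDriver` stand-in are hypothetical, not the client's code.

```python
# Page object model sketch: selectors and page actions live in one class,
# so tests survive HTML/DOM changes by updating a single place.

class CheckoutPage:
    """Page object: locators are class data, actions are methods."""
    EMAIL_FIELD = ("id", "email")
    PLACE_ORDER = ("css", "button.place-order")

    def __init__(self, driver):
        self.driver = driver

    def enter_email(self, email):
        self.driver.type(self.EMAIL_FIELD, email)
        return self  # fluent style so test steps chain naturally

    def place_order(self):
        self.driver.click(self.PLACE_ORDER)
        return self.driver.confirmation_text()


class FakeDriver:
    """Minimal stand-in for a WebDriver so the sketch runs anywhere."""
    def __init__(self):
        self.typed = {}

    def type(self, locator, text):
        self.typed[locator] = text

    def click(self, locator):
        self.clicked = locator

    def confirmation_text(self):
        return "Order placed"


page = CheckoutPage(FakeDriver())
result = page.enter_email("user@example.com").place_order()
print(result)  # -> Order placed
```

Because the tests only talk to `CheckoutPage`, a change to the page's DOM structure touches the locators in one class rather than every test.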

A Project with Legs

From legacy code to responsive design: The test automation effort ran concurrently with a client-Integrant dev team effort to upgrade legacy checkout sites to responsive design across multiple brands, platforms, and devices. Building a QA automation solution for the legacy system first was a strategic decision. The dev team couldn’t safely start rebuilding the legacy system without certainty that they wouldn’t break existing behavior, and manually testing the many scenarios across different environments would take a tremendous amount of time. The test automation project was the backbone that ensured existing functionality remained intact while legacy code migrated to the new responsive system.

Hybrid platform: We worked with the client to review the pros and cons of Microsoft Coded UI test automation versus an open source framework like Selenium. The client decided on a hybrid framework: Selenium with C# for web navigation and the Microsoft Unit Test framework for writing and executing tests. This was a dynamic framework that used a page object model and a data driven model together.

Moving from legacy to responsive design meant that different form/screen factors and brands led to more complex data configurations and scenarios for testing. As a result, we took a data driven testing approach. The data driven model allowed us to feed test data to the test runner. The hybrid platform integrated data driven and keyword driven frameworks, so we could cover multiple sites and multiple views with dynamic URLs, dynamic data, and dynamic user handling. This supported various workflows, including user-specific data and different payment methods/data across all brands. In addition, the platform could prepare a report with snapshots of exceptions occurring during test runs, and it was extensible enough to cover additional functionality when needed.
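As a rough illustration of the data-driven model, a single test body can consume rows parsed from a data template, one row per brand/payment combination. Everything here (brand names, URLs, field names) is hypothetical; the actual framework was built in C# on the Microsoft Unit Test framework.

```python
# Data-driven sketch: one checkout test body, many data rows.
import csv
import io

# Hypothetical data template; real projects would load this from a file.
TEST_DATA = """\
brand,base_url,payment_method
flowers,https://flowers.example.com,credit_card
gourmet,https://gourmet.example.com,paypal
plants,https://plants.example.com,credit_card
"""

def load_rows(source):
    """Parse a CSV-style data template into dicts the runner feeds to tests."""
    return list(csv.DictReader(io.StringIO(source)))

def run_checkout_test(row):
    """One test body reused for every brand/payment combination."""
    url = f"{row['base_url']}/checkout"
    # ...here the real framework would drive the browser against `url`
    # and pay with row['payment_method']...
    return f"{row['brand']}: {row['payment_method']} via {url}"

results = [run_checkout_test(r) for r in load_rows(TEST_DATA)]
for line in results:
    print(line)
```

Adding a new brand or payment method then means adding a data row, not writing a new test.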

The testing framework focused primarily on testing the system from the user’s perspective. Our SQEs are ISTQB certified and cross-trained; internal Integrant experts train developers in QA and train quality engineers in automation. This cross-training helped us contribute in an environment that combined rebuilding legacy code with moving to responsive design.

Brand, team, and function interdependencies: Separate client teams owned different functions across multiple brands throughout the user journey. These teams needed to collaborate and avoid manually retesting the same thing. Integrant worked as part of the checkout functionality team, but the cart, backend, services, and other functions (all owned by different teams) served the platform and connected to checkout, so our work needed to integrate with all of them. If an integration was going to fail, we wanted it to “fail fast,” and QA automation would allow this.

For example, six different client brands, spanning gourmet foods, specialty gifts, and live plants and flowers, needed to integrate on customer email and billing authentication.

Also, whatever we automated within the checkout function needed to apply to all links within it including gift options, message, billing, sign in, delivery, payment, and review.

External entities: Our testing automation included two areas involving third party vendors. One was billing vendors such as PayPal and credit cards. The other was an address verification system (AVS) including automatic modification of user address. We designed our test automation so test data would integrate with these external entities.
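One way to keep test data integrated with, yet isolated from, such external entities is to route the address step through a swappable verifier. This Python sketch is hypothetical: the AVS behavior shown (uppercasing and abbreviating the street type) is invented for illustration, and a real suite would swap `fake_avs` for the vendor integration in environment-specific configuration.

```python
# Sketch of isolating checkout tests from a third-party address
# verification service (AVS) that may automatically modify user addresses.

def fake_avs(address):
    """Stand-in for the vendor AVS: returns a 'corrected' address."""
    corrected = address.upper().replace("STREET", "ST")
    return {"input": address, "corrected": corrected, "verified": True}

def checkout_address_step(address, avs=fake_avs):
    """Checkout accepts whichever address the AVS hands back."""
    result = avs(address)
    if not result["verified"]:
        raise ValueError("address rejected by AVS")
    return result["corrected"]

print(checkout_address_step("123 Main Street"))  # -> 123 MAIN ST
```

Because the verifier is injected, the same test body runs against the stub locally and against the real AVS in integration environments.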

Process improvement: The internal process changed from checking in code, deploying a build, and testing it manually, to checking in code, having an automated build deployed via TFS, and integrating with Microsoft Test Manager, including namespaces and categories. We both namespaced and categorized each test by functionality for ease of running the test suite. In this way we could accommodate multiple test teams interacting with multiple brands. We used the namespace to identify the QA team, e.g., the checkout team, and categories to identify the function, e.g., the cart function within checkout.
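The namespace-plus-category scheme can be mimicked in a few lines. This Python sketch is only an analogue of MSTest namespaces and `[TestCategory]` attributes; the names `CheckoutTeam`, `Cart`, `Payment`, and the test functions are hypothetical.

```python
# Sketch of tagging tests by team (namespace) and function (category),
# so a suite run can select exactly the subset it needs.

REGISTRY = []

def categorize(namespace, category):
    """Decorator that tags a test and registers it for suite selection."""
    def wrap(fn):
        fn.namespace, fn.category = namespace, category
        REGISTRY.append(fn)
        return fn
    return wrap

@categorize("CheckoutTeam", "Cart")
def test_add_item_updates_total():
    assert True  # placeholder body

@categorize("CheckoutTeam", "Payment")
def test_paypal_redirect():
    assert True  # placeholder body

def select(namespace=None, category=None):
    """Pick the tests matching a team and/or function; None matches all."""
    return [t for t in REGISTRY
            if namespace in (None, t.namespace)
            and category in (None, t.category)]

print([t.__name__ for t in select(category="Cart")])
# -> ['test_add_item_updates_total']
```

A daily run for the cart function across all brands then becomes `select(namespace="CheckoutTeam", category="Cart")` rather than a hand-maintained test list.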

Earning Trust

The client came to us with some ideas about the platform and tools they wanted to use for this project. They created the initial framework structure, but we saw plenty of opportunity to add value and further optimize performance, maintainability, and user experience. To position ourselves to contribute as a true partner, we knew we had to establish trust with the client’s esteemed team and stakeholders. We prioritized getting to know the client and understanding exactly what they wanted.

We started from scratch with the onsite team. At the beginning of the project, and for two weeks at the beginning of every 12-week block after that, we were onsite with the client. This ensured we started out and maintained a lock-step collaboration with the internal team including adopting their tools and communication processes.

The internal team led the initiative and then we started contributing. We started with a few mockups and then took over to implement real test cases. As the rebuild from legacy to responsive proceeded we recapped each user story and added existing test cases to be automated. We followed up with each user story to be sure it was fully automated.

Our priority was to do specifically what the client requested. Initially this involved a QA automation system that would allow rebuilding legacy code to responsive design with zero changes in functionality. We completed this request precisely as instructed in order to build trust with the internal team and all stakeholders. They needed to know their legacy code was safe with us.

Investing in Growth

The automated test framework maintained every single functionality of the legacy sites, without exception. This was the result of meticulous, exploratory and methodical examination, documentation, and cross-referencing.

At the same time, and independent of the task at hand, we wanted to know how we could improve the test automation process by enhancing and modifying legacy code that was weighing down advances in testing, such as refining the screen handling functionality. If a new feature were added, existing functionality would have to remain untouched, at least from the user experience perspective, so we started from the user’s point of view.

In rebuilding the legacy system, we were faced with multiple black box testing scenarios where we had no knowledge of the inner workings of what we were automating. We might know what the input was and what the expected outcome was, but not how the results were achieved. We knew that in order to write test automation cases down the road that would continue to garner the client’s trust and grow our relationship, we needed to understand all interdependencies.

To address this, our QA team manually tested all legacy system aspects within the responsive site development. This allowed them to learn how the legacy system worked, how it handled data, and how we should refine or restructure its UI to be responsive. The team went through every piece of legacy code for all six brands so they understood exactly how the legacy system worked.

Having converted legacy code to a responsive system, we then wanted to go back on our own time and look at converting the user experience within the legacy system to a “dream” experience for responsive users. We presented our ideas to the client in small increments after we had proven our capabilities in protecting their legacy code. We provided small proofs of concept for each small scenario. Each building block allowed us to grow together.

Value: When the client launched its responsive sites not only did they see zero loss of functionality, they saw an immediate and sizable uptick in orders. The old application included web forms, non-testable code, and lots of dependencies. The new application was responsive and configurable, and after the framework/tests matured, we started leveraging unit tests in the development cycle.


The client’s library of test cases now numbers close to 500. We started with 4 Integrant QAs and 2 client QAs; by the end of the project we were down to 2 Integrant QAs and 1 client QA. The new framework is very mature: a new method needs to be added only when a new feature is introduced.

Ongoing Partnership

As with many clients, at first this client considered us a separate team, culture, and country. Over time we earned the role of a full and trusted extension of their internal team. This took effort and commitment and was earned through results, not offered on blind faith. We understand our clients have too much at stake for blind faith.

Today this client and Integrant collaborate in multiple areas as true partners, growing together in knowledge, challenges, and successes. Our client trusts us to identify problems, propose solutions, and implement changes. We in turn continue to be inspired by their self-motivated and highly professional teams.