Beyond Testing – Data Automation

When starting with a new client, there is always a transitional period as you get ramped up on systems, policies, procedures, etc.

How you overcome this transitional period can set you up for long-term success or failure.

There can be a steep learning curve when gathering information on how the files that pass data from one system to another operate; understanding this is essential in order to mock the data for testing purposes. Sometimes existing documentation is sparse and resources for on-site training are limited. Teams often need to come up with a solution that works not only for the testing team but also for the business testers, allowing them to be more independent during the UAT (User Acceptance Testing) phase and freeing up resources that can be directed elsewhere.


One of the problems typically encountered during testing is identifying who is responsible for generating the data necessary to validate system changes. In one instance, a customer's systems received files from many external vendors in a variety of file formats, which meant many different types of files needed to be generated for testing. There was no dedicated resource available to create the files, the developers had neither the time budgeted in the project plan nor the dollars in the overall budget, and the business testers just wanted to have the necessary data. This provided an opportunity to show how the Olenick team could develop tools to automate areas of the testing arena beyond the usual automation of test scripts.


What the team quickly learned in this situation was that creating these files can be a headache due to their complexity. The activity pulled the testers' focus away from testing in order to create the files during SIT (System Integration Testing) and UAT.


Meeting the Challenge

The plan was to have each file documented – a difficult task in a fast-paced, evolving environment. Documentation alone would not be a sufficient solution because of the complexity of some of the files; it is not atypical for mocking a file to take more time than testing it. So the team needed an alternate solution.

How did the Olenick team solve this problem and provide better support? By creating a tool to take care of the file generation. By engaging our development group in Buenos Aires and following a standard SDLC process, the team created a Java-based file creation tool. Based on templates provided by the vendors, we developed an algorithm that evaluates the dropdown selections that define the output file format. After the user selects the inputs that determine the file format, a tab in the app is enabled displaying the fields to be included in the output file, as appropriate for the specific utility. Once the required inputs are entered, a validation process kicks off that performs format, length, and consistency checks. When validation completes, a custom file modeler process captures the input data, parses it, and writes the output file. The result is a properly formatted file for the selected utility that can be dropped into the client's system for testing.
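To make that flow concrete, here is a minimal, hypothetical Java sketch of the validate-then-write sequence described above. The field names, template values, and pipe-delimited layout are illustrative assumptions, not the client's actual vendor formats:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.regex.Pattern;

/** Hypothetical sketch of the validate-then-write flow; names are illustrative. */
public class MockFileGenerator {

    /** One field in a vendor template: name, max length, and an allowed-format pattern. */
    record FieldSpec(String name, int maxLength, Pattern format) {}

    /** Validates user input against the template: format, length, and consistency checks. */
    static List<String> validate(List<FieldSpec> template, Map<String, String> input) {
        List<String> errors = new ArrayList<>();
        for (FieldSpec spec : template) {
            String value = input.get(spec.name());
            if (value == null) {
                errors.add("Missing required field: " + spec.name());                 // consistency
            } else if (value.length() > spec.maxLength()) {
                errors.add(spec.name() + " exceeds max length " + spec.maxLength()); // length
            } else if (!spec.format().matcher(value).matches()) {
                errors.add(spec.name() + " does not match the expected format");     // format
            }
        }
        return errors;
    }

    /** "File modeler": renders validated input as a pipe-delimited record and writes it out. */
    static void writeOutputFile(List<FieldSpec> template, Map<String, String> input, Path out)
            throws IOException {
        List<String> ordered = template.stream().map(s -> input.get(s.name())).toList();
        Files.writeString(out, String.join("|", ordered) + System.lineSeparator());
    }

    public static void main(String[] args) throws IOException {
        // Illustrative template for one utility's format; real templates came from the vendors.
        List<FieldSpec> template = List.of(
                new FieldSpec("accountNumber", 10, Pattern.compile("\\d{1,10}")),
                new FieldSpec("meterReading", 8, Pattern.compile("\\d{1,8}")));

        Map<String, String> input = new LinkedHashMap<>();
        input.put("accountNumber", "123456789");
        input.put("meterReading", "4521");

        List<String> errors = validate(template, input);
        if (errors.isEmpty()) {
            writeOutputFile(template, input, Path.of("utility-test-file.txt"));
            System.out.println("Generated test file ready to drop into the client system.");
        } else {
            errors.forEach(System.out::println);
        }
    }
}
```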


The key to the overall process, and to getting a usable tool, was a well-defined scope. We had limited time and availability to deliver this to the client. So, to show that we could deliver, we began with a Proof of Concept (POC), limiting the tool to a select few file formats, and a few file types within each format, for three distinct utility companies. This demonstrated that the same tool could be leveraged for the differing formats that would need to be supported while keeping it simple for the end user.

Once the tool was delivered and validated, we could present it. We demonstrated how the tool worked, which led into highlighting its strengths:

  • No specific file knowledge needed – Anyone with access to the tool could generate the files they needed without knowing the details of the file or what had to be modified
  • Efficient use of resources – No resource-scheduling issues based on which files individual testers knew
    • No need to worry about what projects are being worked on or in the pipeline
    • No risk to other projects if the person who knew a particular file took vacation, left the company, or was otherwise unavailable
  • Flexibility – New file formats or file types could be added quickly, as could modifications to the logic within a file to support more testing scenarios (see the sketch after this list)
  • Speed – Files could be generated on demand as needed, reducing time spent in the testing phase and shifting testers' time from mocking up data to actual testing
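To illustrate the flexibility point, here is a small, hypothetical Java sketch of one way such extensibility can be structured: each vendor format sits behind a common interface, so supporting a new format means adding an implementation rather than reworking the tool. The interface, format ids, and field names are assumptions made for the example:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

/** Hypothetical sketch of the extensibility point: one implementation per vendor format. */
public class FormatRegistryDemo {

    /** Adding a new format or file type means adding one implementation of this interface. */
    interface FileFormat {
        String id();                               // value behind the user's dropdown selection
        String render(Map<String, String> fields); // turns validated input into file content
    }

    public static void main(String[] args) {
        // Two illustrative formats; the ids and field widths are made up for the example.
        List<FileFormat> formats = List.of(
                new FileFormat() {
                    public String id() { return "UTILITY_A_FIXED"; }
                    public String render(Map<String, String> f) {
                        return String.format("%-10s%-8s",
                                f.get("accountNumber"), f.get("meterReading"));
                    }
                },
                new FileFormat() {
                    public String id() { return "UTILITY_B_CSV"; }
                    public String render(Map<String, String> f) {
                        return String.join(",", f.values());
                    }
                });

        Map<String, String> input = new LinkedHashMap<>();
        input.put("accountNumber", "123456789");
        input.put("meterReading", "4521");

        // The tool would look up the implementation matching the user's dropdown choice.
        for (FileFormat fmt : formats) {
            System.out.println(fmt.id() + ": " + fmt.render(input));
        }
    }
}
```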


All these points help show why the tool could be incorporated into the workflow; the last point may be the best answer to why it should be.


When making the case for the tool, the fact that, by using it, enhancements can be done faster, with greater accuracy, and at a lower cost is an easy way to gain the support of senior management.


Because we had initially limited the tool's scope to prove it could be delivered quickly while still covering a fair share of testing needs, there would always be more features or scenarios to cover. What we presented was very well received: before we learned the different file types and could mock up the files ourselves, the burden of creating them had caused a big discussion and distraction on every project – who was going to own this activity, and at what cost to the project?


After the presentation, suggestions were made for future enhancements, as the advantages of the tool were abundant. The tool could be leveraged for workflows we had not initially foreseen; given the limited scope of the POC, this came as no surprise at initial delivery.


In conclusion, what we delivered could be used within multiple segments of the testing process and of a client's business. By understanding the client's needs and using Olenick's in-house resources to fulfill them, we aligned with the client's strategic goals of operational efficiency. This effort showed that we can do more than 'just testing' and serves as an example of our solution-oriented approach to client delivery.


Co-authored by Mary Jeter, Associate, Olenick & Associates (Chicago) and Mark Treilman, Senior Consultant, Olenick & Associates (Eastern Region)

