How to Overcome Common Testing Challenges for Data Integrity Testing

Lessons from a Major European Bank

Kayla Gillman
  • Senior Manager
  • TTC Global
  • Cincinnati, OH, USA

With financial service institutions overseeing millions of transactions a day – be it loan payments, debit card charges, or wire transfers – it’s no wonder that data integrity is such a critical part of the industry. Assuring customers that their financial data is safe builds their trust in you, an important part of both growing and maintaining your client base. But remember: with great trust comes great responsibility, hence the importance of data integrity testing.

In a webinar I hosted earlier this year with TTC Canada Client Services Manager Nisse Vaya, we discussed some of the common questions surrounding data integrity testing and how organizations can tackle them, drawing on a real-world case study of one of our clients, a major commercial bank based in Europe. In the following sections, I recap the key takeaways from the webinar and share additional data integrity insights.

What is Data Integrity Testing?

Data Integrity (DI) involves maintaining the accuracy and consistency of data as it flows through various systems and transformations. For large organizations, such as global banks, the accuracy of data directly impacts financial operations and customer satisfaction – which makes sense. If I’m a parent looking to transfer funds to a child at college, I want to know that my bank will be able to process that transaction quickly and securely. Similarly, major corporations whose accounts see thousands of transactions a quarter will want to find a financial services partner that they can trust. These corporations need error-free financial information to make sure their customers, stakeholders, and partners can, in turn, trust them. This is where data integrity comes in.

Data Integrity helps you maintain your clients’ trust by addressing the following questions:

  • Is my data correct?
  • Is my data consistent across my systems?
  • Was any data lost during transformations?
  • Did the unique data identifiers remain intact throughout the data’s life cycle?
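
Concretely, the checks behind these questions can be sketched in a few lines of Python. This is a simplified, hypothetical illustration (field names like `txn_id` are invented), not any particular tool’s implementation:

```python
# Minimal sketch of automated data integrity checks between a source
# system and a target system. Records and field names are hypothetical.

def check_integrity(source_rows, target_rows, key_field="txn_id"):
    """Return pass/fail results for three of the core DI questions."""
    source_keys = {r[key_field] for r in source_rows}
    target_keys = {r[key_field] for r in target_rows}

    by_key = lambda r: r[key_field]
    return {
        # Was any data lost during transformations? (completeness)
        "no_rows_lost": len(target_rows) >= len(source_rows),
        # Did the unique identifiers remain intact? (key integrity)
        "keys_intact": source_keys <= target_keys,
        # Is my data consistent across systems? (field-level comparison)
        "values_match": all(
            s == t
            for s, t in zip(sorted(source_rows, key=by_key),
                            sorted(target_rows, key=by_key))
        ),
    }

source = [{"txn_id": 1, "amount": 250.0}, {"txn_id": 2, "amount": 99.5}]
target = [{"txn_id": 2, "amount": 99.5}, {"txn_id": 1, "amount": 250.0}]
print(check_integrity(source, target))
```

Real engagements run comparisons like these across millions of rows and many systems, but the underlying questions stay the same.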

Common Data Integrity Testing Challenges in Large-Scale Processes

Data integrity testing in large-scale environments presents several challenges:

  • Handling millions of records and transactions.
  • Integrating data across various systems and environments.
  • Managing data transformations across systems.
  • Minimizing errors in manual testing processes.
  • Maintaining data privacy and regulatory compliance.

If you’re not on top of these areas, you can find yourself dealing not only with the testing implications (e.g., longer project timelines, higher costs, innumerable headaches) but with the business implications as well. Failure to safeguard your integrations, transformations, and maintenance can lead to costly data breaches, financial penalties, operational disruptions, and, of course, loss of customer trust (which can result in decreased sales – ouch!).

How to Clear Those Hurdles with Ease

Now that we’ve laid out some of those pesky challenges on paper, let’s talk about how to overcome them. While different solutions may be needed to help different teams meet their specific goals, the following sections describe how we approached this situation for a specific client.

Adopt an Agile Mindset

First and foremost, it’s incredibly helpful to adopt an agile mindset. As software testing teams shift away from waterfall, there’s a tendency to leave a smaller window for testing, which makes starting your testing earlier all the more attractive. In alignment with our client’s requirements, our approach was to leverage synthetically generated test data. Because we weren’t pulling data from production or waiting for customer data, we were able to start testing right away. Thanks to the system design we had in place, we could inject the generated data into the databases of the individual systems we connect to and work through the processes to identify any red flags.
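
As a rough sketch of what that injection step can look like, the snippet below generates synthetic transactions, loads them into a stand-in SQLite database, and immediately runs a red-flag check. All table and column names here are invented for illustration; a real engagement would target each system’s own test database:

```python
# Hypothetical sketch: inject synthetically generated transactions into a
# system-under-test database so testing can start before any production
# or customer data exists. Schema and names are invented.
import random
import sqlite3

def generate_transactions(n, seed=42):
    """Produce n synthetic (id, type, amount) rows, reproducibly."""
    rng = random.Random(seed)
    return [
        (i,
         rng.choice(["DEBIT", "WIRE", "LOAN_PAYMENT"]),
         round(rng.uniform(1, 5000), 2))
        for i in range(n)
    ]

conn = sqlite3.connect(":memory:")  # stands in for a system's test DB
conn.execute(
    "CREATE TABLE transactions (txn_id INTEGER PRIMARY KEY, "
    "type TEXT, amount REAL)"
)
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?, ?)",
    generate_transactions(1000),
)
conn.commit()

# Immediately look for red flags: no missing or non-positive amounts.
bad = conn.execute(
    "SELECT COUNT(*) FROM transactions "
    "WHERE amount IS NULL OR amount <= 0"
).fetchone()[0]
print(f"rows loaded: 1000, red flags: {bad}")
```

Seeding the generator keeps runs reproducible, so a red flag found today can be reproduced exactly tomorrow.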

Quality is Everyone’s Job

Many project teams I’ve come across have a desire to make sure that everyone on the project feels responsible for quality. While it may feel like that’s just a job for the testing team, the truth of the matter is, we're all working towards the same goal together. Emphasize this to help create a shared approach to quality.

Involving the wider team can increase overall buy-in and create a culture of quality and innovation. In an ever-evolving field like BFSI, that type of culture can have big impacts on how quickly your software adapts to external forces like regulation changes, new technology, or bans.

Buying or Building: What’s Best for Me?

A major bank we worked with recently had been considering whether it would be better to buy a new tool or build on what they had. There can be a benefit to building within your company, especially if your budget is tight. Even so, I always recommend that teams consider the whole picture: in-house development bakes in intellectual property, and it’s possible for somebody to come in, develop something for you, and then leave, taking their expertise with them. This approach can also require more time, so when your timelines are shorter, it may not be the best way forward.

Our banking client wanted a framework that was (1) readily available, (2) proven, and (3) usable out of the box without further development. This was a timesaver: testers could get to effective testing more quickly, and the no-to-low-code platform let them test at a wider scale. Not only does that mean more testers can be involved in the automation process; it also removes the development effort and hours. You could then spend that time and money tailoring the framework to your customers, maintaining it, and even expanding it over time.

When clients are looking for a reliable, out-of-the-box tool to introduce to their projects, we do the work to find one that fits their budget, their goals, and their existing systems. In this case, the tool we selected was Tricentis Data Integrity. It can connect to practically any data source and compare those records at scale. Some of the most recent metrics I’ve seen for Tricentis DI show it testing 400,000 rows of data per minute with a 95% automation rate. Pretty great stats. The tool also provides immediate feedback on what your data is doing in your test system early in the process.

We complemented Tricentis DI with a tool called Benerator by Rapidweller, a company out of Germany. Benerator allows you to generate, migrate, anonymize, or pseudonymize data. Because of this, we were able to create synthetic test data, add it to the test system, and run our tests immediately. This helped our client test more effectively by bringing more people into the testing process. An added bonus was safeguarding sensitive information: by generating synthetic test data, we could test earlier while avoiding any confidential customer data.
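
To illustrate the pseudonymization idea in general terms (this is plain Python, not Benerator’s own descriptor syntax), the sketch below replaces sensitive fields with stable keyed hashes, so records stay realistic and linkable across systems without exposing real customer data. The field names and secret key are invented:

```python
# Generic pseudonymization sketch: sensitive fields are replaced with
# stable, keyed hash tokens. The same input always yields the same token,
# so cross-system joins still work, but the original value is not exposed.
import hashlib
import hmac

SECRET_KEY = b"test-env-only-key"  # hypothetical per-environment secret

def pseudonymize(record, sensitive_fields=("name", "iban")):
    """Return a copy of record with sensitive fields replaced by tokens."""
    out = dict(record)
    for field in sensitive_fields:
        digest = hmac.new(SECRET_KEY, str(record[field]).encode(),
                          hashlib.sha256)
        out[field] = digest.hexdigest()[:12]  # short, stable token
    return out

customer = {"customer_id": 7, "name": "Jane Doe",
            "iban": "DE89370400440532013000"}
masked = pseudonymize(customer)
print(masked["customer_id"], masked["name"])
```

Because the tokens are deterministic for a given key, unique identifiers remain intact through transformations, which is exactly what DI testing needs to verify.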

Know and Empower Your Team

This client I was telling you about? They were automation champs. Before TTC Global came in, they already had a highly technical automation team who knew what transformation goals they had in mind. This isn’t always the case. Within different organizations and even different departments, there can be skills gaps and learning curves that hinder your data integrity testing efforts. Honest conversations and assessments of your team’s capabilities can start the discussion on how to upskill your people and what support you need to go from good to great.

Our consultants work closely with our clients, training their internal teams where needed, running tests if asked, and really acting as a partner to fill in the gaps that may exist. Ultimately, we work with our clients to make sure they’re achieving their testing goals. By offering our expertise and training your staff, we’re able to empower our clients, not just check boxes off our to-do lists. Even from an internal standpoint, making sure to embed yourself within your team to address questions, provide feedback, and teach best practices goes a long way in supporting your DI journey.

How Will Addressing These Challenges Benefit Me?

At the end of the day, we all want the work we do to garner positive results. It’s why we invest so much in our people, our applications, and our processes. Naturally, these results vary by company, project, and so on. But in our experience, taking a quality-first approach to data integrity can greatly improve your outcomes. Over the course of our engagement, our banking client saw some great results:

  • A 200% increase in automation coverage. Starting from the very slim amount of automation initially in place on a small number of systems, we were able to increase coverage substantially right away.
  • 50% of defects found in earlier testing phases. The earlier you find defects, the earlier they can be addressed, lowering the impact and cost of fixing them. So that’s always good.
  • A 35% increase in time savings for regression testing. So while they’re saving time, they’re also decreasing their risk.

I’m really proud of this engagement because our team placed so much focus on doing it the right way the first time. We offer clients our knowledge so that they don’t have to stumble through and figure out everything on their own. We’re able to help them do things the right way the first time, every time, so they can get to their desired goals faster and with confidence.

What Next?

With all of that said, if there are any questions about how to confront the specific challenges at your organization, I'd be more than happy to have a conversation and discuss a little bit more about how TTC Global can help. Feel free to connect with me on LinkedIn or reach out to my email: kayla.gillman@ttcglobal.com.

Together, we can achieve excellence in data integrity!