Test Data Management and Data Masking

Around 2010, a customer of mine built data governance tools and policies for their production database. They worked hard to secure their production data and systems from the prying eyes of hackers and the treacherous fingers of unethical software programmers.

They did, however, leave their development and test database systems relatively unsupervised, with little to no data controls in place. Worse, they copied production data into development. Then there was a breach in their development environment (the one containing production data). Millions of records containing credit card and customer information were stolen in the incident, and a multi-million dollar fine followed.

Part of the problem stemmed from the customer's absence of a simple system for managing test data and data masking in a cost-effective and efficient manner. They had no way of keeping that information current. They lacked the capacity to conceal and refresh datasets in real time. They did the bare minimum and paid a high price for it. In their defense, test data management has a long history of problems.

Traditional Challenges with Test Data Management (TDM)

Understanding Test Data Management is critical to determining how to best apply the right tool. TDM presents numerous challenges, but they all fall into the following categories:

  • For software development teams, environment provisioning is a slow, manual, and high-touch process.
  • Data masking for high-quality data is increasingly important, yet it adds friction to release cycles.
  • Test data requirements and storage costs keep growing.

To overcome these issues, IT organizations must implement the tools and processes necessary to efficiently make the appropriate test data available to project teams. A thorough strategy should aim to improve TDM in the following areas:

  • Data distribution: shortening the time it takes to deliver test data
  • Data quality: meeting the requirements for high-fidelity test data
  • Data security: reducing security risks without sacrificing agility
  • Infrastructure costs: reducing the cost of storing and preserving test data
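One common tactic behind the last of those areas is subsetting: delivering a small, referentially consistent slice of production-shaped data instead of a full copy. A minimal sketch of the idea (the table shapes and sample fraction here are illustrative, not from any particular TDM product):

```python
import random

def subset(rows, fraction=0.01, seed=42):
    """Return a reproducible sample of rows to shrink a test dataset."""
    rng = random.Random(seed)
    k = max(1, int(len(rows) * fraction))
    return rng.sample(rows, k)

def related(child_rows, parent_subset, key):
    """Keep only child rows whose foreign key points into the sampled parents,
    so the subset stays referentially consistent."""
    parent_ids = {p["id"] for p in parent_subset}
    return [r for r in child_rows if r[key] in parent_ids]

# Illustrative "production-shaped" data
customers = [{"id": i} for i in range(10_000)]
orders = [{"id": i, "customer_id": i % 10_000} for i in range(50_000)]

customer_sample = subset(customers)                       # 1% of customers
order_sample = related(orders, customer_sample, "customer_id")
print(len(customer_sample), len(order_sample))  # 100 500
```

Because the sample is seeded, every team gets the same small dataset on every refresh, which keeps tests deterministic while cutting storage to a fraction of a full production copy.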

The article "What is test data management?" provides a more in-depth examination of the principles of test data management.

The SQL Distributed Challenge

TDM-focused software vendors have been helping enterprises design, deploy, and maintain secure DevOps pipelines with "real world" data WITHOUT delivering real production data. The rise of Distributed SQL databases, however, caught TDM firms off guard: Distributed SQL vendors all attempt to address the same core concerns of scale, latency, data resiliency, and ANSI SQL language support.

Use cases for Test Data Management

Test Data Management is most often used to generate non-production environments from production sources.

The primary use cases are:

  • Data Masking for Sensitive Information
  • Application development
  • Migration of legacy applications

Data Masking for Sensitive Information

Data restrictions are becoming more stringent. Countries and states all across the world are enacting privacy laws that apply broadly to corporations. Concerns about ransomware are growing by the day. There is always the risk of data leakage to bad actors.
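As a minimal illustration of the masking idea (not any particular vendor's implementation), a masking step might replace all but the last four digits of a card number and deterministically pseudonymize direct identifiers before data leaves production. The field names and salt below are hypothetical:

```python
import hashlib

def mask_card_number(pan: str) -> str:
    """Replace all but the last four digits of a card number with '*'."""
    digits = pan.replace(" ", "").replace("-", "")
    return "*" * (len(digits) - 4) + digits[-4:]

def pseudonymize(value: str, salt: str = "test-env") -> str:
    """Deterministically hash an identifier; the same input always maps to
    the same token, so joins across masked tables still work."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

record = {"name": "Alice Example", "card": "4111 1111 1111 1111"}
masked = {
    "name": pseudonymize(record["name"]),
    "card": mask_card_number(record["card"]),
}
print(masked["card"])  # ************1111
```

Deterministic pseudonymization is what lets masked datasets remain useful for testing: referential integrity survives even though the original values never leave production.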

Application development

Data to build and test against is critical for both new and existing applications, and existing production systems are the most important source of that data.

Virtualization and masking features provide a potent combination for delivering an integrated TDM solution that uses compliant production data.

Migration of legacy applications

In today's world, software delivery speed is important. Every business strives to increase efficiency and job quality in order to compete in the business domain.

To take advantage of newer platforms, legacy apps that are incompatible with them must be migrated. Continued use of outdated software results in costly support and maintenance, yet legacy application migration is frequently complex and time consuming. It is an iterative procedure that must be repeated several times until it is right; to achieve the desired outcome, you must often redo the entire migration, at a significant cost in labor and time.


Tanya Marten