May 03, 2022

What is ETL Testing?

ETL testing ensures that all data processed using ETL is fully extracted, appropriately transformed, and loaded in the correct format.

Lilyana Dimitrova, QA Specialist

In today's world, companies gather and analyze data from multiple sources. That data is then processed with various BI tools to glean valuable business insights, or stored for later use in a data warehouse. Whatever a business does with its data, ETL testing ensures an effective and efficient ETL process.

What is ETL Testing?

Before diving into ETL testing, we first have to understand what ETL is. ETL stands for extract, transform, and load. In the ETL process, data is first extracted from the data source, transformed into a standard format for further analysis, and finally loaded into a shared storage location such as a data warehouse. A properly designed ETL system extracts data from the source systems, enforces data quality and consistency standards, and conforms the data so that separate sources can be used together. Finally, it delivers the data in a ready-to-use format that allows developers to build applications end users can rely on to make decisions.

ETL testing ensures the quality of the ETL process. It allows us to validate and verify the entire process to prevent data loss and redundancy.

As more and more businesses move their technical infrastructure to the cloud, they need to migrate data from legacy systems to their new cloud environments. The best way to do so is with ETL processes and tools. Organizations may have legacy data sources, such as an RDBMS or a data warehouse (DW), that lack performance and scalability. Migrating data to the cloud drastically improves performance, scalability, fault tolerance, and recovery.

ETL Testing: Extracting

The extracting phase of the ETL process entails pulling data from one or more data sources.
If done incorrectly, the rest of the process is moot, so extraction is arguably the most critical part of the entire process. In many data warehousing projects, data from multiple source systems is combined, and each source may use a different format or data organization. Common data-source formats include relational databases, XML, JSON, and flat files, but sources may also include non-relational database structures such as information management systems (IMS). Additionally, we can extract data from other data structures, such as Virtual Storage Access Method (VSAM) and Indexed Sequential Access Method (ISAM) files, or from outside sources using web spidering or screen-scraping techniques. If there's no need for intermediate storage, the extracted data can be loaded directly into the target database.

A considerable part of extracting data is validating that the data contains the expected values for a given domain, such as a list of allowed values or a pattern. If it doesn't, the data is fully or partially rejected and reported back to the source for further analysis and rectification.

ETL Testing: Transforming

During transformation, a set of rules and functions is applied to the data in preparation for loading into the target destination. A critical piece of the transformation step is cleansing the data so that only good data gets loaded into the data warehouse. This can be a challenge when various systems interact and communicate with one another, as the available character sets for each system might differ.

Testing data transformation is somewhat complex, as it usually cannot be achieved by writing a single SQL query and comparing the output with the target. To test a transformation, you may have to write multiple SQL queries for each row to verify the transformation rules.
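To make this concrete, here is a minimal sketch of a row-by-row transformation check. The table names, columns, and the transformation rule ("full name is the upper-cased concatenation of first and last name") are all hypothetical, chosen only to illustrate re-deriving the expected value from the source and comparing it with what the ETL job actually loaded:

```python
import sqlite3

# Hypothetical schema: a source table and a target table produced by an
# ETL job whose rule is "full_name = UPPER(first || ' ' || last)".
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers_src (id INTEGER, first TEXT, last TEXT);
    CREATE TABLE dim_customer (id INTEGER, full_name TEXT);
    INSERT INTO customers_src VALUES (1, 'Ada', 'Lovelace'), (2, 'Alan', 'Turing');
    -- Simulated ETL output, with a deliberate defect in row 2:
    INSERT INTO dim_customer VALUES (1, 'ADA LOVELACE'), (2, 'Alan Turing');
""")

# Re-derive the expected value from the source and compare it with the
# loaded value, row by row, reporting each mismatch as a defect.
defects = conn.execute("""
    SELECT s.id, UPPER(s.first || ' ' || s.last) AS expected, t.full_name AS actual
    FROM customers_src s
    JOIN dim_customer t ON t.id = s.id
    WHERE t.full_name IS NOT UPPER(s.first || ' ' || s.last)
""").fetchall()

for row_id, expected, actual in defects:
    print(f"row {row_id}: expected {expected!r}, got {actual!r}")
```

In practice you would write one such query per transformation rule, which is why the test suite for this stage grows quickly.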
Testing during this stage consists of some or all of the following:

- Testing that the data is transformed correctly according to business rules (this by itself could cover many unique test cases)
- Testing that consecutive pipeline runs behave as expected
- Validating the accurate processing of ETL-generated fields, e.g., surrogate keys
- Validating that the data types within the warehouse match what the data model or design specified
- Creating data scenarios between tables that test referential integrity

The type and number of tests vary and depend on a few things: the type of dimension (Type 3 might require the most test cases) and the business rules (and thus, quality rules) set for the dimension.

ETL Testing: Loading

The third and final step in the ETL process is loading the data into its target destination. This target can be any datastore, from a delimited flat file to an entire data warehouse. Depending on the business requirements, there's a lot of variation in the loading step. In some cases, the data warehouse might overwrite existing data with newly extracted data daily, weekly, or monthly; in others, new data is regularly added in a historical format. To understand this, consider a data warehouse that maintains sales records for the prior year: any data older than a year is overwritten, while new data is appended as history. The timing and scope of replacement are strategic design choices that depend on business needs and availability. More complex systems maintain a history and audit trail of all changes to the data.

As the data is loaded into the database, the constraints defined in the database schema and in any triggers activated upon data load (for example, uniqueness, referential integrity, mandatory fields) are applied, and these also contribute to the overall data quality performance of the ETL process.
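The constraint enforcement described above can be sketched with a small example. The tables and the foreign-key rule here are hypothetical; the point is that with constraints enabled, the load itself rejects rows that would break referential integrity, and a load test can assert exactly which rows were rejected:

```python
import sqlite3

# Hypothetical warehouse fragment: a fact table whose rows must reference
# an existing dimension row. With foreign keys enforced, the load rejects
# orphan records, which is what a load test should confirm.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT NOT NULL);
    CREATE TABLE fact_sales (
        sale_id    INTEGER PRIMARY KEY,
        product_id INTEGER NOT NULL REFERENCES dim_product(product_id),
        amount     REAL NOT NULL
    );
    INSERT INTO dim_product VALUES (1, 'Widget');
""")

rejected = []
for sale in [(100, 1, 9.99), (101, 2, 4.50)]:  # product_id 2 does not exist
    try:
        conn.execute("INSERT INTO fact_sales VALUES (?, ?, ?)", sale)
    except sqlite3.IntegrityError:
        rejected.append(sale)  # constraint violation caught at load time

print("rejected rows:", rejected)
```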
8 Stages of ETL Testing

When done correctly, ETL testing detects data source problems early on and avoids wasting time loading poor-quality data into the data warehouse. It also identifies inconsistencies in the business rules governing how the data is transformed and integrated. There are eight distinct stages in the ETL testing process.

I. Identify Business Requirements

To fully understand what you're working with, it's essential to identify the business requirements you're trying to meet. To do this, start by designing the data model and defining the business flow, then move on to assessing your reporting needs. Determining this information upfront ensures a clearly defined project scope that is fully documented and understood by all testers on the project.

II. Validate Data Sources

The next stage is validating your data sources. Start by performing a data count check and verifying that the table and column data types meet the specifications of the data model you designed in stage one. Then double-check that all keys are in place, and eliminate redundancy by deleting duplicate data. If this stage isn't done correctly, the aggregate report could be inaccurate or misleading.

III. Design Test Cases

Stage three sees us designing ETL mapping scenarios, creating our SQL scripts, and defining the transformation rules. To ensure that the mapping document has all the necessary information, validate it during this stage.

IV. Extract Data from Source Systems

Stage four entails testing the first step of the ETL process: extracting the data from the various data sources. Execute the ETL tests per the business requirements, and identify and report the types of bugs or defects you encounter. Before moving on to stage five, every defect must be detected, reported, fixed, resolved, and its bug report closed.

V. Apply Transformation Logic

During stage five, transformation logic is applied to the ETL process.
The data needs to match the schema of the target data warehouse, so be sure to check the data threshold and alignment before validating the data flow. Doing so ensures that the data type of each column and table matches the mapping document.

VI. Load Data into Target Warehouse

Stage six entails loading all the data into its target location, first by creating a staging environment and then by moving the data into the data warehouse. Before and after doing so, perform a record count check, and confirm that the warehouse rejected all invalid data and that default values were accepted.

VII. Summary Report

Stage seven is all about the summary report. Before generating the report, verify its layout, filters, functionality, and the options you want to include. Since this report makes the ETL test digestible for decision-makers and stakeholders, it's essential to tailor it to the information they need to make informed decisions.

VIII. Test Closure

With the summary report completed, the only thing left to do is file a test closure and proceed with the ETL process.

Tags: Quality Assurance, Data & Analytics, Data Development & Testing
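The before-and-after record count check from stage VI can be sketched as follows. The staging, warehouse, and reject tables here are hypothetical; the idea is simply that every staged row must be accounted for, either loaded into the warehouse or explicitly rejected:

```python
import sqlite3

# Hypothetical staging and warehouse tables. A record count check reconciles
# rows in staging against rows loaded plus rows rejected, so nothing is
# silently dropped on the way into the warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dw_orders (order_id INTEGER, amount REAL);
    CREATE TABLE rejected_orders (order_id INTEGER, reason TEXT);
    INSERT INTO staging_orders VALUES (1, 10.0), (2, -5.0), (3, 7.5);
    -- Simulated load: negative amounts are diverted to the reject table.
    INSERT INTO dw_orders SELECT * FROM staging_orders WHERE amount >= 0;
    INSERT INTO rejected_orders
        SELECT order_id, 'negative amount' FROM staging_orders WHERE amount < 0;
""")

def count(table):
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

staged = count("staging_orders")
loaded = count("dw_orders")
rejected = count("rejected_orders")

# Every staged row must be accounted for: loaded or explicitly rejected.
assert staged == loaded + rejected, f"{staged} != {loaded} + {rejected}"
print(f"staged={staged} loaded={loaded} rejected={rejected}")
```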