Among the most important phases of DW/BI testing is front-end report testing. The DW/BI project team should ensure that everything tested in the back-end DW and ETLs is now processed and displayed correctly in end-user reports.
BI reports are often the most visible product of any DW/BI system. Data quality defects that escape report verification risk jeopardizing the credibility of the entire DW/BI solution.
In the context of DW/BI, the report testing phase is often focused less on user-interface functions and logic flows and more on the quality of the data. For testing purposes, it is usually challenging to separate the DW/BI back-end and front-end testing efforts. It is crucial to understand the entire flow of data from source to reports so that testers can better understand and think critically about the information a report displays.
First, testing reports means that testers must be flexible in adapting to the different technologies in which BI reports are delivered on their project. Technologies range from vendor tools to fully custom, in-house development, or something in between, such as a vendor tool customized in-house to meet business-specific needs. This means that sometimes the tester is not only testing the report, but also struggling with the environment and tools where the report runs.
Testing BI reports should not be isolated from ETL/DW testing; report testers typically need deep knowledge of the processes that feed data to end-user reports. They often need to backtrack errors to their origin, e.g., the source data, or the staging and transformation steps that run before the load to the DW.
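One way to start that backtracking is to reconcile a figure shown on the report against a query run directly on the warehouse. A minimal sketch follows, using an in-memory SQLite database as a stand-in for the DW; the table name `fact_sales`, its columns, and the report value are all assumptions for illustration.

```python
# Hypothetical reconciliation check: compare a report total against the DW.
import sqlite3

# In-memory stand-in for the warehouse; schema and data are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_sales (region TEXT, sales_amount REAL)")
conn.executemany(
    "INSERT INTO fact_sales VALUES (?, ?)",
    [("East", 100.00), ("East", 250.50), ("West", 75.25)],
)

# Value displayed on the BI report (normally captured via export or scraping).
report_total = 425.75

(dw_total,) = conn.execute("SELECT SUM(sales_amount) FROM fact_sales").fetchone()

# A mismatch tells the tester where to backtrack: first the report query,
# then the DW load, then staging transformations, then the source data.
assert abs(dw_total - report_total) < 0.01, f"report {report_total} != DW {dw_total}"
```

The same pattern extends to any report figure: pair each displayed value with an independent query against the layer immediately upstream of it.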
Categories of Quality Assurance for BI Report Testing
In the end, it’s important to consider that report testing is about giving the end user the right information at the right time, and in the correct format. Consider a focus on the following topics for best results:
- Base and derived data correctness (e.g., calculations)
- Data aggregation accuracy (e.g., totals and sub-totals)
- Entity attribute hierarchies (e.g., data entry points and beyond)
- Report layout (e.g., usability)
- Prompts (e.g., invalid entries) and filters (e.g., cascade filtering)
- Summarized and aggregated data
- Table/chart formatting (e.g., rounding, decimal places) and naming conventions
- Drilling, sorting, and export functions (e.g., export to Excel)
- Web browser compatibility
- Linking and traceability correctness among tables/data entities
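Several of the categories above, particularly aggregation accuracy and table formatting, can be checked with simple assertions. The sketch below verifies that a report's sub-totals add up to its grand total; the row data, field names, and totals are invented for illustration.

```python
# Hypothetical aggregation check: sub-totals on a report should sum to the
# grand total the report displays. Values here are illustrative only.
report_rows = [
    {"region": "East", "subtotal": 350.50},
    {"region": "West", "subtotal": 75.25},
]
report_grand_total = 425.75

computed = sum(row["subtotal"] for row in report_rows)

# Rounding and decimal-place differences (see formatting above) are a common
# source of off-by-a-cent failures, hence the small tolerance.
assert abs(computed - report_grand_total) < 0.01, (
    f"sub-totals sum to {computed}, report shows {report_grand_total}"
)
```

A tolerance tighter than the report's displayed precision would flag legitimate rounding as a defect, so the threshold should match the report's formatting rules.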
And, given the criticality and high visibility of reports within the enterprise, report testing should also include a robust focus on delivery performance and security.
Conclusions
These suggested categories of DW/BI testing rigor frequently require additional effort and skilled resources. However, by employing the methods and processes described here, DW/BI teams can better assure the quality of both their implementation process and their data. Doing so builds confidence within the end-user community and ultimately leads to more effective DW/BI implementations.
A few automated DW/BI testing tools have become available to support the entire development lifecycle. Consider the best of these, particularly for end-to-end, data integration, data profiling, and regression testing.
Testing data warehouse and BI applications requires superior testing skills as well as active participation in requirements gathering and design phases. Moreover, in-depth knowledge of DW/BI concepts and technology is crucial for understanding end-user requirements and therefore contributing to a reliable, efficient, and scalable design.