Abstract:
|
ETL testing techniques are nowadays widely used in the data integration process. These techniques make it possible to check whether the information loaded into the data warehouse has correctly followed all the transformation steps. Because errors may occur during the extraction, transformation and load stages, they must be monitored and handled, as they can cause severe data quality issues in the data warehouse. This thesis is based on the information gathered from a previous project performed at UPC. The main goal of that project was to help the UPC professors who teach the "Database" course to better analyze the performance of their students. To this end, an ETL system was implemented to extract student information from multiple sources and to transform and load it into the data warehouse. This information includes the students' personal data, the exercises they perform, etc. The initial ETL design was based on the creation of the data warehouse schema containing the main dimensions and fact tables. The main issue is that it did not provide any monitoring or error handling functionality, even though the system generated several errors every time the ETL was executed. The steps I have followed while working on this thesis project have been to model the initial ETL process using a BPMN representation, to include error handling and monitoring functionalities, and ultimately to redesign the initial ETL processes using a chosen tool. Although the initial processes were modelled using Pentaho Kettle, the new requirements regarding error handling and monitoring capabilities made it necessary to perform a comprehensive ETL tool comparison to determine which tool best meets the requirements of this project.