Friday, 17 January 2014

What is ETL TESTING?


ETL essentially stands for Extract, Transform, Load - which simply means the process where you extract data from source tables, transform it into the required format based on certain rules, and finally load it into target tables.
There are numerous tools that help you with the ETL process - Informatica and Control-M being a couple of notable ones.

So ETL Testing means testing this complete process, either using a tool or at table level, with the help of test cases and a Rules/Mapping document.



In ETL Testing, the following are validated -
1) File loads from the source system onto source tables.
2) The ETL job that is designed to extract data from source tables and then move it to staging tables. (Transform process)
3) Data validation within the staging tables to check that all Mapping Rules / Transformation Rules are followed.
4) Data validation within the target tables to ensure data is present in the required format and there is no data loss from source to target tables.
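The fourth check above - no data loss from source to target - can be sketched as a simple row-count reconciliation. This is a minimal illustration in Python using SQLite; the table names `src_customers` and `tgt_customers` are hypothetical, and a real ETL test would compare actual warehouse tables and usually column-level data as well:

```python
import sqlite3

# Hypothetical in-memory source and target tables for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_customers (id INTEGER, name TEXT)")
conn.execute("CREATE TABLE tgt_customers (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO src_customers VALUES (?, ?)",
                 [(1, "Alice"), (2, "Bob"), (3, "Carol")])
conn.executemany("INSERT INTO tgt_customers VALUES (?, ?)",
                 [(1, "Alice"), (2, "Bob"), (3, "Carol")])

def validate_counts(conn, src, tgt):
    # Row-count reconciliation: a basic test for data loss
    # between source and target tables.
    src_count = conn.execute(f"SELECT COUNT(*) FROM {src}").fetchone()[0]
    tgt_count = conn.execute(f"SELECT COUNT(*) FROM {tgt}").fetchone()[0]
    return src_count == tgt_count

print(validate_counts(conn, "src_customers", "tgt_customers"))  # True
```

In practice such checks are written as test cases driven by the Mapping document, one per source-to-target rule.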


Extract
In this step we extract data from different internal and external sources, structured and/or unstructured. Plain queries are sent to the source systems, using native connections, message queuing,

ODBC or OLE-DB middleware. The data is placed in a so-called staging area (SA), often with the same structure as the source. In some cases we want only the data that is new or has been changed, so the queries return only the changes. Some tools can do this automatically, providing a changed data capture (CDC) mechanism.
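A common way to extract only new or changed rows, when a full CDC tool is not available, is to filter on a last-modified timestamp column. A minimal sketch, assuming a hypothetical `orders` table with an `updated_at` column:

```python
import sqlite3

# Hypothetical source table with a last-modified timestamp per row.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, updated_at TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", [
    (1, 100.0, "2014-01-10"),
    (2, 250.0, "2014-01-15"),
    (3, 75.0,  "2014-01-16"),
])

def extract_changes(conn, last_run):
    # Incremental extract: return only rows modified since the
    # previous ETL run, instead of re-reading the whole table.
    return conn.execute(
        "SELECT id, amount, updated_at FROM orders WHERE updated_at > ?",
        (last_run,),
    ).fetchall()

changed = extract_changes(conn, "2014-01-14")
print(changed)  # only orders 2 and 3, modified after the last run
```

The timestamp of the current run is then stored, to serve as `last_run` for the next extract.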

Transform
Once the data is available in the staging area, it is all on one platform and in one database. So we can easily join and union tables, filter and sort the data using specific attributes, pivot to another structure and make business calculations. In this step of the ETL process, we can check data quality and cleanse the data if necessary. After having all the data prepared, we can choose to implement slowly changing dimensions. In that case we want to keep track, in our analysis and reports, of attributes that change over time, for example a customer who moves from one region to another.
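The join, cleanse and business-calculation steps can be sketched in one query over the staging area. The staging tables `stg_customers` and `stg_sales` here are hypothetical; the transform trims stray whitespace, defaults a missing region, and aggregates sales per customer:

```python
import sqlite3

# Hypothetical staging tables, already loaded by the extract step.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_customers (id INTEGER, name TEXT, region TEXT)")
conn.execute("CREATE TABLE stg_sales (customer_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO stg_customers VALUES (?, ?, ?)",
                 [(1, "  Alice ", "North"), (2, "Bob", None)])
conn.executemany("INSERT INTO stg_sales VALUES (?, ?)",
                 [(1, 100.0), (1, 50.0), (2, 75.0)])

def transform(conn):
    # Join the staging tables, cleanse the data (trim names, default a
    # missing region), and compute a business measure per customer.
    return conn.execute("""
        SELECT TRIM(c.name) AS name,
               COALESCE(c.region, 'UNKNOWN') AS region,
               SUM(s.amount) AS total_sales
        FROM stg_customers c
        JOIN stg_sales s ON s.customer_id = c.id
        GROUP BY c.id
        ORDER BY c.id
    """).fetchall()

print(transform(conn))
# [('Alice', 'North', 150.0), ('Bob', 'UNKNOWN', 75.0)]
```

Since everything sits in one database at this point, these rules stay declarative SQL, which is exactly what the Mapping/Transformation Rules document describes and what ETL Testing verifies.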

Load
Finally, data is loaded into a central warehouse, usually into fact and dimension tables. From there the data can be combined, aggregated and loaded into data marts or cubes as deemed necessary.
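The load into fact and dimension tables can be sketched as follows. This is a simplified illustration with hypothetical tables `dim_customer` and `fact_sales`: each incoming row either reuses or creates a dimension member, and the fact row stores the dimension's surrogate key rather than the business name:

```python
import sqlite3

# Hypothetical star-schema targets: one dimension and one fact table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer "
             "(customer_key INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE TABLE fact_sales (customer_key INTEGER, amount REAL)")

def load(conn, rows):
    # rows: (customer_name, amount) tuples produced by the transform step.
    for name, amount in rows:
        hit = conn.execute(
            "SELECT customer_key FROM dim_customer WHERE name = ?",
            (name,)).fetchone()
        if hit is None:
            # New dimension member: insert it and take the
            # auto-generated surrogate key.
            cur = conn.execute(
                "INSERT INTO dim_customer (name) VALUES (?)", (name,))
            key = cur.lastrowid
        else:
            key = hit[0]
        # Fact rows reference the dimension by surrogate key.
        conn.execute("INSERT INTO fact_sales VALUES (?, ?)", (key, amount))

load(conn, [("Alice", 150.0), ("Bob", 75.0), ("Alice", 20.0)])
print(conn.execute("SELECT COUNT(*) FROM dim_customer").fetchone()[0])  # 2
print(conn.execute("SELECT COUNT(*) FROM fact_sales").fetchone()[0])    # 3
```

A Load-phase test would then verify the surrogate-key relationships and confirm that every transformed row landed in the fact table with no loss.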

Author:
Prime Online Training is established to provide quality ETL Testing online training in India and across the globe. The institute has excellent faculty for online training in various courses such as Hadoop, SAP (all modules), Data Warehousing, .NET and so on.