ETL Data Warehousing Interview

What are the various methods of getting incremental records or delta records from the source systems?
Incremental (delta) records are usually extracted by filtering the source on a last-modified timestamp or date column, with the time of the last extraction tracked in a mapping variable; change data capture (CDC) at the source is another option. Once only the changed rows are brought in, Informatica's incremental aggregation feature can update existing aggregates instead of recomputing them from scratch.
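A minimal sketch of the timestamp-filter approach, using Python's built-in sqlite3 in place of a real source database (the table, column names, and watermark value are hypothetical):

```python
# Hypothetical sketch: pull only delta records using a "last run" watermark,
# the same idea a timestamp filter in a Source Qualifier implements.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, updated_at TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, "2024-01-01"), (2, "2024-02-01"), (3, "2024-03-01")])

last_run = "2024-01-15"   # watermark saved by the previous session
delta = conn.execute(
    "SELECT id FROM orders WHERE updated_at > ?", (last_run,)
).fetchall()
print([row[0] for row in delta])   # only rows changed since last_run: [2, 3]
```

After each run the watermark is advanced to the latest `updated_at` seen, so the next run picks up only newer rows.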

How can we use mapping variables in Informatica? Where do we use them?
After creating a variable, we can use it in any expression in a mapping or a mapplet. They can also be used in the Source Qualifier filter, in user-defined joins or extract overrides, and in the expression editor of reusable transformations.
Their values can change automatically between sessions.
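The "change automatically between sessions" behavior comes from the repository persisting the variable's value after each session. A rough sketch of that idea, with a small JSON file standing in for the repository (the variable name `$$LastRunDate` and the dates are hypothetical):

```python
# Hypothetical sketch: a mapping variable such as $$LastRunDate persists in the
# repository between sessions; here a small file plays the repository's role.
import json
import os
import tempfile

state_file = os.path.join(tempfile.gettempdir(), "mapping_vars.json")

def load_var(name, default):
    """Read the variable's last saved value, or the default on first run."""
    if os.path.exists(state_file):
        with open(state_file) as f:
            return json.load(f).get(name, default)
    return default

def save_var(name, value):
    """Save the variable's value at session end, as the repository would."""
    vars_ = {}
    if os.path.exists(state_file):
        with open(state_file) as f:
            vars_ = json.load(f)
    vars_[name] = value
    with open(state_file, "w") as f:
        json.dump(vars_, f)

# Session 1: read the current value, then update it at session end.
watermark = load_var("$$LastRunDate", "1970-01-01")
save_var("$$LastRunDate", "2024-03-01")
# Session 2 automatically sees the new value.
print(load_var("$$LastRunDate", "1970-01-01"))
```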

Do we need an ETL tool? When do we go for the tools in the market?
ETL tools extract, transform, and load data into the data warehouse for decision making. Before ETL tools evolved, this process was done manually with SQL code written by programmers. That was tedious and cumbersome in many cases, since it involved many resources, complex coding, and long work hours; on top of that, maintaining the code was a great challenge for the programmers.
ETL tools eliminate these difficulties. Compared to the old method, they are very powerful and offer many advantages at every stage of the ETL process: extraction, data cleansing, data profiling, transformation, debugging, and loading into the data warehouse.

How do we extract SAP data Using Informatica? What is ABAP?
What are IDOCS?
Go to the Source Analyzer and click Sources; you will see the option 'Import from SAP'. Click it, supply your SAP user, client, and password, and give the table name as the filter criteria (so the import takes less time). After connecting, import the SAP source.
One important point: after finishing the mapping, save it and generate the ABAP code for the mapping; only then will the workflow run correctly.
ABAP is SAP's proprietary programming language; the generated ABAP extraction program runs inside the SAP system. IDocs (Intermediate Documents) are SAP's standard document format for exchanging business data between SAP and external systems.

What are active transformations / passive transformations?
Active transformations: transformations that can change the number of records between input and output (for example, Filter, Aggregator, Router).
Passive transformations: transformations that do not change the number of records between input and output (for example, Expression, Sequence Generator).
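The distinction can be illustrated with a toy sketch (the row data and functions are hypothetical, not Informatica APIs):

```python
# Hypothetical sketch: an active transformation (filter) can change the row
# count; a passive one (expression) transforms values but keeps every row.
rows = [{"amount": 10}, {"amount": -5}, {"amount": 30}]

def filter_positive(rows):
    # Active: rows that fail the condition are dropped.
    return [r for r in rows if r["amount"] > 0]

def add_tax(rows):
    # Passive: exactly one output row per input row.
    return [{**r, "with_tax": r["amount"] * 1.1} for r in rows]

print(len(filter_positive(rows)))   # 2 -- row count changed
print(len(add_tax(rows)))           # 3 -- row count preserved
```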

What is Informatica Metadata and where is it stored?
Informatica metadata is nothing but the source definitions, target definitions, and the transformations that have been built. It is stored in the repository and managed through the Repository Manager client tool.

How do we call shell scripts from informatica?
You can use a Command task to call the shell scripts, in the following ways:
1. Standalone Command task. You can use a Command task anywhere in the workflow or worklet to run shell commands.
2. Pre- and post-session shell command. You can call a Command task as the pre- or post-session shell command for a Session task.
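What a Command task does conceptually is run a shell command and check its exit status before continuing the workflow. A rough stand-in using Python's subprocess module (the command string is hypothetical):

```python
# Hypothetical sketch of Command-task behavior: run a shell command,
# fail the step if the command's exit status is nonzero.
import subprocess

def run_shell(command):
    result = subprocess.run(command, shell=True,
                            capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(f"command failed: {result.stderr}")
    return result.stdout.strip()

print(run_shell("echo pre-session step done"))
```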

What is latest version of Power Center / Power Mart?
The latest version of PowerCenter is 7.1. Informatica has stopped marketing the PowerMart product, so PowerMart is no longer available in the market.

Can we override a native sql query within Informatica? Where do we do it? How do we do it?
Yes. We can override the default SQL query in the SQL override property of a Source Qualifier transformation.
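The effect of an override can be sketched with sqlite3: the default generated query selects the connected ports from one table, and the override replaces that text entirely, for example with a custom join (tables, columns, and data here are hypothetical):

```python
# Hypothetical sketch: replacing the generated per-table SELECT with a
# user-written query, as a Source Qualifier SQL override does.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cust (id INTEGER, name TEXT)")
conn.execute("CREATE TABLE orders (cust_id INTEGER, total REAL)")
conn.execute("INSERT INTO cust VALUES (1, 'Ada')")
conn.execute("INSERT INTO orders VALUES (1, 99.0)")

default_query = "SELECT id, name FROM cust"   # what the tool would generate
override_query = """
    SELECT c.name, o.total
    FROM cust c JOIN orders o ON o.cust_id = c.id
"""  # user-defined query that replaces the generated one

print(conn.execute(override_query).fetchall())   # [('Ada', 99.0)]
```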

Can we use procedural logic inside Informatica? If yes, how? If not, how can we use external procedural logic in Informatica?
We can use the External Procedure transformation to call external procedures. Both COM and Informatica external procedures are supported by the External Procedure transformation.

Techniques of Error Handling – Ignore, Rejecting bad records to a flat file, Loading the records and reviewing them (default values)
Records are rejected either at the database, due to constraint or key violations, or by the Informatica server while writing data into the target table. These rejected records can be found in the bad files folder, where a reject file is created for each session; from it we can check why a record was rejected. In the reject file, each row starts with a row indicator (the reason the whole row was rejected), and each data value is preceded by a column indicator.
The column indicators are of four types:
D - valid data,
O - overflowed data,
N - null data,
T - truncated data.
Depending on these indicators, we can make changes to load the data successfully into the target.
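A small sketch of reading one line of such a reject file. The layout assumed here (row indicator first, then alternating column indicator and value, comma-delimited) and the sample line are illustrative, not an exact reproduction of any particular bad file:

```python
# Hypothetical sketch: interpreting the indicators in a session reject
# (.bad) file line. Assumed layout: row indicator, then pairs of
# (column indicator, value).
COLUMN_INDICATORS = {"D": "valid", "O": "overflow", "N": "null", "T": "truncated"}
ROW_INDICATORS = {"0": "insert", "1": "update", "2": "delete", "3": "reject"}

bad_line = "3,D,1001,N,,T,Smi"     # one rejected row (made-up example)

fields = bad_line.split(",")
row_status = ROW_INDICATORS[fields[0]]
columns = [(COLUMN_INDICATORS[fields[i]], fields[i + 1])
           for i in range(1, len(fields), 2)]
print(row_status, columns)
```

Here the second column was null and the third was truncated, which tells us what to fix in the source data or mapping before reloading.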
