SAP R/3 to SQL Extraction
In this class we will connect Data Services to an SAP ECC system, extract the data from R/3 tables, and move it to a SQL database.
Pre-Requisites
SAP ECC System
Data Services
We will connect to the SAP system through a datastore, extract the data from the SAP system, and save it to the SQL database:
- Create a new datastore and give the credentials to connect to the ECC system
- Select Tables under the datastore and import the required SAP tables into Data Services
- Copy the SAP tables' data to a staging database and use the staged tables for the further steps
Note that we are reusing the already created SQL database “Training”, the local repository, and the job server, so we will not create them again. If you wish to create them from scratch, refer to the previous document, where we created the database, repository, and job server.
Creating a Datastore to Connect to the SAP ECC System
Note down the application server name and the system number.
Log in to Data Services Designer and create a datastore in the local object library under the Datastores tab.
Enter the credentials:
- Datastore name: “DS_SAP_IDES”
- Datastore type: SAP Applications
- Application server (the one noted earlier)
- Username
- Password
Click on Advanced and fill in the following parameters under the Advanced section:
Client number: 800
System number: 02
Working directory: \\<application server name>\
Application path: \\<application server name>\
Generate directory: D:\Program Files\Business Objects\BusinessObjects Data Services
Security: SAP_ALL
Number of retries: 1
(Note: make sure you change the number of retries to 1. If you enter wrong credentials, i.e. a wrong password, Data Services will by default try connecting to the SAP system 3 times with that wrong password, and the user ID will be locked after 3 failed logins.)
Now you can find the newly created Datastore under the local object library.
Select the datastore, right-click, and click Search.
In the Search window, select “External data” from the drop-down, enter the table name MARA, and click Search. The results show the name of the table and its description. Select the table, right-click it, and click Import.
Now you can find the MARA table under Tables in the datastore you created.
Similarly import the MARC table.
Now your SAP tables are ready to proceed with the job.
As per SAP AIO (All-in-One) practice, we first save the SAP tables to the local database. This way we avoid connecting to the SAP system every time we need to fetch the data.
Now we store the SAP tables' data in the database.
Go back to the Designer, create a job and a data flow, and inside the data flow drag both the MARA and MARC tables onto the canvas.
Now we need to save these tables' data into the local database. Drag a Query transform and template tables onto the canvas, connect the source tables to the queries, and inside each query drag the required fields from Schema In to Schema Out. Go back and connect each Query transform to its target template table.
Please follow the below screenshots.
Click to open the query and move the desired fields from Schema In to Schema Out.
Here we can see tabs like Mapping, Select, From, Outer Join, Where, and so on, which are used to filter the data or make joins. They will be explained in detail after this session.
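For instance, the Mapping tab of an output column simply holds the expression that fills that column. For a straight copy it is just the source column; an expression is also possible (the upper() call here is only an illustration):

    MARA.MATNR            (straight copy of the material number)
    upper(MARA.MATNR)     (the same column forced to upper case)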
Now click on back and create a template table.
Click on the template table icon in the right-side pane and click again on the canvas. Give the table a name, select the datastore in which you want to save it, and click OK. Then connect the query to the template table.
Do the same for the MARC table and click Save. Now you are saving the SAP tables' data to SQL.
Select the job under the project area, right-click, and execute the job.
The execution properties window appears; click OK.
Under the Trace tab you can see the job starting, and at the end you will see that the job completed successfully.
Under the Monitor tab, next to Trace, you can see the below details.
This concludes loading the data from SAP to SQL.
To check in SQL whether the tables and their records were added, log in to SQL Server Management Studio Express. Select the database “Training”, expand it, and click on Tables; you can see the list of tables on the right side.
Select a table, right-click → Script Table as → SELECT To → New Query Editor Window. You will see the below screen.
Press “Ctrl + A” and then “F5”. Now you will see the data.
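If you prefer typing the query yourself, a minimal check looks like this (it assumes the template tables were named STG_MARA and STG_MARC, as in the steps that follow; the column list is only illustrative):

    USE Training;
    -- Row counts confirm that the load completed
    SELECT COUNT(*) AS mara_rows FROM dbo.STG_MARA;
    SELECT COUNT(*) AS marc_rows FROM dbo.STG_MARC;
    -- Preview a few staged material master records
    SELECT TOP 10 MATNR, MTART, MATKL FROM dbo.STG_MARA;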
Now we merge the data from the two tables, MARA and MARC, and put it into one single table.
Here we cover the concepts of global variables, scripts, and conditional flows.
Global variables:- these are variables that are available throughout the entire job, and they are assigned a fixed value (in a script, as shown below).
Defining a global variable: Tools → Variables in the Designer window.
You will get the variables and parameters screen.
Select Global Variables, right-click, and click Insert.
Double-click the new global variable to define it.
Enter the properties shown above and click OK. Similarly create another global variable, $GV_Load.
Script:-
Scripts are used to initialize the global variables, and these variables drive the job: you can keep a portion of the job from executing by setting the corresponding global variable to ‘N’.
Click on the Script icon in the right-side pane and click on the canvas. Give it a name.
Click on the script to open it and define the global variables as below.
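The screenshot shows the script body. A minimal sketch of such a script, assuming the first variable is named $GV_Extract ($GV_Load is named above; $GV_Extract is an assumption based on the extract conditional), would be:

    # Set both flags to 'Y' so that both conditionals execute;
    # set either one to 'N' to skip that part of the job.
    $GV_Extract = 'Y';
    $GV_Load = 'Y';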
Now go back.
Conditional Flows:-
These global variables are used in the conditional flows to control the job flow. This is done as below.
In the Designer, select the conditional flow icon on the right side of the canvas and click it on the canvas. Double-click the conditional and give the global variable inside it as below.
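The If field of a conditional holds a Boolean expression. Assuming the variable names above, the extract conditional would contain the following (the extract data flow goes in the Then pane, and the Else pane can stay empty):

    $GV_Extract = 'Y'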
Do the same for the load conditional.
Now drag the data flow used for the extraction under the first conditional as above. Then click Back and open the second conditional. Drag a data flow from the right-side pane, click it under the load conditional as below, and name it DF_Load.
Open the data flow by clicking on it. Now drag the tables STG_MARA and STG_MARC from the datastore STAGING as below.
Then drag a Query transform from the right-side pane and click it on the canvas. Connect the source tables STG_MARA and STG_MARC to the Query transform, then click on the query to open it.
You can see the two tables under Schema In. Expand them and drag the required fields to Schema Out. Here we need to join the two tables in the Where condition: click on the Where tab, drag MATNR from STG_MARA into the Where condition, type “=”, and then drag MATNR from STG_MARC.
It is mandatory to specify a join whenever we use two tables as a source.
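The resulting Where expression is simply the equality of the material numbers of the two staged tables:

    STG_MARA.MATNR = STG_MARC.MATNR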
You can also filter the data for a particular plant in the same Where condition, as in the sketch below.
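For example, to keep only one plant (WERKS is the plant column in MARC; the plant code '1000' is just an illustrative value), the Where condition becomes:

    STG_MARA.MATNR = STG_MARC.MATNR AND STG_MARC.WERKS = '1000'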
Click on Back, create a template table, and connect the query to the template table.
With this we complete joining the two tables into one using a Where condition.
Now link the script to Condition1 and link Condition1 to Condition2 as below.
Then click on Validate / Validate All at the top of the screen to check whether the whole job is correct.
Now you can run the job.
Select the job under the project area, right-click, and click Execute.
This concludes using global variables, scripts, and conditional flows: extracting the data from SAP, loading it into SQL, and then combining the two SQL tables into a single table.