What is “SAP Business Objects Data Services”?
This is a software tool designed by Business Objects (a company that was acquired by SAP in 2007). Some of the basic purposes of this tool are to perform jobs like –
§ ETL (Extraction, Transformation and Loading) – Pulling data out of any system/database/tables, applying changes to modify the data or applying programming logic to enhance the extracted data, and loading the data into any other system/database or tables. E.g.: ETL of data from a SQL Server database to Oracle.
§ Data Warehousing – A database specifically designed and developed in a particular format to enable easy data analysis or reporting. This could be developed using data from various databases or any other data sources.
§ Data Migration – Moving data from one place to another. This is a subset of ETL where data is relocated from one software system or database to another. This also involves modification and alteration of the data.
§ Business Intelligence – A concept which combines data warehousing and reporting. This is applied to analyse an organization's data to effectively perform functions like business performance improvement.
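The ETL idea described above can be sketched in a few lines of plain Python. This is a conceptual illustration only (the table contents and the upper-casing transformation are made up for the example, not part of any BODS job):

```python
# Minimal conceptual ETL sketch (illustrative only; not BODS code).
# Extract: pull rows from a "source system" (here, a plain list of dicts).
source_rows = [
    {"emp_no": 1, "name": "alice"},
    {"emp_no": 2, "name": "bob"},
]

# Transform: apply a change to the extracted data (upper-case the names).
transformed = [{**row, "name": row["name"].upper()} for row in source_rows]

# Load: write the result into a "target system" (here, another list).
target_table = []
target_table.extend(transformed)

print(target_table)
```

In a real ETL job the extract and load steps would talk to actual databases, but the three-phase shape stays the same.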
Why SAP Business Objects Data Services?
There are many other software tools in the market which are capable of doing the same functions or activities as mentioned above, or even more. They are the direct competitors of Business Objects: Informatica, DataStage, Cognos, SSIS, etc. The above-mentioned activities can also be performed using programming tools like .NET or even Java, and also directly at the database end in SQL Server or Oracle.
The tool BODS provides a very easy and efficient interface to perform these specialist tasks which involve data manipulation. The objects and functions within BODS are specifically designed to perform manipulations and transformations of huge and complex volumes of data very efficiently. There are system-provided objects and functions which can be dragged and dropped easily so that jobs can be created. And, being an SAP tool, it has very good compatibility with SAP applications compared to any other similar tool.
Common terms and terminologies
Designer
Designer is the graphical user interface that lets you create, test, execute and debug BODS jobs. This is the space where the data transformations take place.
Repository
A repository is like a database that stores the objects created in the Designer: the job metadata, the transformation rules, and the source and target metadata. There are primarily three types of repositories: Local, Central and Profiler. The Designer cannot even be opened for any task without a local repository; in other words, the local repository is mandatory for BODS to function. At this point we are not concerned with the other two repositories.
Engine
The BODS Engine executes the jobs created using the Designer. When the BODS application is started, there are enough Engines launched to effectively accomplish defined tasks.
Job Server
The Job Server is an application that launches the Data Services processing engine and serves as an interface to the engine and other components in the Data Services suite.
Access Server
The Access Server passes messages between web applications and the Data Services Job Server and engines.
Data store
A data store provides a connection to a data source such as a database. This is a linking interface between the actual backend database and Data services. Through the data store connection, Data Services is able to import descriptions of the data source such as its metadata.
CMC (Central Management Console)
This is a web based administration tool for BODS which is used for some basic functions such as repository registration, User Management etc.
These terms should be at the fingertips of a BODS programmer, as they will be used very often while working on BODS.
BODS Architecture
The illustration below (Figure 1.1) shows the basic architecture of BODS.
Figure 1.1
The diagram above explains the relationship among the Designer, the central and local repositories and the web application. We will gradually understand these later. On the next page we will start creating our first BODS application.
In the first chapter of this article we learned the very basics of BODS. In this part of the article we will begin with the BODS repository creation process.
Before launching the BODS application and starting actual BODS programming, there are three mandatory activities which need to be completed. They are –
1. Repository Creation
2. Job Server Configuration
3. Registering Repository with CMC.
Creating a Repository
The first and foremost activity one has to do after installing BODS and the database is creating a local repository. Below is the step-by-step process to create it –
- 1. Log on to your database (SQL Server 2008 Express in this case)
- 2. Create a new database (for example, DS_LOCAL_TEST in figure 2.1). This database will be used for the local repository. (This article does not explain how to create databases and tables outside BODS.)
Figure 2.1
- 3. Go to Start menu open SAP Business Objects Data Services 4.0 SP1 -> Data Services Repository Manager
- 4. A screen as shown in the diagram below (Figure 2.2) will pop up. Enter the required parameters into the text boxes as stated below –
Figure 2.2
Below are the credentials one would need to fill in the Repository Manager.
Repository type– Local
Database Type– Microsoft SQL Server (in this case)
Database Server name – This should be the machine/server name where the database resides. If you have installed the database on your local machine, the machine name or machine IP has to be provided.
Windows authentication – DO NOT check this option. It is always recommended to use password authentication.
User name – The login name that is used to log in to the above-mentioned database residing on the database server mentioned.
Password – The password used to log in to the above-mentioned database for the login ID specified.
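If you are not sure what to enter for the server name, your machine's own host name and IP address can be looked up quickly; here is a small Python sketch (a convenience only, nothing to do with BODS itself – the fallback address is an assumption for machines whose host name does not resolve):

```python
import socket

# Look up this machine's host name, to use as the "Database Server name"
# when the database is installed locally.
hostname = socket.gethostname()

try:
    # Resolve the host name to an IPv4 address.
    ip_address = socket.gethostbyname(hostname)
except socket.gaierror:
    # Fallback when the host name does not resolve (assumption: local DB).
    ip_address = "127.0.0.1"

print(hostname, ip_address)
```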
- 5. Click on Create. The message shown below (Figure 2.3) can be seen, indicating “The local repository was created successfully”. (If you do not get the success message, then one of the credentials is incorrect – the database server name, the database name, or the user ID/password.)
Figure 2.3
- 6. Close the window
Attaching the repository to a Job Server
The next step following the creation of the repository is attaching it to a Job Server. Only once it is attached to a Job Server can it be opened using the Designer. Below are the steps to associate a repository with a Job Server –
- 1. Go to Start Menu -> SAP Business Objects Data Services 4.0 SP1->Data Services Server Manager. A screen as shown below would be displayed (Figure 2.4)–
Figure 2.4
- 2. Click on “Configuration Editor” and the screen Figure 2.5 would be displayed.
Figure 2.5
- 3. Next click on “Add” button and below screen would be displayed (Figure 2.6)
Figure 2.6
- 4. Click on “Add” button on the “Associated Repositories” label to enable the options in the “Repository Information” label.
- 5. Enter the credentials as shown in Figure 2.7 and click “OK”.
Figure 2.7
§ Job Server name – Enter any name that you wish to assign to the Job Server (JobServer_Test in this case).
§ Job Server port – By default it is 3500. If an error occurs assigning port 3500, try incrementing the value to 3501, 3502 or higher.
§ Ignore the Check boxes and go to the label “Repository Information”.
§ Database type– Select the database in which you created the repository. Microsoft SQL Server in this case.
§ Database Server name– Name of the machine/server where database resides.
§ Database name– Name of the database which was created for local repository.
§ Username and Password– Same user name and password used to log in to SQL Server database in which the local repository database resides.
§ Leave the “Default repository” check box as is.
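The port-increment advice in the list above (try 3501, 3502, … when 3500 is taken) can be checked programmatically. The sketch below is a hedged illustration in Python – BODS does not do this for you; it simply finds the first TCP port at or above 3500 that is free to bind on this machine:

```python
import socket

def first_free_port(start=3500, limit=10):
    """Return the first TCP port at or above `start` that is free to bind."""
    for port in range(start, start + limit):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            try:
                s.bind(("127.0.0.1", port))
                return port          # bind succeeded, so the port is free
            except OSError:
                continue             # port is in use; try the next one
    raise RuntimeError("no free port found in range")

print(first_free_port())
```

Whatever port this reports free is a reasonable candidate to type into the “Job Server port” box.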
- 6. Click on “Apply” and screen as shown in Figure 2.8 would be displayed.
Figure 2.8
- 7. The “Associated Repositories” box would be automatically populated with the concerned repository name.
- 8. Click on OK. Screen as in figure 2.9 can be seen.
Figure 2.9
- 9. Click on OK and we get a screen as shown in figure 2.10
Figure 2.10
- 10. Click “Close and Restart” to finish the Job Server configuration for the repository created. This will restart the BODS engine, and a dialog box may pop up confirming that it will restart the engine.
In our earlier article, we learned how to create a repository in BODS. Once that part is done, the final process before launching the BODS application is registering the repository with the CMC.
The repository that was created and attached to the Job Server needs to be registered in BusinessObjects' Central Management Console, or CMC.
- 1. Go to Start Menu -> Information Platform Services 4.0 -> Information Platform Services -> Information Platform Services Central Management Console. A webpage as displayed in Figure 2.11 will pop up.
Figure 2.11
The credentials required in figure 2.11 need to be furnished as follows.
System – The name or IP address of the machine on which the BODS server is installed (not to be confused with the database server). This BODS server could be the same machine on which you are working, or it can be installed on any other machine and connected via the network.
User Name– This username is set by the Administrator user who installs the BODS tool. In this case I would be using the super user name “administrator”.
Password– The password set for the username administrator at the installation time.
Figure 2.12
- 2. After logging in, a screen as shown in figure 2.12 is opened up with the options as shown. At present we can ignore all the options except “Data Services”.
- 3. Click on the “Data Services” icon and a new window would be opened as shown in figure 2.13.
Figure 2.13
- 4. Right-click on the link Repositories -> Configure repository (or Repositories -> Manage -> Configure repository) and a new window as in figure 2.14 opens up.
Figure 2.14
- 5. Each of the fields in figure 2.14 needs to be filled as mentioned below and also as shown in figure 2.15 -
§ Repository Name – Name of the database that was created for local repository. In our case we would use DS_LOCAL_TEST.
§ Description - A description in few words as a label.
§ Database Type – Microsoft SQL Server.
§ Hostname - This should be the machine/server name where the database is residing. If you have installed database in your local machine, then machine name or machine IP has to be provided.
§ Port – Leave the default port as it is ‘1433’.
§ Database Name – Same as repository name DS_LOCAL_TEST.
§ Windows Authentication – ‘No’
§ Is Profiler Repository – ‘No’
Figure 2.16
- 6. Click on “Test Connection” button and a message box as shown in figure 2.17 is displayed.
Figure 2.17
- 7. Click on “Save” and you can see the repository that was just registered with CMC and the status as active. Figure 2.18.
Figure 2.18
- 8. Click on “Logoff” and close the window. Now we are setup and are ready to launch the Designer and start BODS programming.
In our earlier article on Data Services we have learnt how to register the local repository with CMC. In this part of the article we will start using Data Services Designer.
Launching the BODS Designer to create Jobs
After the successful creation of the local repository, Job Server configuration and registration of the repository in the CMC, you are now ready to launch the BODS Designer. After opening the Designer you will be able to create and execute BODS jobs.
Here are the steps for launching the BODS designer-
- 1. Go to Start Menu and click on SAP Business Objects Data Services 4.0 SP1 -> Data Services Designer. A screen as shown below would be displayed (Figure 3.1) –
Figure 3.1
- 2. Enter the credentials as required
System – host[port] – The name or IP address of the machine on which the BODS server is installed (not to be confused with the database server). This BODS server could be the same machine on which you are working, or it can be installed on any other machine and connected via the network.
User Name – This username is set by the Administrator user who installs the BODS tool. In this case I would be using the super user name “administrator”.
Password – The password set for the username administrator at the installation time.
- 3. After entering the credentials, click on “Log On” and it will display all the repositories that we created and registered with the CMC on that BODS server (Figure 3.2). Since we created only one repository, only that one can be seen.
Figure 3.2
- 4. Select the repository (DS_LOCAL_TEST in this case) and click on OK (or double click on the repository selected) and the Designer gets opened as shown in figure 3.3.
Figure 3.3
- 5. The figure 3.3 shows the first screen we get once we launch the BODS Designer, and this is our playground. We create our BODS jobs, execute, and debug them using this Designer.
Creating Data Services Jobs
After the preparation steps and launching the Designer we are now set up to start our job. Before we start programming there is a program hierarchy that we need to understand.
§ Every job that we create in BODS should be embedded within a project.
§ The object “Project” is the highest in the hierarchy, followed by “Job”, then “Workflow”/“Condition”, and finally “Dataflow”; into the Dataflow we drag and drop the various transformations, tables or objects that perform the actual job run.
The above statement may be a little confusing or difficult to understand at first.
To make it simpler we will go with a real-life example. Here the BODS Designer is the world, and a Project is a country. A country can accommodate many states, so the states are our jobs. In a state there are districts, and those are our workflows or condition flows. In the districts there are corporations/municipalities; likewise, in the workflows there are dataflows. And in those corporations and municipalities there are people; similarly, in the dataflows there are tables and transforms.
World>Country >State>District>Corporations/Municipalities>People
Designer>Project>Job>Workflow/Condition>DataFlow>Tables/Transforms
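The same containment chain can be written down as a nested structure. This is an illustrative Python sketch only, using the object names created later in this tutorial (Prj_Test, Job_Test, WF_TEST1, DF_TEST1):

```python
# The BODS object hierarchy from this tutorial, as a nested dict.
# The names are the ones used in the walkthrough that follows.
hierarchy = {
    "Designer": {
        "Prj_Test": {                      # Project
            "Job_Test": {                  # Job
                "WF_TEST1": {              # Workflow
                    "DF_TEST1": [          # Dataflow
                        "tblEmployee",     # tables / transforms inside it
                        "Query",
                        "tblOutput",
                    ]
                }
            }
        }
    }
}

# Walking from the top down reproduces the chain shown above.
path = []
node = hierarchy
while isinstance(node, dict):
    key = next(iter(node))
    path.append(key)
    node = node[key]
print(" > ".join(path))
```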
We will familiarize ourselves with the windows that can be seen within the Designer window (like the Local Object Library, etc.) as we go. As we use them in the coming examples and practice jobs, these will become clearer.
Creating our first BODS Program
- 1. Click on Project -> New -> Project (Figure 4.1).
Figure 4.1
- 2. We get a new window to create a new project. Enter the name you would like to assign to the project. In this case we go by the name “Prj_Test” (Figure 4.2).
Figure 4.2
- 3. Click on “Create” button.
- 4. Prj_Test is created. Now we have to create a job under the project. Right-click on the project Prj_Test that we just created and select the option “New Batch Job” (Figure 4.3). At present we are dealing only with batch jobs and not real-time jobs. (The basic difference is that a batch job runs on demand or on a schedule, while a real-time job listens for and processes messages as they arrive.)
Figure 4.3
- 5. Create a job with any name; we have used “Job_Test” (figure 4.4).
Figure 4.4
- 6. After creating “Job_Test”, we are now ready to create workflows/conditions and a dataflow. To execute transformations, the minimum object we need inside a job is a dataflow. Workflows or condition flows are not mandatory, but it is good programming practice to use them. From the tool bar on the right (indicated in figure 4.5), click on the workflow (second object from the top) and place it anywhere on the Designer workspace. Rename the workflow (any name); I have used WF_TEST1.
Figure 4.5
- 7. Double-click on the workflow WF_TEST1. From the tool bar on the right, click on the dataflow (third object from the top) and place it anywhere on the Designer workspace within the workflow. Rename the dataflow ‘DF_TEST1’ (figure 4.6).
Figure 4.6
- 8. We can now see the objects that we created, both in the “Local Object Library” by clicking on each of the tabs and also in the “Project Area” in a hierarchical format (figure 4.8).
Figure 4.9
- 9. Next we start the actual transformation. For this we begin by creating a “Datastore”. As discussed earlier, a datastore is a logical connection to a database.
To create a datastore in BODS and link it to a database, we should first have a database and a few tables already created in that database. In our case, I have created a database “db_Test_Data” on my database server with two tables, tblEmployee and tblEmployeeSalary (these were created at the database backend and not in BODS).
We will now try to import these two tables through a datastore connection into the BODS staging area (“staging” is another common term used to describe the BODS Designer area itself, where the data transformations are done). Figure 4.10 shows the database and tables I have created in the SQL Server database.
Figure 4.10
- 10.Now we create the datastore to connect to Database and import the tables. Figure 4.11 shows datastore creation. Go to datastore tab in the Local Object Library and right click on it and a popup window would come up. Click on “New”.
Figure 4.11
- 11. Once “New” is selected a screen as shown in figure 4.12 would be displayed.
Figure 4.12
- 12. Enter the credentials in the textboxes and select appropriate options (Figure 4.13).
Figure 4.13
Datastore name – The name you would like to give the new datastore that you are creating now. Here I have used “DS_TEST1”.
Datastore Type – In our case it is “Database”
Database Type – My database is “Microsoft SQL Server”
Database version – The version on my system is “Microsoft SQL Server 2008”. Select the appropriate version that you are using.
Database Server name – This should be the name or IP of the machine/server which has the tables that you are going to import (not necessarily the one that has your local repository database – it could be the same or a different one). I have created both on the same database server.
Database name – Name of the database which contains the tables to be imported to BODS.
Username – User name which is used to log on to the database server.
Password – Password for the username which is used to log on to the database server.
- 13. After filling in these details, click on “OK”.
- 14. Leave all the other check boxes and buttons seen on the window as they are; we do not need them for now.
- 15. We can now see the datastore (DS_TEST1) which we just created in the Local Object Library, as shown in figure 4.14.
Figure 4.14
- 16. Click on the datastore DS_TEST1 to expand the tree; three nodes can be seen, namely “Functions”, “Tables” and “Template Tables”. Our next task is to import the tables “tblEmployee” and “tblEmployeeSalary” that we created at the database backend using this datastore. There are two ways one can do it:
1. Double-click on the “Tables” icon.
2. Right-click on the “Tables” icon and select “Import by Name”. Using this method we can import the tables faster from the database, but one has to know the exact table name to do this.
Here I will go with method ‘1’, as I do not have many tables in my database.
- 17. After double-clicking the “Tables” icon we get a window as shown in figure 4.15.
Figure 4.15
Note – In case you have more tables in that database, all of those tables would be listed. Since I have only the two tables that I created, only those two are listed.
- 18. Select both the tables, right-click, and in the menu that pops up click on “Import” (figure 4.16).
Figure 4.16
- 19. Now, once again click on the Tables icon in the Local Object Library and you can see the tables we have imported in the tree (figure 4.17).
Figure 4.17
- 20. We now have the source system data in our datastore. (But this does not necessarily mean we have extracted the data.)
Exercise 1
As we have set up our datastore and are ready with the job, let's try out a small exercise. We are going to perform a small-scale data migration, or ETL, which includes the following scope –
§ To extract this data from this source system for which we have created the datastore (DS_TEST1) into a staging area.
§ To do a small transformation on the incoming data by adding a new column to the table tblEmployee, called ‘EmpLastname’ and leave it blank.
§ To join the two tables tblEmployee and tblEmployeeSalary with the common field EmpNo.
§ Load the result into a table in another database.
(Do not panic seeing too many to-do items; these are easily doable tasks for a starter)
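Before building this in the Designer, it may help to see the same four requirements expressed as plain SQL. The sketch below uses Python's built-in sqlite3 with made-up sample rows (the column names EMP_NO, EMPSALARY and EMPLASTNAME follow the ones used later in this walkthrough; the actual databases in this tutorial are SQL Server, so this is an analogy, not BODS output):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Source tables with a few made-up sample rows.
cur.execute("CREATE TABLE tblEmployee (EMP_NO INTEGER, EMP_NAME TEXT)")
cur.execute("CREATE TABLE tblEmployeeSalary (EMP_NO INTEGER, EMPSALARY INTEGER)")
cur.executemany("INSERT INTO tblEmployee VALUES (?, ?)",
                [(1, "Alice"), (2, "Bob")])
cur.executemany("INSERT INTO tblEmployeeSalary VALUES (?, ?)",
                [(1, 5000), (2, 6000)])

# Join on EMP_NO and add a blank EMPLASTNAME column, then "load" the
# result into a target table -- the same effect the Query transform gives.
cur.execute("""
    CREATE TABLE tblOutput AS
    SELECT e.EMP_NO, e.EMP_NAME, s.EMPSALARY, '' AS EMPLASTNAME
    FROM tblEmployee e
    JOIN tblEmployeeSalary s ON e.EMP_NO = s.EMP_NO
""")

rows = cur.execute("SELECT * FROM tblOutput ORDER BY EMP_NO").fetchall()
print(rows)
```

The Query transform we build next does exactly this: the dragged fields form the SELECT list, the WHERE tab holds the join condition, and the two single quotes mapped to EMPLASTNAME are the blank default.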
Solution design
Any programmer knows there are always different ways to approach a problem, but in the end the most efficient and easy method is adopted. Here we follow the same approach. In BODS, the best way to make your programs execute efficiently is to reduce the number of objects and flows wherever possible. This ability comes only with time, practice and experience.
Let's take our example: we can actually manage all four requirements using just one dataflow. Here is how we do it –
- 1. Go to the job ‘Job_Test’ we had created, double click on the dataflow ‘DF_TEST1’.
- 2. Drag both the tables from the Local Object Library to the Designer workspace.
- 3. A dropdown menu pops up (for both the tables). Select ‘Make Source’ for both tables (Figure 4.18).
Figure 4.18
- 4. Once both tables are dragged into the workspace, we need to use BODS-defined transformation objects. We cannot cover all of the transformation objects or functions here, but we are going to use one of the most useful and helpful ones: the ‘Query’ transform.
- 5. Click on the “Transforms” tab of the Local Object Library. Then click on the “Platform” icon to expand the node, and we can see a list of transform objects. Drag the fourth transform from the top, called the ‘Query’ transform, to the workspace between the two tables.
- 6. After placing the transform, click and drag on the dark blue dot on the table (‘A’ in figure 4.19) and join the line to the inward arrow of the Query transform (‘B’ in figure 4.19). The table tblEmployee is already connected in figure 4.19; the other table is not yet connected.
Figure 4.19
- 7. Once both the tables are connected to the Query transform, open the query transform by double clicking on it. Inside the query transform we can see both the source tables that we have on the left side. The right side of the query transform which is now blank is the place where we have to perform our operations (figure 4.20).
Figure 4.20
- 8. Select and drag all the fields from tblEmployee to the right, and also just the EMPSALARY field from the tblEmployeeSalary table (as per the requirement we need all the fields from tblEmployee along with the salary field). Then click on the “WHERE” tab below the fields area. Here we have to specify the join condition, just as in a normal SQL query. The join condition here is “tblEmployee.EMP_NO = tblEmployeeSalary.EMP_NO”. You do not have to type the field names: just click on the ‘EMP_NO’ field in the table tblEmployee in the left pane and drag it to the empty space inside the “WHERE” tab. Figure 4.21 shows the activities discussed here.
Figure 4.21
- 9. Our next task is to add the new field ‘EmpLastname’ to the output. To do this, right-click in the right-hand pane of the query transform on any of the existing fields (preferably the last field, which is EMPSALARY in this case). Select the option ‘New Output Column’ (figure 4.22).
Figure 4.22
- 10. A window pops up as seen in figure 4.23; select “Insert Below”.
Figure 4.23
- 11. Another window is displayed to initialize and set the properties of the new field which we are going to create. Figure 4.24 shows the window with the values entered as required for our scenario. Ignore all the other tabs for now and just enter the “Name”, “Data type” and “Length” as required.
Figure 4.24
- 12. Click on “OK” and now we would be able to see the field on the right hand side pane of the query transform (Figure 4.25).
Figure 4.25
- 13. After we have added the ‘EMPLASTNAME’ field to the target side, we can check whether everything is syntactically correct so far, and we do this using the “Validate” button. At this point, by looking at the screen we can recognize that we are not yet done with the field mappings, as the blue triangle is missing on the ‘EMPLASTNAME’ field. But just to understand the functionality of “Validate”, let’s click on it (figure 4.26).
Figure 4.26
- 14. As expected, we get an error message box clearly indicating that the field ‘EMPLASTNAME’ is not mapped. To correct this we have to complete the mapping of that field. Go to the “Mapping” tab and, in the empty space of the Mapping tab, type two single quotes ‘’. With the two single quotes we assign the default value of the field as blank (which is as per the requirement). Now validate again and we can see there are no errors. Next click on the “Back” button to go back to the dataflow (figure 4.27).
Figure 4.28
- 15. Now we have completed three of our four tasks. Next we have to create the target table to get the final output. Here, I am going to create another datastore for a database residing on another server, so the effect of this job will be like an actual data migration or ETL, where we extract tables from one database, transform them in BODS, add fields and load the result into another database. The datastore I have created for the target table is DS_OUTPUT1 (I am not repeating the datastore creation steps). There are two ways we can add an output table to our transformations:
1. Click on the “Template tables” icon on the Local Object Library under the appropriate datastore. Drag and drop it to the workspace.
2. Select the “Template table” option from the tool bar on the right side of the workspace and click anywhere on the Designer workspace. A new dialog box would popup, enter the name of the output table as required and select the appropriate datastore under which the table needs to be created.
Here, I would go with method number ‘2’. Figure 4.29 shows this action step.
Figure 4.29
- 16. Here we can give any name to the output table that is being created; I name it “tblOutput” and select the correct datastore in which the table needs to be created from the drop-down “In datastore”. Then click “OK”.
- 17. We can now see the table that we created in the workspace. Join the table to the Query transform as we did to the other two tables. After doing that validate your job again. You would see there are no errors and now we are good to test run our job (figure 4.30).
Figure 4.30
Executing your BODS job
The job has been created and checked for syntax errors, and we see that the job is ready for execution. Before execution there is one more object we can quickly go through: the Script object. Let’s add a script to our job and display the message “Hello World” – without “Hello World” any programming tutorial looks incomplete. Figure 4.31 shows how to add a script.
- 1. Click on the “Job_Test” icon on project area to navigate back up or you can do this using “Back” button also.
- 2. Click on the “Script” button on the toolbar and then click on any space on the designer workspace. You can see the script has been placed on our workspace.
- 3. Rename it to “SCR_MSG” and then connect it to the workflow “WF_TEST1”.
All these steps are shown in the figure 4.31.
Figure 4.31
- 4. Double click on ‘SCR_MSG’ to open the script.
Figure 4.32
- 5. Inside the script editor, type the line print('Hello World'); as shown in figure 4.32. (print() is a built-in Data Services function that writes a message to the job's trace log.)
- 6. Press the “Back” button or click on the Job icon to go back to the job level. Validate the job again to make sure everything is syntactically correct, and save the work you have done so far by clicking the ‘Save All’ button on the toolbar at the top (shown in figure 4.33).
- 7. Now execute the job. This can be done in more than one way:
1. Right click on the Job_Test on the project area and select ‘Execute’.
2. Click on the ‘Execute’ icon on the upper tool bar or press F8.
Figure 4.33
- 8. Either way we execute, we get the screen shown in figure 4.34. For now, ignore all the check boxes and tabs on this screen and click ‘OK’.
Figure 4.34
We can see the name of the Job server that we had created for this repository.
- 9. Once we press ‘OK’ the job begins execution; we can see the message “Hello World” and, towards the end, the message ‘Job is completed successfully’. Both of these are indicated in figure 4.35.
Figure 4.35
- 10. Congrats! You have now successfully executed your first BODS job.
- 11. After the job executes, we can see the number of records that were processed by clicking on the monitor icon (shown in figure 4.36) at the top of the screen. At a later stage, if you want to review your old job execution details, click on the Monitor tab in the Project area (shown in figure 4.36) to see all the jobs you have executed, then click on the job whose status you would like to see. A few things to keep in mind:
1. In the Project area Monitor tab, a small red light (as in this case) means the job executed successfully.
2. A big red cross (we will see this next) means the job executed with errors.
3. A green light means the job is still executing.
Figure 4.36
- 12. Our next, and most important, activity is to check whether the data has actually been migrated to the target table, and whether it was migrated as expected. To do this, go into the dataflow by clicking on it in the Project area, then click on the magnifying-lens icon on the target table ‘TBLOUTPUT’ to see the output data.
Figure 4.37
Now, if we cross-verify this against our requirements, we can see that the output matches expectations. The table can also be viewed at the database backend by querying it directly.
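To check the target at the database backend, a simple query is enough. A hedged sketch, assuming the output table tblOutput was created in a SQL Server or Oracle datastore:

```
-- Count the rows loaded into the target table,
-- then eyeball the records against the expected output.
SELECT COUNT(*) FROM tblOutput;
SELECT * FROM tblOutput;
```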
Debugging a BODS job
We now know how to create and execute a job; our next task is debugging a job that has errors. For this, let’s create an error in our job: detach the link between the script SCR_MSG and the workflow WF_TEST1, then save and execute the job. You can see that the job stops execution and a red indicator lights up at the top of the execution window (figure 4.38). Click on the red button and you can see the error clearly listed.
Figure 4.38
You can test this further by creating your own errors and debugging them. Some errors may not be easy to understand at first; they become clearer with experience and practice.
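When experimenting with errors, a script object is a convenient place to trigger one deliberately. raise_exception() is a built-in BODS scripting function that fails the job with a custom message; the message text below is just an example:

```
# Placing this in a script object makes the job fail with our own error text,
# which then appears under the red error indicator in the execution window.
raise_exception('Deliberate test error: stopping the job here');
```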
Do’s and Don’ts and Programmer tips
- 1. Do not copy and paste dataflows if you want to re-use the same dataflow with modifications; dataflows should only be replicated.
- 2. To re-use the dataflow without any modification, select it from the Local object library and drag and drop it to the designer work space.
- 3. Do not rename tables or copy and paste target tables once they are created. Should you need to rename a table, delete it entirely from the Local object library and re-create it.
- 4. In a dataflow, once a target table is created, we cannot use that table as a starting point; it would need to be used as a source in the next dataflow.
- 5. There is no hard and fast rule for placing or arranging the transforms/tables on the workspace with respect to layout, but it is recommended to keep objects neatly aligned. For example, avoid kinks when connecting two objects that lie on a straight line.
- 6. Keep saving your job at each step. BODS, like any other software, can crash at any time and you may lose the work done so far.
- 7. ATL – the BODS program dump is called an ATL file. It is analogous to an ‘.exe’.