created by Rati Verma on Apr 10, 2012 11:15 AM, last modified by Rati Verma on Apr 11, 2012 6:04 AM
Version 1
Applies to:
This article applies to SAP BI 7.0.
Summary
This article provides a step-by-step procedure for archiving data in a write-optimized DSO using the ADK method. It explains in detail the technical procedure for ADK archiving, and for deleting and reloading archived data, in a write-optimized DSO. For archiving, a logical file and a physical file must be created for each DSO, based on certain properties, using an Archiving Object. Developers who want a stepwise walkthrough of SAP BI ADK archiving for write-optimized DSOs will benefit from this article.
Author: Rati Verma Company: Infosys Limited Created on: 11 April 2012
Author Bio
Rati Verma is an SAP BW consultant with Infosys Ltd. with more than four years of relevant experience. She has worked on various BI/BW implementation and support projects.
Table of Contents
Introduction
Business Scenario
SAP BW Archiving
Archiving Process
Creation of Data Archiving Process
  Step 1: Creating the Data archiving process (DAP) from transaction code RSA1
  Step 2: General settings tab
  Step 3: Selection Profile tab settings
  Step 4: Semantic Group tab settings
  Step 5: ADK tab settings
Performing the Write operation
  Step 1: Go to archive administration of the DSO
  Step 2: Create a variant on the SARA screen
  Step 3: Name the variant and give selection conditions
  Step 4: Maintain start date and spool parameters for the variant
Viewing the data in the archived file
Performing the Delete operation
  Step 1: Start delete job from SARA transaction
  Step 2: Maintain archive selection, start date and spool parameters for the delete job
Performing the Reload operation
  Step 1: Start the reload job from SARA transaction
  Step 2: Create variant for reload and define other settings
  Step 3: Provide description for the reload job
  Step 4: Execute reload job and view job logs
Related Content
Introduction
SAP BW projects have to handle huge volumes of data, and the size of the database is important to every organization. Over time, however, the steadily accumulating data becomes a point of concern as the database grows. Most of this data is inactive, and it becomes difficult for organizations to manage a data volume that increases substantially every day. This eventually lengthens query execution times and causes serious problems with system performance and maintenance. BW archiving addresses this issue: inactive data in the InfoProviders is deleted from the database and transferred to an alternative storage system, from which it can be reloaded if it is needed again.
Business Scenario
We have data in our system that we do not want to delete for good; instead, we want to move it away from our main data targets (InfoCubes and DSOs). We want to improve load and query performance by decreasing the volume of data. In addition, we want to restrict system growth, and have decided to move some of the data onto a slower storage medium that is cheaper than the expensive, quickly accessible InfoProvider storage within our system. Because we may need to retrieve this data at some point in the future, we decide to archive it.
SAP BW Archiving
SAP BW archiving is a good solution for handling the high volume of inactive data in the database. Creating a Data Archiving Process is straightforward because the Archiving Object is generated by the system itself. This article focuses on the ADK archiving process for write-optimized DSOs. The Archive Development Kit (ADK) is a tool provided by SAP that acts as an abstraction layer between the SAP applications, the data and the archive files.
Archiving Process
The archiving process in SAP BW 7.0 can be divided into three main sections:

1. Creation of the Data Archiving Process
   First, we define the Data Archiving Process and all the settings required to archive data from a write-optimized DSO.
2. Performing the Write operation
   Next, we perform the write operation on the data to be archived from the DSO into the archive files.
3. Performing the Delete operation
   Finally, we perform the delete operation on the data that has been archived from the write-optimized DSO.
Step 1: Creating the Data archiving process (DAP) from transaction code RSA1
In transaction code RSA1 -> InfoProvider, search for and select the write-optimized DSO that we want to archive, then right-click to open the context menu and choose Create Data Archiving Process.
In the Sequence section we can specify whether the data should be deleted from the DSO before the archive files are stored in the content repository, or vice versa. The Delete Program Reads from Storage System checkbox is ticked when we want the delete program to read back the copy of the data saved in the archive files in the content repository, match it against the data in the DSO, and only then proceed with the delete operation on the DSO data. The Data Archiving Process then needs to be saved and activated.
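The safeguard behind the Delete Program Reads from Storage System option can be pictured as: verify that what was stored matches what is about to be deleted, and only then delete. The sketch below is purely conceptual; the function and variable names are illustrative, not SAP APIs.

```python
import hashlib
import json

def checksum(records):
    """Stable checksum over a list of record dicts."""
    payload = json.dumps(records, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def safe_delete(dso_records, archived_copy):
    """Delete from the DSO only if the copy read back from the
    storage system matches the data selected for deletion."""
    if checksum(dso_records) != checksum(archived_copy):
        raise RuntimeError("Archive copy does not match DSO data; aborting delete")
    deleted = len(dso_records)
    dso_records.clear()
    return deleted

data = [{"request": "REQU_1", "amount": 10}, {"request": "REQU_2", "amount": 20}]
stored = [dict(r) for r in data]   # copy read back from the content repository
print(safe_delete(data, stored))   # → 2
```

The point of the check is that a corrupted or incomplete archive file is detected before any data is removed from the DSO.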
After the above step we come to the SARA transaction.

Step 2: Create a variant on the SARA screen
Below is the screenshot of the screen we come to after Step 1.
We can see that the Archiving Object is created automatically by the system. Here we need to click the Write button and create a variant, as shown in the steps below.
Step 3: Name the variant and give selection conditions

Here we specify the name of the variant we want to create and then click the Maintain button. The screenshot below shows the screen where we provide the selection conditions in the variant for the archiving process.
In the case of write-optimized DSOs we can enter the selections only through the relative option, whereas standard DSOs allow both relative and absolute values for the selection criteria.

Note: Absolute value input is outside the scope of this article.

In the above screen, depending on the value we want in the Loading Date field, we enter the number of days in the Only Data Records Older Than field. In our example we entered 959 days, so the system calculated the date 959 days before the current system date and populated the Loading Date field accordingly. Instead of days, the Loading Date value can also be calculated by year, half-year, quarter, month or week. For the selection condition on the Loading Date field we can choose between the less-than and less-than-or-equal-to operators.

Automatic Request Invalidation option: whenever we provide selection conditions in the variant, the selected data is locked for archiving while the write job runs. If an error occurs and the write job is cancelled, that data cannot be archived again because it is still locked. With the Automatic Request Invalidation option, the selected data is unlocked automatically so that it becomes available for archiving again. If the option is not selected, we can unlock the data manually in the Archiving tab of the DSO's Manage screen by invalidating the archiving request that was created along with the cancelled write job.

Note: The Archiving tab appears in the DSO's Manage screen after the Data Archiving Process (DAP) for that DSO has been created, saved and activated.

Processing Options: here it is necessary to select the Production Mode option; in Test Mode only a simulation of the archiving process takes place and no actual archive files are generated. Actual archive files are generated only in Production Mode.
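The relative selection described above is plain date arithmetic: the Loading Date cutoff is simply the run date minus the number of days entered. A minimal sketch (not SAP's implementation) using the article's example of 959 days:

```python
from datetime import date, timedelta

def loading_date_cutoff(run_date, older_than_days):
    """Relative selection: only records whose loading date is
    less than or equal to this cutoff qualify for archiving."""
    return run_date - timedelta(days=older_than_days)

# With a run date such that 959 days earlier falls on the
# article's cutoff of 07/20/2009:
print(loading_date_cutoff(date(2012, 3, 5), 959))  # → 2009-07-20
```

The same idea extends to the year, half-year, quarter, month and week options: each is just a different way of deriving the cutoff from the run date.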
Other settings: we can also select the type of log, its output type, and so on. Once these settings are done, we provide a description for our variant, save it, and click the Back button.
Step 4: Maintain start date and spool parameters for the variant
In the below screen we need to provide the Start date and spool parameters.
Start date: the start date option lets us schedule the archiving job. We can schedule it to run immediately, at a particular date and time, after a particular job finishes, after an event, or at an operation mode switch. In our example we schedule the job immediately by selecting the Immediate button and then clicking Save. The screenshot below shows the start date options.
Spool parameters: the spool parameters option lets us choose the print parameters for the archiving log. The screenshot below shows the different options.
After this we can start the archiving job by clicking the Execute button and then view the job logs by clicking the Job Logs button (see the screenshot above). In the screenshot below we can see that both the write job and the storage job have finished. The storage of the archive files happened automatically because we selected the Start Automatically option during the creation of the Data Archiving Process (DAP), as explained earlier in the article. We can also see the start and end times of the jobs, and the total time each job took to finish.
The screenshot below shows the job log of the archiving write process for the write-optimized DSO. The highlighted portion shows the archive file name and its location, as well as the number of records that fulfilled the selection criteria in the variant. In our case 1,213,959 records were selected for archiving. The job finished and the archive files were created.
Because archiving for a write-optimized DSO happens request by request, we can look at the Requests tab in the Manage screen of our DSO. The requests that have been selected for archiving have a clock symbol in the Request is Archived column. Since we specified a request loading date of less than or equal to 07/20/2009, all requests in this category carry the clock symbol, as the screenshot below shows.
Note: The fourth tab, the Archiving tab seen in the above screenshot, is generated when the Data Archiving Process (DAP) is created and activated for the DSO.
In the screenshot below we can view the data in the archived file. The columns of the DSO are listed vertically on the left, so each data record is displayed in vertical format.
Note: Creating indexes on the primary characteristic used for archiving improves the archiving job's performance by reducing the time required for archiving.
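The effect of that note can be illustrated outside SAP: selecting "records older than X" from an indexed (here, simply sorted) structure is a single binary search for the boundary instead of a scan over every record. This is a conceptual sketch only, with made-up sample data.

```python
from bisect import bisect_right
from datetime import date

# Loading dates of all requests in the DSO; keeping them sorted
# plays the role of the index on the archiving characteristic.
load_dates = sorted([date(2009, m, 1) for m in range(1, 13)] * 3)

cutoff = date(2009, 7, 20)

# Without an index: scan every record.
scan_hits = [d for d in load_dates if d <= cutoff]

# With an index: one binary search finds the boundary.
boundary = bisect_right(load_dates, cutoff)
index_hits = load_dates[:boundary]

print(len(scan_hits) == len(index_hits))  # → True
```

Both approaches select the same records; the indexed lookup just avoids touching the records that cannot qualify, which is where the time saving comes from on large tables.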
Step 2: Maintain archive selection, start date and spool parameters for the delete job
In the screen below we need to maintain three things: the archive selection, the start date and the spool parameters. In the archive selection we choose which of the archived files we want to delete. Click the Archive Selection button.
In the screen below we select for deletion the archive file that was created when we performed the archiving operation. We can also see that the status of the archive file is Write Completed.
After the above step we maintain the start date and the spool parameters in the same way as in the archiving write process described earlier. Once all three options have been maintained, we execute the delete operation and then view the logs, as shown in the screenshot below.
The screenshot below shows the deletion log. The highlighted part shows the selection condition used to delete the data from the DSO and the number of records deleted. The number, 1,213,959, matches the number of records we archived in the write operation described earlier. Because we used a write-optimized DSO, the selection conditions show the request numbers of the requests that fulfilled the deletion conditions. In a standard DSO the data is deleted from the active table; a write-optimized DSO has only one table, the active table, so the data is deleted from there.
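For a write-optimized DSO the unit of selection is the request: a request either qualifies as a whole or is left untouched. A sketch of that request-based selection, using hypothetical request numbers and record counts chosen so that the totals mirror the article's example:

```python
from datetime import date

# Hypothetical request administration data for a write-optimized DSO.
requests = {
    "REQU_A1": {"loaded": date(2009, 5, 2),  "records": 400_000},
    "REQU_B7": {"loaded": date(2009, 7, 20), "records": 813_959},
    "REQU_C3": {"loaded": date(2010, 1, 15), "records": 250_000},
}

cutoff = date(2009, 7, 20)

# Request-based selection: a request is deleted only as a whole.
to_delete = {rid: meta for rid, meta in requests.items()
             if meta["loaded"] <= cutoff}

deleted_records = sum(meta["records"] for meta in to_delete.values())
print(sorted(to_delete), deleted_records)  # → ['REQU_A1', 'REQU_B7'] 1213959
```

This is why the deletion log lists request numbers rather than arbitrary row-level conditions: the loading-date cutoff is evaluated per request, and every record of a qualifying request is removed.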
The screenshot below shows the Requests tab in the Manage screen of the write-optimized DSO. The requests from which the data was deleted now have a tick mark in the Request is Archived column. The data has therefore been deleted from the database according to the selection conditions, i.e. request by request.
SAP also warns us during the reload process by displaying the below popup message.
In the screen below we can view the job log of the reload job we just started. The highlighted portion shows the archive file name and the number of records reloaded back into the database. This number is the same as the number of records we archived and deleted from the database.
The details of the complete archiving process can also be seen in the Archiving tab of the DSO's Manage screen. In the screenshot below we see one archiving request and one reloading request there, along with the selection conditions, the number of records archived and reloaded, and other details. In the Request Type column the reload request is shown with a green arrow and the archiving request with a yellow arrow.
Related Content
http://help.sap.com/saphelp_nw04s/helpdata/en/8d/3e4d70462a11d189000000e8323d3a/frameset.htm
http://help.sap.com/saphelp_nw04s/helpdata/en/b7/78104292615833e10000000a155106/frameset.htm
http://help.sap.com/saphelp_nw04s/helpdata/en/2a/fa0391493111d182b70000e829fbfe/frameset.htm
Comment

Hi,

Step 1: Create an archiving object for your InfoCube in transaction RSDAP:
- General Settings: ADK-Based
- Selection Profile: 0CALDAY
- ADK: ARCHIVE_DATA_FILE

Enter the above information on each tab. In the file structure, enter a maximum size of 100 MB and a maximum of 10,000 records. Activate it.

Step 2: Create an archiving variant for the archiving object. This step is done in transaction SARA; I believe you know how to do this.

Regards,
Suman
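The file structure settings mentioned in the reply above cap each archive file at a maximum size and a maximum record count; whichever limit would be exceeded first closes the current file and starts a new one. A sketch of that partitioning logic (the limits and record sizes are illustrative, not the real ADK file format):

```python
def partition(records, max_bytes, max_records):
    """Split records into archive files, closing a file when either
    the size limit or the record-count limit would be exceeded."""
    files, current, current_bytes = [], [], 0
    for rec in records:
        size = len(rec)
        if current and (current_bytes + size > max_bytes
                        or len(current) + 1 > max_records):
            files.append(current)
            current, current_bytes = [], 0
        current.append(rec)
        current_bytes += size
    if current:
        files.append(current)
    return files

# 25 records of 40 bytes each, limits of 200 bytes / 10 records per file:
recs = [b"x" * 40 for _ in range(25)]
print([len(f) for f in partition(recs, 200, 10)])  # → [5, 5, 5, 5, 5]
```

Keeping files bounded this way is what makes the subsequent store, delete and reload jobs manageable: each job handles a predictable unit of work per file.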