SAP Data Services 4.1
February 2013
English

Migration Planned Independent Requirements (W46)

SAP AG
Dietmar-Hopp-Allee 16
69190 Walldorf
Germany

Business Process Documentation



Migration Planned Independent Requirements (W46): BPD

Copyright

© 2013 SAP AG or an SAP affiliate company. All rights reserved. No part of this publication may be reproduced or transmitted in any form or for any purpose without the express permission of SAP AG. The information contained herein may be changed without prior notice.

Some software products marketed by SAP AG and its distributors contain proprietary software components of other software vendors.

National product specifications may vary.

These materials are provided by SAP AG and its affiliated companies ("SAP Group") for informational purposes only, without representation or warranty of any kind, and SAP Group shall not be liable for errors or omissions with respect to the materials. The only warranties for SAP Group products and services are those that are set forth in the express warranty statements accompanying such products and services, if any. Nothing herein should be construed as constituting an additional warranty.

SAP and other SAP products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of SAP AG in Germany and other countries. Please see http://www.sap.com/corporate-en/legal/copyright/index.epx#trademark for additional trademark information and notices.

© SAP AG Page 2 of 24


Icons

Icon Meaning

Caution

Example

Note

Recommendation

Syntax

External Process

Business Process Alternative/Decision Choice

Typographic Conventions

Type Style | Description
Example text | Words or characters that appear on the screen. These include field names, screen titles, pushbuttons, as well as menu names, paths, and options. Cross-references to other documentation.
Example text | Emphasized words or phrases in body text, titles of graphics and tables.
EXAMPLE TEXT | Names of elements in the system. These include report names, program names, transaction codes, table names, and individual key words of a programming language when surrounded by body text, for example, SELECT and INCLUDE.
Example text | Screen output. This includes file and directory names and their paths, messages, source code, names of variables and parameters, as well as names of installation, upgrade, and database tools.
EXAMPLE TEXT | Keys on the keyboard, for example, function keys (such as F2) or the ENTER key.
Example text | Exact user entry. These are words or characters that you enter in the system exactly as they appear in the documentation.
<Example text> | Variable user entry. Pointed brackets indicate that you replace these words and characters with appropriate entries.


Contents

Migration Planned Independent Requirements
1 Purpose
2 Prerequisites
2.1 Preparation
3 Overview
3.1 IDoc Structure Represented in Data Services Content
3.2 Global Variable Usage and Default Values
3.2.1 Default Value Variables
3.2.2 Segment Execution Variables
3.2.3 Segment Population in the IDoc Variables
3.2.4 IDoc Control Record Information
3.2.5 Data Formatting Variables
3.3 Referenced Lookup Tables
4 Process Steps
4.1 Configure and Execute Lookup Delta Population Job (Optional)
4.2 Connect to Source (Legacy) Data
4.3 Map Legacy Data
4.4 Execute Object Processing Job
4.5 Review Invalid Output
4.6 Map Input Values in the Lookup Maintenance Application
4.7 Import Object to SAP ERP
4.8 View IDocs in SAP
4.9 Check IDoc Status
5 Troubleshooting
6 Appendix
6.1 Functions
6.1.1 Functions Used for Validation
6.1.2 Functions Used for Enrichment


Migration Planned Independent Requirements

1 Purpose

This scenario guide provides instructions on the Planned Independent Requirements migration process, beginning with the legacy data mapping through to the upload of the data to the target SAP system.

The SAP Data Services tool allows organizations to extract, transform, cleanse, map, and validate their legacy data for uploading to the SAP system using SAP NetWeaver IDoc technology.

The SAP Data Services tool provides a graphical user interface (GUI) development environment in which you define data application logic to extract, validate, transform, and load data from databases and applications into the required data migration formats. You can also use the tool to define logical paths for processing message-based queries and transactions from web-based, front-office, and back-office applications. The tool also allows you to view the errors that occur during the migration process.

This document applies to both 4.0 and 4.1 versions of SAP Data Services.

2 Prerequisites

2.1 Preparation

Ensure that you have completed all the steps specified in the following guides to set up the SAP Data Services and SAP environments for migration:

• Data Migration Quick Guide 4.0 or 4.1, depending on the content and software that you are using for the deployment

• IDoc Configuration Guide (W01_BDSX40_Config_Guide_EN_XX.doc)

3 Overview

Every data migration job developed for the SAP Rapid Data Migration to SAP ERP and SAP CRM for Data Migration package follows the same processing logic.

1. Mapping – Map the data from the legacy source (such as application, flat file, MS Excel) to the predefined mapping structure in the respective mapping Dataflow.

2. Validation – Validates the source legacy data in three ways:

a. Mandatory checks – ensures that mandatory columns are populated with values

b. Lookup checks – ensures that the legacy values are mapped to respective SAP values within the specific lookup table using the Migration Services application

c. Format checks – ensures that the legacy values are of the correct format (length, date format, decimal format) to load into SAP without further conversion

3. Enrichment – Changes the valid data that has passed the validation into the required values for the SAP load. For lookup columns, the legacy value is translated to the SAP value. Decimal and date-based columns are converted to the required SAP format. Enrichment columns that are left null are enriched with the respective default value.

4. Profiling – Profile the MAP tables' lookup columns for their distinct list of values. These values are transferred to the Migration Services application, populating the drop-down dialog boxes in the value mapping screens.

5. IDoc generation – Generate the IDoc with valid, enriched data and transfer it to SAP for loading.
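The per-segment validate-and-enrich logic described above can be sketched in Python. This is an illustrative sketch only, not the delivered Data Services logic; the field names (MATNR, WERKS, PDATU, VERSB), the lookup table, and the default value are hypothetical stand-ins:

```python
from datetime import datetime

# Hypothetical lookup table maintained in the Migration Services application:
# legacy value -> SAP configuration value.
PLANT_LOOKUP = {"LEGACY01": "1000"}

# Hypothetical default value applied during enrichment when a column is null.
DEFAULTS = {"VERSB": "00"}

def validate(row, mandatory, lookups):
    """Return validation errors for one legacy record (steps 2a-2c)."""
    errors = []
    for col in mandatory:                        # mandatory checks
        if not row.get(col):
            errors.append(f"{col}: missing mandatory value")
    for col, table in lookups.items():           # lookup checks
        if row.get(col) and row[col] not in table:
            errors.append(f"{col}: unmapped legacy value {row[col]!r}")
    if row.get("PDATU"):                         # format check: YYYYMMDD date
        try:
            datetime.strptime(row["PDATU"], "%Y%m%d")
        except ValueError:
            errors.append("PDATU: not in YYYYMMDD format")
    return errors

def enrich(row, lookups):
    """Translate legacy values to SAP values and fill defaults (step 3)."""
    out = dict(row)
    for col, table in lookups.items():
        if out.get(col):
            out[col] = table[out[col]]           # legacy -> SAP translation
    for col, default in DEFAULTS.items():
        if not out.get(col):
            out[col] = default                   # default-value enrichment
    return out

row = {"MATNR": "MAT-1", "WERKS": "LEGACY01", "PDATU": "20130201"}
assert validate(row, ["MATNR", "WERKS"], {"WERKS": PLANT_LOOKUP}) == []
assert enrich(row, {"WERKS": PLANT_LOOKUP})["WERKS"] == "1000"
```

Records failing validation would be routed to the invalid output for review, while valid, enriched records feed the IDoc generation step.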


Steps 1 to 4 are repeated for each segment within the IDoc, and then step 5 is executed once the valid data records have been created and enriched. The specifics for the object processing are contained in the following sections of this document.

3.1 IDoc Structure Represented in Data Services Content

The Planned Independent Requirements IDoc (ZREQUIREMENTS_CREATE01) represents a hierarchical structure comprising nested schemas, which themselves contain the fields required to load the various relational table structures in the target SAP application. The complete structure for loading Planned Independent Requirements data is represented in the IDoc, and as such it can be used to support the functionality required for both migration and interfacing requirements. As part of the SAP Rapid Data Migration to SAP ERP and SAP CRM for Data Migration development, we have limited the loading of the Planned Independent Requirements IDoc to the segments that are required to support an SAP Rapid Data Migration to SAP ERP and SAP CRM deployment. The segments covered by the delivered SAP Data Services content are shown in the following table and diagram. The table details the relationship (for example, 1 to many) and the mandatory or optional nature of the segments, along with their technical names and descriptions. The diagram graphically demonstrates the relationship of the segments in a hierarchical model to assist in the understanding of the IDoc structure.

Segment Level 1 | Segment Level 2 | Required (R) / Optional (O) | Min / Max # per IDoc | Description (Table Name in SAP)
Z1ZREQUIREMENTS_CREATE | | O | 1 / 1 | Header Segment
| Z1BPSSHDIN | O | 1 / 999999999 | Segment for communication fields: indep. reqmts schedule lines input

An IDoc is similar in structure to any other nested relational data structure, for example XML. It is hierarchical in nature and allows for segments to either be required or optional, as can be seen in the preceding table. The IDoc always contains a single 'root' level segment; in this case, a single Independent Requirements Header data record is represented in one IDoc. Below the 'root' level, the IDoc allows for one or more of the various lower-level segments, depending on the min and max values allowed for those segments. In the table above you can see that for Schedule Line data we can have between 1 and 999999999 segments below a single Header Data segment.

The diagram below shows how we can have one Header Data segment for Planned Independent Requirements A with two Schedule Line segments, 10 and 20.
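That two-level structure can be sketched as nested data: exactly one root (header) segment with one to many schedule-line 'leaf' segments below it. The field names such as MATERIAL and SCHED_LINE here are illustrative, not the delivered segment fields:

```python
# Illustrative sketch of the IDoc as a nested structure: one 'root' header
# segment (1 / 1) with schedule-line 'leaf' segments (1 / 999999999) below.
idoc = {
    "Z1ZREQUIREMENTS_CREATE": {               # 'root' segment, 1 / 1
        "MATERIAL": "A",
        "Z1BPSSHDIN": [                       # 'leaf' segments, 1 / 999999999
            {"SCHED_LINE": "10", "QUANTITY": "100"},
            {"SCHED_LINE": "20", "QUANTITY": "150"},
        ],
    }
}

def schedule_line_count_ok(idoc, min_n=1, max_n=999999999):
    """Check the allowed number of schedule-line segments below one header."""
    lines = idoc["Z1ZREQUIREMENTS_CREATE"]["Z1BPSSHDIN"]
    return min_n <= len(lines) <= max_n

assert schedule_line_count_ok(idoc)
```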


Diagram 1: Planned Independent Requirements IDoc structure delivered by standard SAP Rapid Data Migration to SAP ERP and SAP CRM content

The diagram shows how the segments of the Planned Independent Requirements IDoc are related in a hierarchical tree structure. You can see from the diagram how the IDoc starts from the 'root' segment of the Independent Requirements Header data at level 1 and moves down to further child segments at level 2 (Schedule Lines). The child segment at level 2 is what we class as a 'leaf' segment (it has no further segments below it). The 'root' segment is always the top level of the IDoc structure, and the 'leaf' segments are always the lowest level of the IDoc when navigating down a specific branch of the IDoc.

3.2 Global Variable Usage and Default Values

The following section details the use of global variables within the SAP Data Services Planned Independent Requirements IDoc migration content Job - Job_DM_PlannedIndependentRequirements_IDoc.

You can increase the flexibility and reusability of Jobs, Work Flows, and Data Flows by using local and global variables when you design your jobs. Variables are symbolic placeholders for values. The data type of a variable can be any supported by the software, such as integers, decimals, dates, or text strings.

We use global variables within the SAP Rapid Data Migration to SAP ERP and SAP CRM development Jobs rather than local variables to allow for the variable values to be passed in from the Migration Services application (only global variables are exposed at the Job level).

The global variables are used in a number of ways within the SAP Rapid Data Migration to SAP ERP and SAP CRM Jobs, as described below:

• Default values – variables used to define default values for certain fields in the segments when no value is provided by the source mapping (Company Code or Purchase Organization).

• Segment execution – variables used to determine whether a segment is required to be executed for that run of the Job to perform the mapping, validation, and enrichment steps.

• Segment creation in the IDoc – variables used to determine whether you wish to create the IDoc structure with the specific segment populated.

• IDoc control record information – variables used to provide the EDI_DC40 control record values that change depending on the target SAP application.


• Data formatting – variables used to determine what format is used to convert data from the source (legacy) data formats to that expected by the target SAP application (for example, date field conversion, decimal conversion, and so on).

The following tables define the global variables for the Planned Independent Requirements IDoc Job.

3.2.1 Default Value Variables

Provides a default value for a field that has either not been mapped to the legacy application or contains a NULL value.

Name | Data Type | Description | Default Value
$G_LoadDate | Datetime | Used to provide a consistent timestamp for a single execution of the Job | Current date and time at the start of the Job execution
$G_ProfileMapTables | Varchar(1) | Defines whether to execute the profile MAP table data step | 'Y'
$G_SourceData_Path | Varchar(255) | Points to the location of the test data referenced by each mapping Data Flow | 'C:\Migration_ERP\Source_Files_DI_IN\'
$G_SourceId | Varchar(10) | Identifies the current executor for this job | 'TEST_SRC1'
$G_Stop_Job_If_Invalid_Data_Exists | Varchar(1) | If invalid data exists in the table, the job stops | 'Y'

3.2.2 Segment Execution Variables

These variables specify whether a segment is processed as part of the specific Job execution. This runs the mapping, validation, and enrichment steps for the specified segment.

Name | Data Type | Description | Default Value
$G_HeaderData_Req | Varchar(1) | Independent Requirements Header segment | 'Y'
$G_Schedule_Req | Varchar(1) | Independent Requirements Schedule Lines segment | 'Y'
$G_GenerateIDoc_Req | Varchar(1) | Defines whether to execute the IDoc generation step | 'N'

3.2.3 Segment Population in the IDoc Variables

These variables specify whether a segment is populated in the IDoc for a specific Job execution. These variables are used in the generate IDoc Data Flow in both the mappings (parent segments) and the where clauses (leaf segments), to determine whether a field or a segment, respectively, is populated.


Name | Data Type | Description | Default Value
$G_HeaderData_IDoc_Req | Varchar(1) | Independent Requirements Header segment | 'Y'
$G_Schedule_IDoc_Req | Varchar(1) | Independent Requirements Schedule Lines segment | 'Y'

3.2.4 IDoc Control Record Information

These variables provide values for the IDoc control segment. The client, receiver partner number, and receiver port may require modification.

Name | Data Type | Description | Default Value
$G_Client | Varchar(6) | SAP Client to be loaded by the IDoc | '181'
$G_IDocType | Varchar(60) | IDoc type of the Planned Independent Requirements IDoc | 'ZREQUIREMENTS_CREATE01'
$G_MessageType | Varchar(60) | Message type of the Planned Independent Requirements IDoc | 'ZREQUIREMENTS_CREATE'
$G_ReceiverPartnerNumber | Varchar(20) | <System Id>CLNT<Client number> is the default format | 'RDECLNT181'
$G_ReceiverPort | Varchar(20) | SAP<System Id> is the default format | 'SAPRDE'
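As a worked example of the default formats in the table, the receiver partner number and port can be derived from a system ID and client. The control record field names below (MANDT, IDOCTYP, MESTYP, RCVPRN, RCVPOR) follow the standard EDI_DC40 segment, but treat this mapping as a sketch rather than the delivered content:

```python
# Sketch: deriving EDI_DC40 control record values from the global variables
# above; "RDE" and "181" are the sample system ID and client from the table.
system_id, client = "RDE", "181"

control_record = {
    "MANDT":   client,                          # $G_Client
    "IDOCTYP": "ZREQUIREMENTS_CREATE01",        # $G_IDocType
    "MESTYP":  "ZREQUIREMENTS_CREATE",          # $G_MessageType
    "RCVPRN":  f"{system_id}CLNT{client}",      # $G_ReceiverPartnerNumber
    "RCVPOR":  f"SAP{system_id}",               # $G_ReceiverPort
}

assert control_record["RCVPRN"] == "RDECLNT181"
assert control_record["RCVPOR"] == "SAPRDE"
```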

3.2.5 Data Formatting Variables

Provides a format that is used to validate and convert the legacy data of that type to the expected SAP data format.

Name | Data Type | Description | Default Value
$G_Date_Format | Varchar(20) | Specifies the default format in which dates are migrated from the legacy application if coming in from a file or a non-date format field in a database | 'YYYYMMDD'
$G_Time_Format | Varchar(10) | Specifies the default format in which times are migrated from the legacy application if coming in from a file or a non-time format field in a database | 'HHMISS'
$G_Decimal_Separator | Varchar(1) | Specifies what decimal separator is being used in the legacy application if the decimal data is coming from a file or a non-decimal format field | '.'
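A minimal sketch of how these formatting variables drive conversion, assuming a legacy source that uses YYYYMMDD dates and a comma decimal separator; the function names are illustrative, not delivered functions:

```python
from datetime import datetime

# Illustrative settings mirroring the table above (comma separator assumed
# for a hypothetical European legacy source).
G_DATE_FORMAT = "YYYYMMDD"        # $G_Date_Format
G_DECIMAL_SEPARATOR = ","         # $G_Decimal_Separator of the legacy source

def to_sap_date(value):
    """Validate a legacy date against YYYYMMDD; raises ValueError if invalid."""
    datetime.strptime(value, "%Y%m%d")
    return value

def to_sap_decimal(value):
    """Normalize the legacy decimal separator to the SAP-expected '.'."""
    return value.replace(G_DECIMAL_SEPARATOR, ".")

assert to_sap_date("20130201") == "20130201"
assert to_sap_decimal("1234,56") == "1234.56"
```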

3.3 Referenced Lookup Tables

The following table defines which lookup tables are required to be mapped between the legacy values and the SAP configuration data values using the Migration Services application. This information is also included against each individual field requiring a lookup table in the object's mapping spreadsheets.

Object | Lookup Tables | Comment
Independent Requirements Header | External Requirements Types, Material Number, MRP Area, Plant, Reference Type, Version Number for Independent Requirements | Reference the mapping spreadsheets for the correct column-to-lookup-table relationships.
Independent Requirements Schedule Lines | BOM Explosion Number, Material Number, Plant, Production Versions of Material, Unit of Measurement, Version Number for Independent Requirements | Reference the mapping spreadsheets for the correct column-to-lookup-table relationships.


4 Process Steps

4.1 Configure and Execute Lookup Delta Population Job (Optional)

Use

A delta job, Job_DM_Lookups_SAPToApp, provides logic to import any changes to the configuration data in the SAP application. This job is only required once the target SAP application is installed and configured with changes (either new codes or text changes) to the standard delivered SAP Rapid Data Migration to SAP ERP and SAP CRM Baseline configuration. The job extracts the latest configuration table contents from the SAP system and joins them back to the already populated lookup mapping tables in the lookup maintenance application and the underlying database structure.

If the DS_SAP Datastore has not been correctly configured to connect to the new SAP system, the job may fail to execute correctly. You then have to resolve any issues with the Datastore connectivity and rerun the job.
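The delta logic can be pictured as a join of freshly extracted SAP configuration codes against the existing lookup mappings. This is a simplified sketch with illustrative table contents, not the delivered job's schema:

```python
# Simplified sketch of the delta join: the latest configuration codes pulled
# from SAP are joined back to the already populated lookup mapping table.
# Existing legacy-to-SAP mappings survive; only mappings whose SAP code no
# longer exists would drop out, and new codes become available targets.
existing_lookup = {"LEGACY01": "1000"}        # legacy value -> SAP plant code
sap_config_codes = ["1000", "2000"]           # latest codes extracted from SAP

valid_targets = set(sap_config_codes)
merged = {legacy: sap for legacy, sap in existing_lookup.items()
          if sap in valid_targets}

assert merged == {"LEGACY01": "1000"}         # existing mapping preserved
assert "2000" in valid_targets                # new code now selectable
```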

Procedure

1. Access the Data Services Designer by choosing the following option:

System - Host | <BOE Server Name: BOE Server Port>
User name | Administrator or BOE assigned user name
Password | Administrator password or assigned BOE password
Authentication | Enterprise

2. Choose the Log on button and then select the repository and choose OK .

Repository DM_REPO_ERP 

3. In the dialog box, enter the repository password: Welcome1, and choose OK.

4. From the Local Object Library, open the DM_BPFDM_Lookups project.

5. From the Project Area, right-click Job_DM_Lookups_SAPToApp and choose Execute. If the system displays a prompt to save all changes and execute, choose OK.

6. On the Execution Properties dialog box, choose the Global Variable tab and perform the following:

Field name | User action and values | Comment
$G_Path | 'C:\\Migration_ERP' | Ensure that $G_Path points to 'C:\\Migration_ERP' or a location in which you plan to keep your data migration files.
$G_Global_Language | 'E' | Global language
$G_Local_Language | 'D' | Local language
$G_Default_String | 'K' | Default legacy field value in lookup content
$G_GetDataFromObjectList | 'N' | Object list indicator
$G_Runtime_ObjectListFile_Name | 'OBJECTS_RUNTIME.csv' | Object list file name
$G_GetDataFromTXTFile | 'Y' | Lookup list indicator
$G_Runtime_TxtFile_Name | 'LOOKUPS_RUNTIME.csv' | Lookup list file name
$G_GetDataFromVariable | 'N' | Variable indicator
$G_ObjectAndLookupAndChkTable | | Object Name

• $G_LoadDate captures the date and time at the beginning of the session. You do not enter any value here; the tool defaults the value to the system date and time.

These values are effective only for the current runtime. To set the default values for each execution, right-click the Job in the Project Area and choose Properties; the global variables are available on a tab in this screen and can be set as above.

7. To find the statistics of a job, choose the Monitor button. To check the log, choose the Log button. Ensure that there are no errors and that the job has completed successfully before you move on to the next step.

Result

The job extracted the latest configuration table contents from the SAP system and joined them back to the already populated lookup mapping tables in the lookup maintenance application and the underlying database structure.

4.2 Connect to Source (Legacy) Data

Use

In this activity, you connect all your source (legacy) data into the Data Services Designer.

Procedure

1. Access the Data Services Designer by choosing the following option:

System - Host | <BOE Server Name: BOE Server Port>
User name | Administrator or BOE assigned user name
Password | Administrator password or assigned BOE password
Authentication | Enterprise

2. Choose the Log on button and then select the following repository and choose OK.

Repository DM_REPO_ERP 

3. In the dialog box, enter the repository password: Welcome1, and choose OK.

Option 1: Connecting to TXT files


1. In the local object library, choose the Formats tab at the bottom-left portion of the screen, right-click Flat Files, and select New.

2. In Type, specify the file type. Select Delimited if the file uses a character sequence to separate columns. Select Fixed width if the file uses specified widths for each column.

3. Under General, in Name, enter a name that describes this file format template, for example, Planned_Independent_Requirements.

4. After you save this file format template, you cannot change the name. If you would like to change the name or reuse the file template, right-click the file in the Local Object Library and choose Replicate.

5. Under  Data Files:

• In Root Directory , browse to the folder where you have your text file stored.

• In File name(s), browse to the text file that you would like to run.

6. Under  Delimiters:

• In Column, select the required delimiter for your file.

• In Text , select a text delimiter if one is used to define the text fields in your file.

The “ value in the Text field could create a problem when you map the inch (“) values in the Unit of Measure lookup table.

7. Under  Input/Output :

• In Skipped rows, enter the number of rows that you would like to skip from your source file.

• In Skip row header , select Yes if your file contains the column headings in the first row.

• In Write row header, select Yes if you are going to write the file out again using the same file format and would like to preserve the column headings in the first row.

When you change the preceding settings, you may be prompted to overwrite the current schema with the schema from the file you selected. Choose Yes for the tool to automatically create the required schema for your flat file.

8. Edit the field names, data types, and field sizes as required for the source file that you plan to import.

9. Choose Save and Close to save the file format template and close the File Format Editor .

10. The system displays your newly created file format under Flat Files in the Local Object Library  section.

To check on the flat file definition, right-click the newly created flat file template andselect View Data.
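For reference, the flat-file settings above (column delimiter, text delimiter, header row in the first line) correspond to a plain delimited read outside Data Services. The file name and columns below are illustrative, not the delivered test data:

```python
import csv

# Illustrative delimited file: column delimiter ';', text delimiter '"',
# column headings in the first row (Skip row header = Yes).
sample = 'MATERIAL;PLANT;QUANTITY\n"MAT-1";"1000";"100"\n'
with open("planned_independent_requirements.txt", "w", newline="") as f:
    f.write(sample)

with open("planned_independent_requirements.txt", newline="") as f:
    reader = csv.reader(f, delimiter=";", quotechar='"')
    header = next(reader)                      # skip the column-heading row
    rows = [dict(zip(header, r)) for r in reader]

assert rows == [{"MATERIAL": "MAT-1", "PLANT": "1000", "QUANTITY": "100"}]
```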

Option 2: Connecting to XLS files

1. In the local object library, choose the Formats tab at the bottom-left portion of the screen, right-click Excel Workbooks, and select New.

2. In the Import Excel Workbook dialog box:

• In Format name, enter a name that describes this file format template, for example, Planned_Independent_Requirements.

After you save this file format template, you cannot change the name. If you would like to change the name, you can only right-click the file in the Local Object Library, choose Delete, and create a new one.

© SAP AG Page 13 of 24

Page 14: w46 Bdsx40 Bpd en Xx

7/27/2019 w46 Bdsx40 Bpd en Xx

http://slidepdf.com/reader/full/w46-bdsx40-bpd-en-xx 14/24

Migration Planned Independent Requirements (W46): BPD

• In Directory , browse to the folder where you have your Excel file stored.

• In File name, browse to the file from which you would like to extract.

• If your data is contained within a named range, select the Named range button and choose the named range you wish to use from the dropdown box.

If the first row contained within the named range contains the column headings, select the Use first row values as column names checkbox and choose the Import Schema button. Otherwise, choose the Import Schema button and change the field names to the desired names.

• If your data is contained in a worksheet, choose the Worksheet button and then select the worksheet you wish to use from the dropdown box.

• Select Custom range, then choose . Select the header row within the Excel sheet andclose the file. The range is populated in the Custom range field. Then select the Extend range checkbox.

• Select Use first row values as column names checkbox if the first row contains the columnnames.

• Choose the Import Schema button. In the schema that is populated, change the Data Type to varchar and the Field Size to 255 for all fields if you are unsure of the data type and size of the fields that you plan to import.

• Choose OK .

3. Your newly created file format is shown under Excel Workbooks in the Local Object Library  section.

Option 3: Connecting to the Database

1. In the Local Object Library, choose the Datastores tab at the bottom-left portion of the screen, right-click inside the window, and select New.

2. In the Create New Datastore dialog box, enter the following:

• In Datastore name, specify the name you wish to give your datastore.

• In Datastore type, select Database.

• In Database type, select one of the available databases from the list. Depending on the Database type you choose, the connection information may differ. For example, if you choose ODBC from the Database type selection, the system prompts you to choose the Data source from the available ODBC data sources defined on the machine. If needed, you can choose the ODBC Admin button to create a new ODBC entry. Enter the User name and Password required for the connection to the source.

After you save the datastore, you cannot change the name. Do not include development tier information (DEV or TEST) in the name, as you cannot change it if you change the connection information in your datastore.

3. Choose the Advanced button if you wish to set any of the additional parameters for the datastore. Refer to the technical manuals provided with the Data Services installation for more information on these settings.

4. Choose Apply to check that the connection information is correct and then choose OK to savethe new datastore.

5. In the Local Object Library (on the bottom left), select the new datastore you have just created,right-click, and choose Open.

6. In the window (on the right), the datastore tables are displayed if the correct connection parameters have been supplied.


7. Select all the tables that you require as a source for mapping, right-click, and choose Import/Reimport to import the metadata for each of the tables into the datastore.

8. Expand the datastore (in the Local Object Library), and then expand Tables to ensure that the table metadata has been imported.

When you open a datastore, both views and tables within a database are visible, and they are then available for import.
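As a loose analogue of the metadata import, the catalog query below lists the tables and views a database exposes. SQLite stands in for the source database here, and the table and column names are illustrative:

```python
import sqlite3

# An in-memory SQLite database plays the role of the legacy source database;
# the catalog query plays the role of listing importable tables and views.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE legacy_requirements (material TEXT, plant TEXT)")

tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type IN ('table', 'view')")]
assert tables == ["legacy_requirements"]

# Column metadata for one "imported" table:
cols = [c[1] for c in conn.execute("PRAGMA table_info(legacy_requirements)")]
assert cols == ["material", "plant"]
```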

Result

Source (legacy) data is now connected into the Data Services Designer and ready for mapping to the required segments in the IDoc structure represented in the mapping Data Flows.

4.3 Map Legacy Data

Use

In this activity, you map your legacy data to the SAP object in the Data Services Designer.

Procedure1. Access the Data Services Designer by choosing the following option:

System - Host: <BOE Server Name: BOE Server Port>
User name: Administrator or BOE assigned user name
Password: Administrator password or assigned BOE password
Authentication: Enterprise

2. Choose the Log on button, select the repository, and choose OK.

Repository: DM_REPO_ERP

3. In the dialog box, enter the repository password: Welcome1, and choose OK.

4. Open your project DM_BPFDM_IDOC and select the Job Job_DM_PlannedIndependentRequirements_IDoc. Expand the Job and expand the Conditional node PlanIndReqHeaderData_Z1ZREQUIREMENTS_CREATE_Required. Select the Dataflow object DF_DM_PlanIndReqHeaderData_Map. Delete the source test data Excel Workbook from the Dataflow in the Design Area (on the right). To do so, right-click the icon and select Delete from the displayed dropdown menu.

We recommend that you replicate the original Data Flow and replace the original Data Flow within the Job before changing the new Data Flow. This allows you to receive fixes and updates to the code from SAP without overwriting the objects and code you have developed yourself.

5. From the Object Library (on the bottom left), select the source table, file format template, or Excel sheet that you wish to use as your source, Planned_Independent_Requirements in this example. Drag the Planned_Independent_Requirements file and drop it on the Dataflow to make it the source. Draw a connector between the source object and the Qry_BestPractices query object.


To add fields that are not included in the Qry_BestPractices query transform, delete the Qry_BestPractices transform and map the source table or file directly to the Qry_AllFields transform. Ensure that all fields mapped to the previous Qry_BestPractices transform fields are either set to NULL or mapped to the new source table or file fields.

6. If you are using a flat file as your source, double-click the source object. In the dialog box that appears, under General, in Adaptable Schema, choose Yes for the field size to be compatible with the legacy data. This allows you to work with flat files generated from Excel.

7. Double-click Qry_BestPractices under DF_DM_PlanIndReqHeaderData_Map on the left. On the right side of the screen, the system displays the source data (input schema) on the left and the target structure (output schema) on the right. Map the source file (input schema) to the target table (output schema).

If you have deleted the Qry_BestPractices query transform, double-click Qry_AllFields transform and map the source fields (input schema) to the target table (output schema).

8. Go back to the Dataflow screen. Choose the Display button on the source object to see the data. If the data appears truncated, you can modify any of the data types to Varchar and manually specify the size of the data expected on input. To do so, right-click the Planned_Independent_Requirements file format in the Object Library and choose Edit. Now manually change the data type and field size for the fields that you want to be correctly displayed.

To change multiple fields in one go, highlight all fields and choose Properties; this allows you to set the data type and size of the columns in one step.

9. If the number of rows displayed is less than the number of rows in your text file, from the application menu, choose Tools → Options. In the Options dialog box, choose Designer → General on the left. In View data sampling size (rows), increase the number of rows that you would like to be displayed and choose OK.

10. Repeat steps 4 through 8 for all the subobjects you require to migrate within the Job Job_DM_PlannedIndependentRequirements_IDoc; see the following table:

Sub Object: IndependentRequirements Header
Conditional: PlanIndReqHeaderData_Z1ZREQUIREMENTS_CREATE_Required
Data Flow: DF_DM_PlanIndReqHeaderData_Map

Sub Object: IndependentRequirements ScheduleLines
Conditional: PlanIndReqSchedule_Z1BPSSHDIN_Required
Data Flow: DF_DM_PlanIndReqSchedule_Map

Result

The legacy data is now mapped to the SAP object in the Data Services Designer.

4.4 Execute Object Processing Job

Use

In this activity, you execute the Planned_Independent_Requirements processing job for validation and enrichment, without yet loading the data into the SAP system.


Procedure

1. Access the Data Services Designer by choosing the following option:

System - Host: <BOE Server Name: BOE Server Port>
User name: Administrator or BOE assigned user name
Password: Administrator password or assigned BOE password
Authentication: Enterprise

2. Choose the Log on button, select the repository, and choose OK.

Repository: DM_REPO_ERP

3. In the dialog box, enter the repository password: Welcome1, and choose OK.

4. From the Project Area, right-click the job you wish to execute, for example Job_DM_PlannedIndependentRequirements_IDoc, and choose Execute. If the system displays a prompt to save all changes and execute, choose OK.

5. On the Execution Properties dialog box, choose the Global Variable tab and perform the following:

• For the segments that you would like to execute, under Value, enter ‘Y’ (for example, $G_HeaderData_Req, $G_Schedule_Req).

• Ensure that $G_Path points to ‘C:\\Migration_ERP’ or the location where your data migration files exist.

• $G_LoadDate captures the date and time at the beginning of the session. Do not enter any information here; the tool defaults the value to the system date and time.

• Ensure that the global variables for the default values to be used in the enrichment rules are set to the values you require. See section Error: Reference source not found for the complete list of variables.

These values are effective only for the current runtime. Right-click the Job and choose Properties and then the Global Variable tab to set the default values to be used for all executions.

The default date format is set to ‘YYYYMMDD’. If the date format in your legacy data is in a different structure, set the $G_Date_Format variable to match the legacy data date format. For example, if 31072009 is the date format of your legacy data, set $G_Date_Format to ‘DDMMYYYY’; if 07312009 is the date format of your legacy data, set $G_Date_Format to ‘MMDDYYYY’.

• Ensure that the global variable $G_GenerateIDoc_Req is set to ‘N’ to run the validation and enrichment steps only, without sending the data via the IDoc to the SAP target system.
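The $G_Date_Format handling described above can be sketched in a few lines. This is a hypothetical Python illustration, not Data Services script; the format names follow the document's ‘DDMMYYYY’-style convention:

```python
from datetime import datetime

# Map the document's date-format names to Python strptime patterns.
# Only the three formats mentioned above are covered.
FORMATS = {"YYYYMMDD": "%Y%m%d", "DDMMYYYY": "%d%m%Y", "MMDDYYYY": "%m%d%Y"}

def normalize_date(value, legacy_format="YYYYMMDD"):
    """Parse a legacy date string and return it in the default YYYYMMDD form."""
    parsed = datetime.strptime(value, FORMATS[legacy_format])
    return parsed.strftime("%Y%m%d")
```

For example, `normalize_date("31072009", "DDMMYYYY")` returns `"20090731"`.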

Result

To find the statistics of a job, choose the Monitor button. To check the log, choose the Log button. Ensure that there are no errors and that the job completed successfully before you move on to the next step.

For information on the custom functions applied during the validation and enrichment steps of the processing, refer to section 6.1 in the Appendix.


4.5 Review Invalid Output

For information on reviewing invalid output using the BOE reports delivered as part of the Data Migration content, see the Reporting and Visualization document.

Procedure

1. Once the object processing Job has completed successfully, you can drill down to DF_DM_<ObjectName>_Validate.

2. In the right-hand work area, the job is displayed. A small magnifying lens is visible in the lower right corner of the invalid records container box.

3. Choose the magnifying glass on the invalid records container.

4. In the bottom area, the system displays an overview of all records that failed one or more validations. Choose Save to download the complete list of failed records, including a status field that indicates the reason for each failed record.

5. You can either download the file with invalid records or look at the errors directly within DI. Within DI, you can scroll to the right to find the status column, or alternatively choose the Show/Hide Columns button, select Status, and deselect all others.

6. The system displays an overview of all records, and you can see where the validation failed.

To filter the data records, select the data in a cell, then right-click and choose either Filter or the more specific column name = value Filter option to apply that cell value as a filter for the data set. If the data set you are processing is larger than the sample dataset value, choose the Refresh button to apply the filter to the whole dataset. Data can be ordered by clicking on the column headings.

4.6 Map Input Values in the Lookup Maintenance Application

Use

In this activity, you use the lookup maintenance application to map input values from the legacy system to SAP structural configuration data.

Procedure

1. Open the Lookup Maintenance Application:

• http://<Data Services Server>:8080/MigrationServices

If you have deployed the Lookup Maintenance application on a different Tomcat environment (for example, SAP BusinessObjects Business Intelligence platform rather than Data Services), replace <Data Services Server>:Port in the previous step with the host and port on which you have deployed the application.

2. Within the lookup maintenance application, select the staging area that your Data Services environment is using for the DS_STG_MGMT_LKP datastore and choose the Lookup Maintenance button.

If you do not have a staging area defined or a setup for your staging database, refer to the help file embedded in the application, in the top right-hand corner of the application screen.

3. The lookup maintenance screen displays an object, view, and lookup table list. From the Object dropdown list, select the object you are working on, and then select the View Name for the segment that you are working on.

You can now see which lookup tables require mapping by examining the status column on the right of the application screen. If the status shows a yellow triangle, there are unmapped values in the legacy data that must be mapped to a corresponding SAP value.

4. Choose the lookup table hyperlink to open the specific lookup table.

The Legacy Value table is populated by the profiling step in the Data Services object Job, and displays all values found in the specific columns that use the configuration table in SAP to limit the possible load values.

5. You map the legacy values from the Legacy Value table by dragging and dropping the legacy value onto the corresponding Legacy_<Lookup> column in the lookup table, or by typing the value directly into the dialog box. If you drag a legacy value onto a row that already has a value mapped, the application duplicates the row and maps the new value while also leaving the old value mapped.

To filter the values displayed in the lookup mapping table, choose the Filter button in the column headings. An additional set of filter boxes (one for each column) is displayed. You can enter a value and press Enter to filter the displayed rows. To remove the filter, choose the Remove Filter button, or clear all values from the filter boxes and press Enter.

6. Once you have mapped all values from the Legacy Value table, they are displayed with a green circle next to them. Choose the Save Data button at the bottom of the screen to commit the mappings to the staging database lookup table.

To map multiple legacy system values to the same SAP lookup value, you can replicate a row. To replicate a value, choose the Duplicate button to the right of the row. To delete a row you have created, choose the Delete button.

To revert to the original state of the lookup mapping table, choose the Revert Changes button to reload the lookup maintenance table.

Result

The lookup table maintenance tool manipulates the data directly in the lookup tables in the database defined in the DS_STG_MGMT_LKP and DS_STG_MGMT_LKP_INIT datastores. There is no need to run any further Data Services jobs to upload the changes made to the data in the application.
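Conceptually, each lookup table maintained above is a set of rows pairing a legacy value with an SAP value, and one SAP value may appear in several rows (the duplicated-row case). A minimal Python sketch, with invented example values:

```python
from typing import Optional

# Invented example rows; the real tables live in the DS_STG_MGMT_LKP staging
# database and are maintained through the application, not in code.
lookup_rows = [
    {"LEGACY_VALUE": "A1", "SAP_VALUE": "VSF"},
    {"LEGACY_VALUE": "A2", "SAP_VALUE": "VSF"},  # second legacy value, same SAP value
    {"LEGACY_VALUE": "B1", "SAP_VALUE": "VSE"},
]

def map_value(legacy: str) -> Optional[str]:
    """Return the SAP value mapped to a legacy value, or None if unmapped."""
    for row in lookup_rows:
        if row["LEGACY_VALUE"] == legacy:
            return row["SAP_VALUE"]
    return None
```

An unmapped legacy value (the yellow-triangle status in the application) corresponds to `map_value` returning `None` here.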

4.7 Import Object to SAP ERP

Use

When you have succeeded with the previous steps, and your data has been validated and enriched successfully, you can follow the instructions in this section to import your data into SAP.


Procedure

1. Access the Data Services Designer by choosing the following option:

System - Host: <BOE Server Name: BOE Server Port>
User name: Administrator or BOE assigned user name
Password: Administrator password or assigned BOE password
Authentication: Enterprise

2. Choose the Log on button, select the following repository, and choose OK.

Repository: DM_REPO_ERP

3. In the dialog box, enter the repository password: Welcome1, and choose OK.

4. From the Project Area, right-click the job you wish to execute, for example Job_DM_PlannedIndependentRequirements_IDoc, and choose Execute. If the system displays a prompt to save all changes and execute, choose OK.

5. On the Execution Properties dialog box, choose the Global Variable tab and perform the following:

• Use the same variable values as you used for the validation run.

• Ensure that the global variable $G_GenerateIDoc_Req is set to ‘Y’ to generate the IDocs and run the Master Data upload.

If the content is good and you do not want to reextract data from the source (that is, you do not need to reexecute the mapping, validation, or enrichment processes for the objects you are migrating), set the section global variables to ‘N’ to ensure that they are not executed.

• Ensure that the global variables that control the IDoc segment population (if utilized for this object; see section 3.2.3) are set to ‘Y’ if the segment is to be transferred to SAP (such as $G_HeaderData_IDoc_Req, $G_Schedule_IDoc_Req).

Result

The legacy data is now uploaded into the SAP system via the IDoc mechanism. When you log on to the SAP system, you can also track the upload via transaction WE05.

4.8 View IDocs in SAP

Use

The following instructions lead you through the steps of viewing the IDocs.

Procedure

1. On the SAP Easy Access screen, enter transaction code WE05.

2. In the IDoc List screen, enter the date for which you want to see the IDocs that were created and choose Execute.

3. In the following screen, you see the list of IDocs that were created.

4.9 Check IDoc Status

After you have completed section 4.7, you can use the IDoc status checking code delivered by SAP Rapid Data Migration to SAP ERP and SAP CRM for Data Migration to check the current status of the IDocs in the target SAP environment. For more information about this process, see the IDoc Status Check document (DataMigration_IDoc_Status_Check_EN_US.doc) on the Step-by-Step Guide under Deploy > Activate Solution > Business Process Documentation. The functionality delivered populates the MGMT_IDOCSTATUS table in the staging area database defined in the DS_STG_MGMT_LKP datastore, allowing you to build reports on top of this table for the required objects.

Prebuilt reports on top of the MGMT_IDOCSTATUS table are delivered as part of the BOE content package for Data Migration.


5 Troubleshooting

• Version Incompatibility

Data Services Designer only imports the project file (atl) if the file version is the same as or lower than the Data Services version that you are running on your machine. If the imported project file does not match the Data Services version that you are running, the system displays an error message. You must upgrade your Data Services installation.

• When the connections do not succeed

If the datastore connection to the application does not work, the datastore is not saved.

• Tables not available for import

If the tables you are trying to import do not appear in the Datastore when you open it, it may be due to permission issues on the database for the user that you are connecting with. Ensure that the user has read access to the tables and the data dictionary where appropriate.

• SQL errors when trying to generate the IDoc

If you receive SQL errors stating that the tables do not exist in the staging database when trying to run the Generate IDoc Data Flow, this may happen because not all sections have been executed, even if you are not supplying data for a section. Refer to the Quick Guide regarding the running of all segments prior to mapping and using the Jobs for migration.


6 Appendix

6.1 Functions

The validation and enrichment of the object-related data is performed by both prebuilt and custom functions written in Data Services. The standard prebuilt functions, such as ifthenelse and lookup_ext, are not covered in this guide; they are documented in the Data Services Technical Manuals. The custom functions are discussed in detail below so that an understanding can be gained of their functionality, usage, and purpose.

6.1.1 Functions Used for Validation

Function: is_valid_date (legacy date field [input], expected date format [input] ($G_Date_Format))

Return: ‘1’ = valid date, ‘0’ = invalid date

Where used: all input date fields.

Example of the function within the format validation transforms:

is_valid_date(table.field1, $G_Date_Format) = 1

Description: The standard function is_valid_date checks whether the varchar value being passed in matches the format specified in the second input parameter ($G_Date_Format). If it does, the return is ‘1’; otherwise the return is ‘0’. The default value for the $G_Date_Format variable is ‘YYYYMMDD’, but this can be changed in the initialization script at the start of the Job or within the parameter management screens in the Migration Services application.

Note: If the legacy application supplies the date from a database date field, this validation can be turned off and a straight mapping applied, as the date validation is already enforced by the source system.
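A rough Python equivalent of the behavior described above (a sketch only; the actual Data Services function is documented in the Technical Manuals):

```python
from datetime import datetime

# The document's format names mapped to strptime patterns (assumed subset).
_PATTERNS = {"YYYYMMDD": "%Y%m%d", "DDMMYYYY": "%d%m%Y", "MMDDYYYY": "%m%d%Y"}

def is_valid_date(value, date_format="YYYYMMDD"):
    """Return 1 if value matches the expected format, else 0, mirroring
    the '1'/'0' return convention in the table above."""
    try:
        datetime.strptime(value, _PATTERNS[date_format])
        return 1
    except (ValueError, KeyError):
        return 0
```

For example, a month of 13 fails the check, so `is_valid_date("20091332")` returns 0.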

Function: Is_valid_decimal (legacy decimal field [input], decimal format [input])

Return: ‘1’ = valid decimal, ‘0’ = invalid decimal

Where used: all input decimal fields.

Example of the function within the format validation transforms:

Is_valid_decimal(table.field1, ‘#####’||$G_Decimal_Separator||‘###’) = 1

Description: The standard function Is_valid_decimal checks whether the varchar value being passed in matches the decimal format specified in the second input parameter; if it does, the return is ‘1’, otherwise ‘0’.

Note: If the legacy application supplies the decimal from a database decimal field, this validation can be turned off and a straight mapping applied, as the decimal validation is already enforced by the source system.
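A simplified Python stand-in for this check (the real function's format-string semantics may differ; here only the separator character embedded in the format, e.g. ‘#####.###’, is honored):

```python
import re

def is_valid_decimal(value, fmt):
    """Return 1 if value is a run of digits optionally split once by the
    separator found in fmt (its non-'#' character), else 0."""
    separator = fmt.replace("#", "") or "."
    pattern = r"^-?\d+(" + re.escape(separator) + r"\d+)?$"
    return 1 if re.match(pattern, value) else 0
```

With format ‘#####.###’, the value ‘123,45’ fails because its separator does not match.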


6.1.2 Functions Used for Enrichment

Function: ENR_Decimal_Conversion (legacy input string [input], decimal separator [input])

Return: varchar string with the correct ‘.’ separator used to separate the decimal values.

Where used: all input decimal fields.

Example: ENR_Decimal_Conversion(Table.Field, ‘,’) converts the decimal string 123,45 to 123.45.

Description: Converts the legacy decimal field value from its varchar input format to the desired SAP format for the decimal (such as with a ‘.’ rather than a ‘,’ decimal separator).
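The conversion described above amounts to a separator swap; a minimal Python sketch (the delivered Data Services function may handle additional cases):

```python
def enr_decimal_conversion(value, decimal_separator):
    """Replace the legacy decimal separator with '.', matching the example
    above: '123,45' with separator ',' becomes '123.45'."""
    return value.replace(decimal_separator, ".")
```

For example, `enr_decimal_conversion("123,45", ",")` returns `"123.45"`.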