
Best Practices for Data Loading into SAP® ERP Systems

Summary
An organization’s ability to act swiftly and make business decisions is based on having access to complete and accurate
views of enterprise data. Companies recognize the value in leveraging all of their corporate information and the
benefits of reconciling data into consolidated views, but the path to get there can be difficult. Getting data into the
company’s ERP system in a timely and accurate manner is a critical success factor in business decisions. This white
paper examines existing data-loading methods and best practices to easily incorporate data into SAP solutions.

A white paper by:


Winshuttle, Inc.
Bothell, WA 98012, USA
+1 425.368.2708
www.winshuttle.com

© 2008 Winshuttle, Inc. All rights reserved. www.winshuttle.com


Introduction
It is a core business principle that to run a successful company, assets must be managed effectively. However, there is
an often-overlooked asset that, if correctly harnessed, can become a source of competitive advantage, particularly with
the increasing sophistication of enterprise-wide business management systems such as SAP ERP. This asset is the core
data produced in abundance by every company in the world. Corporate data sets include transactional data and master
data.

Fig. 1 - Data loading use cases: data migration, data maintenance, data integration, data creation

Managing large amounts of data can be a significant challenge to most organizations. Some of the common data
management tasks include:

• Data migration, e.g., loading legacy systems data into SAP applications during initial SAP implementation or M&A
activity
• Data maintenance, e.g., mass changes to SAP data for price changes and payroll changes
• Data integration, e.g., loading vendor invoices or bank statements into SAP applications
• Mass data creation, e.g., creating new master data (materials, vendors, customers, etc.) or transactional data
(journal vouchers, invoices, etc.) in SAP systems

A common theme among these data management applications is data loading – loading of data into SAP systems from
external files such as spreadsheets or other databases. IT departments and lines of business, with their limited
resources, struggle to complete these data-related tasks in a timely and accurate manner. However, these challenges
can be overcome if the best practices described in this white paper are followed at each stage of the data loading
process:

1. Planning the data load project
2. Developing templates and preparing the data
3. Running the data load process
4. Post data load activities

Fig. 2 - Stages of a data loading project: planning, developing, running, post data load

1. Planning the Data Load Project


Every data loading project should have a plan that makes quality and user acceptance the top priorities.
Preparation is key to the success of any operation, and data loading is no exception. Items to consider when
planning a data loading project include:

• Selecting the right tool for the job


For small projects affecting fewer than 50 transactions, manual data entry may be the best choice. However,
when uploading large amounts of data, such as employee records, pricing conditions, material masters,
purchase orders, or customer invoices, manually keying in the data is resource-intensive, time-consuming, and
stressful for data entry personnel and IT support teams. In addition, the manual entry of data increases the risk
of errors, thereby increasing the total cost of ownership (TCO) of SAP solutions.

- Custom programming
One alternative to the manual entry of data into an SAP system is to write custom ABAP™ programs. Many
companies have developed custom programs for very large data loading tasks that will remain static. These
programs can streamline the repetitive entry of hundreds of thousands to millions of records. However,
creating robust programs involves multiple iterations of requirements gathering, programming, testing,
documentation, transporting, and refinement, and these programs may be used only once or twice a year,
making the effort costly in development hours and hard to justify in terms of ROI. Using a program that was
hastily put together, or one that has not been well tested, can damage or destroy data.



- SAP-provided tools
If technical resources are not a constraint, one approach to creating and changing master and transactional data
is to use the data-importing tools already resident in SAP applications, such as Batch Data Communication (BDC)
and the Legacy System Migration Workbench (LSMW). LSMW can be a very effective way of creating data in a new
SAP implementation. However, SAP technical tools are intended to
be used by technical resources in IT departments, rather than everyday business users. In addition, any data
import scripts and programs added to the SAP system need to be maintained as SAP versions are upgraded
even if they were created as one-time-use tools. Finally, your business users require extra authorizations to
perform uploads using SAP-provided tools.

- SAP-certified third-party tools


Third-party tools such as Winshuttle’s transactionSHUTTLE do not reside inside the SAP system,
so they generally do not require extra authorizations and can be used by everyday business users.
transactionSHUTTLE simplifies the data loading process because it requires neither programming nor
dedicated technical resources. To your SAP systems, data loaded through such a third-party tool looks
exactly the same as manually entered data, only entered at much greater speed.

• Uploading to SAP via the Correct Interface


Data should never be uploaded directly to SAP tables. Writing directly to SAP tables circumvents the data
validation provided by normal SAP transactions. Always upload data via pre-configured SAP transactions, BAPIs
or IDocs, and use tools such as BDC, CATT, LSMW, or a qualified third-party tool. This maintains the validations
configured in each transaction.

• Ensuring Regulatory Compliance


One of the most common SOX audit concerns is that users in IT departments have very broad access to
production data in SAP systems. Data uploads should be carried out by data owners who are authorized to
perform them. Ensure that rights and duties are segregated among different individuals so that no single
individual has the power to divert business or transactions fraudulently.

• Sharing Templates
There is no need to re-invent the wheel for each transaction. Reuse templates developed for previous
uploads and share them among departments. Keeping a repository of templates, in SharePoint or a shared
file repository, along with a common naming convention, will enable users to easily find a previously created
template.

• Planning for Iterations


Due to changes in requirements and errors in data, data loading projects are iterative in nature. Plan to create
several scripts and allot time for iterations.

2. Developing Templates and Preparing the Data


The following best practices should be considered while developing the data loading scripts and templates:

• Develop and test on non-production systems


Testing data loading templates thoroughly on a non-production system allows you to detect and correct
systematic errors before moving the data into production.

• Use a peer-review process


To verify their accuracy and performance, any scripts or templates developed should go through a peer-review
process. This process can range from a very simple method of “looking over one’s shoulder” to a complete
workflow-based review process.

• Use a versioning method to keep track of changes


Since data loading scripts tend to change as requirements change, use a version control system to keep track
of these changes. All document management and collaboration systems, such as Documentum or SharePoint,
offer easy-to-use version control features.



• Develop data validation methods in conjunction with basic load scripts

Validating data prior to loading it into the SAP system is a good way to avoid generating, and then
reprocessing, errors during the load. Data can be validated in two ways: 1) using Excel references and
drop-downs, simple validation rules can be built into the Excel data file itself so that invalid data is
never created in Excel; 2) building simple validation scripts that simulate entering the data into the
required SAP fields but do not post or save the data.
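The second approach, a validation script that simulates entry without posting anything, can be sketched in plain Python. The field names, required-field list, and allowed plant codes below are hypothetical illustrations; the real rules depend on the SAP transaction being loaded.

```python
# Sketch of pre-load validation. REQUIRED_FIELDS and ALLOWED_PLANTS are
# illustrative assumptions, not actual SAP metadata.
REQUIRED_FIELDS = ["Material", "Plant", "Price"]
ALLOWED_PLANTS = {"1000", "2000"}

def validate_row(row):
    """Return a list of error messages for one data record (a dict)."""
    errors = []
    for field in REQUIRED_FIELDS:
        if not str(row.get(field, "")).strip():
            errors.append(f"missing required field: {field}")
    plant = row.get("Plant")
    if plant and plant not in ALLOWED_PLANTS:
        errors.append(f"unknown plant: {plant}")
    return errors

def validate_file(rows):
    """Check every record before any of them is posted; nothing is saved."""
    failures = {}
    for i, row in enumerate(rows):
        errors = validate_row(row)
        if errors:
            failures[i] = errors
    return failures
```

Running such a check over the whole file before the load starts means systematic errors surface once, up front, rather than one record at a time during the load.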

• Create an Excel template with an embedded loading script


A data loading script is usually associated with an Excel file format since Excel fields are typically mapped to
SAP fields within the data loading script. It is significantly easier to manage a single file than two linked files;
therefore, embed the script within the Excel file template so that only one file needs to be managed.

The following best practices should be considered while preparing the data files to load:

• Keep the data in native format


Minimize the number of transformations that the data has to go through. The Excel files that you need to
load into your SAP system may need to undergo some format conversions or field manipulations before the
loading program will accept these files. Some data loading programs require a CSV or tab-delimited format,
forcing you to transform the native Excel file into a CSV file before every run. Ideally, your
data loading program should allow you to work with your native Excel file format. This will avoid the extra
conversion steps and will also allow you to use Excel formulas, such as VLOOKUP, and Excel formatting directly.

• Use data cleansing tools


To avoid loading poor-quality data into SAP systems, use data cleansing tools to remove duplicate records,
fill in missing values, and so on. Excel 2007 includes some data quality tools, such as duplicate detection,
and several third-party data cleansing tools, such as Trillium and DataFlux, are available on the market.
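When a dedicated cleansing tool is not at hand, the two basic operations mentioned above, removing duplicates and filling missing values, can be sketched in a few lines of Python. The key fields and default values are illustrative assumptions.

```python
def cleanse(rows, key_fields, defaults):
    """Drop duplicate records (judged by key_fields; the first occurrence
    wins) and fill empty values from the defaults dict."""
    seen = set()
    cleaned = []
    for row in rows:
        key = tuple(row.get(f, "") for f in key_fields)
        if key in seen:
            continue  # duplicate of an earlier record: skip it
        seen.add(key)
        non_empty = {k: v for k, v in row.items() if str(v).strip()}
        cleaned.append({**defaults, **non_empty})
    return cleaned
```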

• Use a data review process


To verify the accuracy of the data prior to loading it, the prepared data files should go through a peer-review
process. As before, this process can range from “looking over one’s shoulder” to a complete workflow-based
review process.

3. Running the Data Load Process


When running data loading scripts, consider these best practices:

• Data ownership and login credentials


Make sure that data loading is carried out by users who own the data and who log in with their own SAP
credentials. This ensures regulatory compliance and keeps correct audit trails in the SAP system. It also
ensures that the data goes through the correct validation rules before being posted to the SAP system.
Additionally, empowering business users to do the upload themselves, using SAP tools or third-party tools, saves
time and money in data loading projects and frees up IT resources.

• Error handling
Any data loading project is sure to have records that will not be accepted by SAP applications due to errors,
so it is important to make sure you are prepared to process such errors. Ideally, the data loading tool you use
will show you the SAP error messages alongside the data records. This will make it easy to identify the cause of
those errors, allowing you to correct the errors in the data file and reload only the error records. If you
cannot identify the cause of an error, the recommended method is to process that record in foreground
(step-by-step) mode.
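The pattern described above, keeping the SAP message next to each record and re-extracting only the failed rows for correction and reload, might be sketched like this. The (ok, message) pairs stand in for results that would come back from the actual posting step.

```python
def split_results(rows, results):
    """Attach each posting result to its record and collect the failures.

    `results` is a parallel list of (ok, message) pairs; the message is
    written into a log column next to the data, and only the failed rows
    are returned for correction and reload.
    """
    retry = []
    for row, (ok, message) in zip(rows, results):
        row["SAP Message"] = message
        if not ok:
            retry.append(row)
    return retry
```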



• Log each run separately
Since you may have to run a single data file multiple times while processing the errors, it is recommended that
you log each run separately. Either keep a copy of the intermediate run files or log the results of each run in a
separate column.
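Logging each run in its own column, so that the results of earlier runs are never overwritten, can be sketched as follows (the column naming scheme is an assumption):

```python
def log_run(rows, messages, run_number):
    """Write the outcome of one run into a run-specific column,
    leaving the log columns of all previous runs untouched."""
    column = f"Run {run_number} Log"
    for row, message in zip(rows, messages):
        row[column] = message
    return rows
```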

• Prevent double posting of transactions


Another risk of multiple runs through a data file is that some transactional data may end up being posted twice
into the SAP system. Make sure you prevent such double postings by keeping track of the successfully posted
records separately, or by ensuring that your data loading tool prevents double posting.
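Tracking successfully posted keys separately, so that a re-run skips them, might look like this sketch; `post_fn` is a placeholder for the real upload call, which is tool-specific.

```python
def post_all(rows, key_field, post_fn, posted_keys):
    """Post each record at most once across multiple runs.

    `posted_keys` is the set of keys already posted in earlier runs
    (persist it between runs, e.g. in a file); `post_fn` is a stand-in
    for the actual posting call and returns True on success.
    """
    for row in rows:
        key = row[key_field]
        if key in posted_keys:
            continue  # already posted in a previous run: never post twice
        if post_fn(row):
            posted_keys.add(key)
    return posted_keys
```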

• Avoid data loads during high demand, peak-business hours


Some data loads, especially changes to master data, may affect other users who are working on the system.
Also, a large upload may affect the SAP system performance. Your data loading tool should provide a
scheduling feature that will allow you to schedule large uploads during off-peak hours.

• Develop a process to back up old data


When a data loading script is intended to change existing data in your SAP system, make a backup of the old
data before effecting the change. A simple way to back up data is to read the current values of the fields
that you are trying to change in your SAP system and save them to a file. This backup will allow you to
revert to the original data in case of problems with the data loading script.
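Reading the current values and saving them to a file before making the change might look like the sketch below; `read_fn` stands in for whatever query or extraction script reads one record from the SAP system, which is system-specific.

```python
import csv

def backup_current_values(keys, fields, read_fn, path):
    """Save the current values of the fields about to be changed,
    one CSV row per record, so the change can be reverted if needed."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["key"] + fields)
        writer.writeheader()
        for key in keys:
            current = read_fn(key)  # current values from the SAP system
            writer.writerow({"key": key,
                             **{fld: current.get(fld, "") for fld in fields}})
```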

4. Post Data Load Activities


The following best practices should be considered after completing a successful data load run:

• Archive the data file and the associated scripts


After a data file has been successfully processed, archive the file, together with its log messages, in your
document management system. This ensures complete traceability of data for regulatory compliance purposes.

• Share templates
The data loading scripts and templates that you have developed should be kept in a common repository so that
they can be shared with other users within your organization.

Conclusion
Applying the best practices discussed in this white paper will make data loading faster, less error-prone, and
compliant, regardless of whether the data sets are large or small, and whether the data is master or transactional.
Training business users to apply good processes, and enabling them with the right tools to upload data, improves
the decisions a company makes by keeping its data up to date. This improved accuracy can be a significant
competitive advantage and, combined with the operational flexibility that the right tools and practices give
users, can be instrumental in creating revenue growth.

Winshuttle, Inc. is the leading provider of data loading and extraction tools for SAP users worldwide. Winshuttle’s products replace
manual data entry and complex technical tools by easily and securely shuttling data between Microsoft Excel and SAP solutions.
Winshuttle’s flagship product, transactionSHUTTLE, effectively bridges the needs of business users with the governance requirements
of IT. transactionSHUTTLE transforms business processes for both implementation and post production projects, including data
migration, data maintenance, data integration and data creation. Managing SAP data has never been simpler for the hundreds of
Global 2000 companies that rely on Winshuttle. Headquartered in Bothell, Washington, Winshuttle has offices in the United Kingdom,
France and India. For more information visit http://www.winshuttle.com, email info@winshuttle.com or call 1-800-711-9798.

Corporate Headquarters: 18323 Bothell Everett Hwy, Suite 110, Bothell, WA 98012; Tel +1 (800) 711-9798; Fax +1 (425) 527-6666
United Kingdom: 64 Kimber Road, Southfields, London SW18 4PP, U.K.; Tel +44 (0) 208-704-4170; Fax +44 208 711 2665
France: 58, rue Delalain, 94700 Maisons-Alfort; Tel +33 (0) 1 48 93 71 71; Fax +33 (0) 1 43 68 37 68
India: Third Floor, Tower D, DLF Building, Technology Park, Chandigarh 160101, India

