
Data migration, as we have discussed over the past few weeks in Data Primer, Data Design, Data Extraction, Data Transformation, Data Loading, and Data Verification, is the process of transferring data from legacy (non-SAP) systems to SAP systems using ETL applications and tools, based on business rules defined or derived from the future business operations and transactions.

Legacy system

The legacy system is the source system that carries the current business operations within the defined scope and business processes. It may have limitations in interacting with new applications, interfaces, and domains, and in meeting the organization's growing needs.

Target system

The target system is defined to meet the future needs of the organization's operations and to handle complex applications and interfaces more efficiently.

A data migration is typically a large undertaking, and proper planning is key. Here are some common issues and mistakes that can affect your time, effort, and budget.

Lack of understanding of the legacy systems and their business processes/operations

  • Data migration is not merely moving or transferring data from one system to another, or converting A to B. One has to understand the business relevance and functionality as it exists in the legacy system, and find ways to reflect the same in the target system without losing business functionality within the target system's defined scope.
  • In most scenarios, the migration team or a vendor expects the legacy team to define or provide detailed requirements without educating them about the target business process requirements and the gaps between the source and target systems. Many wrong assumptions, misinterpretations, and gaps can be avoided by educating the legacy team.

Delay in following up and finalizing business, functional and technical specifications

  • It’s always challenging to freeze business requirements and changes, but a clear vision, strategy, timely planning, execution, and change management help to control the changing dynamics.
  • Gathering requirements, conducting workshops, and preparing action item/issue lists doesn’t finalize the business processes and rules when there is no follow-up to connect the missing pieces and resources are not made accountable and responsible.

Getting ready for early development without environment planning and design

  • Bringing the development team onto the project too early is a common mistake made by many vendors. In past projects we have seen a lack of proper design, modeling, architecture, and system landscape planning.
  • It is necessary to finalize the scope and object inventory for data migration: determine which objects are master data and which are transactional data, as well as their dependencies and load sequence.
  • Establish the environments for DEV, QA, and PROD, along with their networks, firewalls, and ETL applications/tools.
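Working out dependencies and load sequence up front can be automated. The sketch below, a minimal illustration in Python using the standard library's `graphlib`, derives a valid load order from a dependency map; the object names are examples, not a fixed SAP inventory:

```python
from graphlib import TopologicalSorter

# Illustrative dependency map: each migration object lists the objects
# that must already be loaded before it can be loaded.
dependencies = {
    "vendors": set(),
    "materials": set(),
    "purchase_info_records": {"vendors", "materials"},
    "purchase_orders": {"vendors", "materials", "purchase_info_records"},
    "open_ar_ap_items": {"vendors"},
}

def load_sequence(deps):
    """Return a valid load order: master data first, transactions after."""
    return list(TopologicalSorter(deps).static_order())
```

Any valid ordering it returns guarantees that master data objects land before the transactional objects that reference them.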

Missing components in Data Quality

  • Data quality goes beyond profiling, address validation, and format checks; it also includes data referential integrity. Missing customer, vendor, and material master data in the main source has a direct impact on the transactions that reference it.
  • For example, you cannot maintain a purchase info record without a vendor. Because of this, 15% to 20% of transactional data can end up excluded from the actual data load.
  • De-duping and consolidation: legacy systems may maintain more than one sequence number along with additional keys. These may need to be parsed and split into one or more columns when passed to the target, based on the defined target structures. Moreover, the transactions associated with these numbers should be consolidated; too often these rows are not tied together carefully, or no reference to the relationship was ever maintained, and variances appear in counts, data, or dollar amounts.
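A referential integrity check like the one described above can be run in staging before the load. This is a minimal sketch with hypothetical field names and sample data, not a production implementation:

```python
def find_orphans(transactions, master_keys, key_field):
    """Split transactional records into loadable rows and orphans whose
    master-data key (e.g. vendor number) is missing from the extract."""
    loadable, orphans = [], []
    for row in transactions:
        (loadable if row[key_field] in master_keys else orphans).append(row)
    return loadable, orphans

# Hypothetical extracts: two vendors exist, one info record references a third.
vendors = {"V001", "V002"}
info_records = [
    {"vendor": "V001", "material": "M10"},
    {"vendor": "V003", "material": "M11"},  # orphan: vendor never extracted
]
ok, dropped = find_orphans(info_records, vendors, "vendor")
```

Reporting the `dropped` rows back to the object owners, rather than silently skipping them, is what keeps the excluded 15-20% visible and fixable.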

Invalid Cross Reference Data Conversion and Missing SAP configuration

  • A cross reference (XREF) is built or derived for certain key columns of master or transactional data objects; it maps source values to target values (apples to oranges). These XREF files are placed either on a shared network/FTP location or in ETL staging, and they often are not updated in time as business change requests come in. This results in data issues when performing business transactions.
  • For example, a material is supposed to be extended to 5 plants, yet an XREF is maintained for only 3 of them. The result is that any business transaction related to the other 2 plants will fail.
  • The other common missing scenario is SAP configuration, such as material groups, payment terms, plants, storage locations, and purchasing organizations.
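The plant example above can be caught mechanically if the XREF step reports missing mappings instead of silently passing legacy values through. A minimal sketch, with an invented plant XREF and field names:

```python
def apply_xref(rows, xref, source_field, target_field):
    """Map legacy values to target values via an XREF table; collect rows
    whose mapping is missing instead of letting them through unconverted."""
    converted, missing = [], []
    for row in rows:
        legacy = row[source_field]
        if legacy in xref:
            converted.append({**row, target_field: xref[legacy]})
        else:
            missing.append(row)
    return converted, missing

# Hypothetical scenario: material extended to 5 plants, XREF covers only 3.
plant_xref = {"P1": "1000", "P2": "2000", "P3": "3000"}
extensions = [{"material": "M10", "plant": p}
              for p in ["P1", "P2", "P3", "P4", "P5"]]
ok, gaps = apply_xref(extensions, plant_xref, "plant", "target_plant")
```

The two rows in `gaps` are exactly the plants whose transactions would have failed after go-live.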

Lack of thorough data validation

  • Data validation does not only mean verifying that the data is correctly converted and migrated from the source to the target system. It is also necessary to verify and validate the business functionality and processes that use that data. You may have business-required and mandatory values to migrate to the target system, and the target system may require additional columns or fields that were never part of the legacy system.
Scenario:

Data validation: Payment terms

If a legacy vendor has payment terms of net 45 days, then as part of data validation, verify that the same net 45 days payment term is populated in the target system.

Business data validation: Payment terms

For any vendor in SAP, the purchasing view and the accounting view should have the same payment term. As part of business data validation, verify that both views carry the same payment term.

Business process validation

While posting AR/AP, if the due date is calculated based on the payment term, verify that the due date is populated correctly.
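The three payment-term scenarios above can be expressed as small automated checks. This is an illustrative sketch; the field names, term codes, and term-to-days mapping are assumptions, not SAP table structures:

```python
from datetime import date, timedelta

# Hypothetical mapping of payment term code to net days.
TERM_DAYS = {"NT30": 30, "NT45": 45}

def validate_payment_terms(legacy, target_purchasing, target_accounting):
    """Data validation + business data validation for one vendor."""
    issues = []
    # 1. Data validation: the legacy term arrived in the target system.
    if legacy["payment_term"] != target_purchasing["payment_term"]:
        issues.append("legacy term not migrated")
    # 2. Business data validation: purchasing and accounting views agree.
    if target_purchasing["payment_term"] != target_accounting["payment_term"]:
        issues.append("views disagree")
    return issues

def check_due_date(invoice_date, term_code, posted_due_date):
    """3. Business process validation: due date follows from the term."""
    expected = invoice_date + timedelta(days=TERM_DAYS[term_code])
    return posted_due_date == expected
```

Running checks like these across every migrated vendor, rather than spot-checking a handful, is what separates field-level validation from business validation.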

Poor Performance and load balancing estimations

  • Limited expertise in database sizing and data volumes, plus inaccurately defined system/network connection parameters, firewalls, ports, and landscape, leads to severe performance issues during data extraction, staging, and loading into the target systems.
  • Define appropriate hardware/software resources based on the data volumes.
  • Define batch jobs to clear system, application, and data logs in a timely manner.
  • Enable parallel processing and provision additional CPUs.
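On the parallel-processing point, batching a large extract and loading the batches concurrently is a common pattern. A minimal sketch, where `load_batch` stands in for whatever real load call the project uses:

```python
from concurrent.futures import ThreadPoolExecutor

def chunk(rows, size):
    """Split a large extract into fixed-size batches for parallel loading."""
    return [rows[i:i + size] for i in range(0, len(rows), size)]

def load_batch(batch):
    # Placeholder for the real load step (BAPI call, IDoc, staging insert).
    return len(batch)

def parallel_load(rows, batch_size=1000, workers=4):
    """Load batches concurrently and return the total record count."""
    batches = chunk(rows, batch_size)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(load_batch, batches))
```

Batch size and worker count should be tuned against the actual data volumes and target-system throughput, which is precisely the estimation work this section says gets skipped.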

Missing Cut-Over activities

  • In the process of moving from the DEV to QA and then to PROD environments, the cutover activities must account for additional dependencies and ad hoc changes made on the fly. All teams need to verify their pre-migration and migration activities.
  • For example, without maintaining the profit center and its hierarchy, we cannot expect to load any materials or vendors.

Poor visibility of Errors and Handling reconciliation

  • Fallouts occur throughout extraction, staging, and loading. Capturing all of these errors, giving object owners visibility into them, and driving timely resolution will certainly help avoid repeating the same error again and again.
  • When these fallouts and variances are tracked tightly, it becomes easy to tie out the object counts and financial balances (amounts).
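The tie-out described above reduces to comparing counts and totals between source and target and explaining any variance via the logged fallouts. A minimal sketch, with an assumed `amount` field:

```python
def reconcile(source_rows, target_rows, amount_field="amount"):
    """Tie out record counts and monetary totals between source and target;
    any non-zero variance should be explained by the logged fallouts."""
    src_amt = sum(r[amount_field] for r in source_rows)
    tgt_amt = sum(r[amount_field] for r in target_rows)
    return {
        "count_variance": len(source_rows) - len(target_rows),
        "amount_variance": round(src_amt - tgt_amt, 2),
    }
```

If the variance does not equal the sum of the captured fallouts, records were lost somewhere without being logged, which is the failure mode this section warns about.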

Business process integration

  • Defining all the business processes to be tested, in sequence and on a priority basis, is another important aspect after the data load, as part of post-migration and support activities. Common gaps include:
  • Lack of well-defined test processes and scripts
  • Failure to define the prerequisites for inbound/outbound interface testing
  • Inexperienced or untrained power users performing acceptance testing