
By Harshvardhan Rathore

“If you don’t have time to do it right, when will you have time to do it over?” (John Wooden)

When a customer decides to move to a new platform, they need to migrate all the functional data from their legacy system. Salesforce.com data migration has its own unique process, with its own set of complexities. Whatever the specific nature of a data migration, the ultimate aim is to improve corporate performance and deliver a competitive advantage.

Salesforce data migration is the very first thing you must do before go-live in order to bring all the existing data from your legacy system into the Salesforce Cloud without losing any of it.

Migration is a one-time activity; once the data is loaded into the Salesforce org, the migration task is complete. So it is very important to avoid any kind of mistake that could cascade into having to do the whole process again, or even risk jeopardizing customers’ sensitive data.

To avoid these issues, here are some best practices for preparing for an efficient data migration:

Identify required fields

The first step is to list the fields involved in the migration process:

  • Required fields
  • Optional fields
  • System-generated fields

Then identify any additional fields that might be required, like:

  • Legacy IDs
  • Business rules
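To make this inventory concrete, here is a minimal sketch in Python. The object, field names, and categories are illustrative assumptions, not taken from a real org:

```python
# Hypothetical field inventory for the Account object.
# Legacy_Id__c is a custom field assumed to hold the legacy system's ID.
account_fields = {
    "required": ["Name", "OwnerId"],
    "optional": ["Phone", "Industry", "BillingCity"],
    "system_generated": ["Id", "CreatedDate", "LastModifiedDate"],
    "additional": ["Legacy_Id__c"],
}

def migratable_fields(inventory):
    """Fields the migration must populate: everything except
    system-generated fields, which Salesforce assigns on insert."""
    return [
        field
        for category, fields in inventory.items()
        if category != "system_generated"
        for field in fields
    ]

print(migratable_fields(account_fields))
```

Separating the categories up front makes it obvious which fields the load files must supply and which ones Salesforce will fill in for you.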

Determine the order of migration

In Salesforce, the relationships and dependencies between objects dictate the order of migration. For example, all accounts have owners, and opportunities are associated with an account. In this case, the order would be:

  • Load users
  • Load accounts
  • Load opportunities

In a Salesforce application, relationships are expressed through related lists and lookups, while in a database they are created through IDs (foreign keys).
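The load order above is really a topological sort of the object dependency graph, which can be sketched with Python's standard library. The dependency map here is an illustrative assumption; a real org will have many more objects and relationships:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Each object maps to the set of objects it looks up to.
dependencies = {
    "User": set(),
    "Account": {"User"},          # Account.OwnerId -> User
    "Opportunity": {"Account"},   # Opportunity.AccountId -> Account
    "Contact": {"Account"},       # Contact.AccountId -> Account
}

# static_order() yields each object only after all of its
# dependencies, giving a safe load sequence.
load_order = list(TopologicalSorter(dependencies).static_order())
print(load_order)
```

If the graph ever contains a cycle (for example, two objects that look up to each other), `TopologicalSorter` raises an error, which is itself a useful early warning that the migration needs a two-pass load for those objects.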

Data Migration Workbook

Create and follow a data migration workbook throughout the scope of migration. This is a consolidated workbook that holds the data mapping for each object involved in the process: a single template with multiple tabs (one for each mapped object), including a data migration checklist and storage requirements. The workbook can be personalized based on your own business requirements.
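One mapping tab of such a workbook can be sketched as a list of source-to-target rows. The column names, field names, and transform notes here are hypothetical:

```python
# One illustrative "mapping tab" for the Account object:
# each row maps a legacy column to a Salesforce field.
account_mapping = [
    {"source": "acct_name", "target": "Name",         "transform": None},
    {"source": "phone_num", "target": "Phone",        "transform": "strip non-digits"},
    {"source": "legacy_pk", "target": "Legacy_Id__c", "transform": None},
]

def apply_mapping(row, mapping):
    """Translate one legacy record into Salesforce field names."""
    return {m["target"]: row.get(m["source"]) for m in mapping}

legacy_row = {"acct_name": "Acme Corp", "phone_num": "555-0100", "legacy_pk": "A-42"}
print(apply_mapping(legacy_row, account_mapping))
# {'Name': 'Acme Corp', 'Phone': '555-0100', 'Legacy_Id__c': 'A-42'}
```

Keeping the mapping as data rather than hard-coded logic means the workbook stays the single source of truth, and the load scripts simply read from it.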

Pre-data migration considerations:

  • Create and set up a user with a system administrator profile for data migration.
  • Complete system configuration.
  • Set up roles/profiles.
  • Be sure to store all possible legacy IDs for a record in Salesforce. (This can help with troubleshooting later on.)
  • Confirm that record types and picklist values are defined.
  • Set up every product/currency combination in the pricebooks that will be used in Salesforce. (These combinations must be loaded into the standard pricebook first.)
  • Define the field mapping between the source system and Salesforce.

Data load considerations:

  • Clean and optimize your data before loading. It’s always good practice to standardize, clean, de-dupe and validate source data prior to migration.
  • Use the Bulk API for better throughput when working with large data volumes; it can significantly increase load speed.
  • Disable and defer what you can. When you know your data is clean, you can safely disable the processes that normally protect against data entry errors, whether in batch loads or during users' daily operations. All of these operations can add substantial time to inserts, complex triggers in particular, and they are the first things to investigate when you debug a slow load.
  • When loading large data volumes, sharing calculations can take a considerable amount of time. You can often improve load performance by deferring sharing calculations until after the load is complete.
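As a minimal sketch of the first point, here is one way to standardize a key field and drop duplicates before loading. The rules and field names are illustrative; real cleansing depends entirely on your source data:

```python
def clean_and_dedupe(records, key="email"):
    """Standardize one key field and drop blank or duplicate records.
    A minimal sketch: real-world rules will cover many more fields."""
    seen, cleaned = set(), []
    for rec in records:
        value = (rec.get(key) or "").strip().lower()
        if not value or value in seen:
            continue  # skip blanks and duplicates
        seen.add(value)
        cleaned.append({**rec, key: value})
    return cleaned

source = [
    {"email": "Amy@Example.com ", "name": "Amy"},
    {"email": "amy@example.com", "name": "Amy B."},  # duplicate after normalization
    {"email": "", "name": "No Email"},               # blank key, dropped
]
print(clean_and_dedupe(source))
```

Running this kind of pass before the load, rather than relying on duplicate rules inside Salesforce, keeps the batch inserts fast and the error files short.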

Some additional rules for migration:

  • Clearly define the scope of the project.
  • Whoever builds the migration process must understand both the source data format and the data format Salesforce requires in the target.
  • Your migration process must be able to identify failed and successful records. The common approach is to add an extra column to the source table that stores the target table's unique ID. That way, if some records fail on the first iteration, you can re-execute the process and it will pick up only the failed records that have not yet been migrated.
  • Actively refine the scope of the project through targeted profiling and auditing.
  • Minimize the amount of data to be migrated.
  • Profile and audit all source data in the scope before writing mapping specifications.
  • Define a realistic project budget and timeline based on knowledge of data issues.
  • Aim to volume-test all data in the scope as early as possible at the unit level.
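The extra-column approach for tracking failed records can be sketched as follows; the column and function names are hypothetical:

```python
# The source table carries a target_id column that is filled in
# only after a record loads successfully into Salesforce.
def pending_records(source_rows):
    """Return rows that have not yet been migrated (no target ID)."""
    return [r for r in source_rows if not r.get("target_id")]

def record_success(source_rows, results):
    """results maps a source primary key to the new Salesforce ID
    returned by the load; failed records simply stay unmapped."""
    for row in source_rows:
        sf_id = results.get(row["legacy_pk"])
        if sf_id:
            row["target_id"] = sf_id

rows = [
    {"legacy_pk": "A-1", "target_id": None},
    {"legacy_pk": "A-2", "target_id": None},
]
record_success(rows, {"A-1": "001xx0000001"})  # A-2 failed on the first pass
print([r["legacy_pk"] for r in pending_records(rows)])  # ['A-2']
```

Because each re-run starts from `pending_records`, the process is idempotent: already-migrated rows are never inserted twice, no matter how many iterations it takes.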

These best practices can help you prepare for and execute a successful large-volume Salesforce data migration.

Join the Conversation


  1. Nice read Harsh. Great points about data migration.

    A few additional points that I want to mention:

    1. Audit field considerations
    2. Email deliverability – setting this to "No Emails" can prevent emails being generated during the data load by active workflows, triggers, or Process Builder processes.
    3. Upsert and external IDs – use a combination of upsert and an external ID for relationships (lookups and master-detail).
