Salesforce Org-to-Org Data Migration: Complete Guide with Tools and Best Practices
TL;DR
- Choose the right tool: Jitterbit Data Loader offers cloud storage, credential management, and scheduling
- Plan import order: Generate data models to understand object relationships before migration
- Maintain relationships: Use external IDs or VLOOKUPs to preserve object references
- Handle complex objects: Create proxy objects for Notes, Tasks, Events, and Attachments
- Automate where possible: Use Apex batch jobs for complex data transformations
What You'll Learn
- How to choose the best migration tool for Salesforce org-to-org transfers
- Step-by-step approach to planning migration order and maintaining data relationships
- Advanced techniques for migrating attachments and complex object types
- Production-ready code solutions and automation strategies
The Problem
Migrating data between Salesforce organizations is one of the most complex challenges in Salesforce administration. Organizations often need to consolidate multiple orgs, migrate from sandbox to production, or transfer data during mergers and acquisitions.
Common Questions This Article Answers:
- What's the best tool for Salesforce org-to-org data migration?
- How do I maintain object relationships during migration?
- What's the correct order for importing related objects?
- How do I migrate attachments and binary data between orgs?
Quick Answer
Successful Salesforce org migration requires four key steps:
- Tool Selection: Choose a migration tool with relationship mapping capabilities (Jitterbit recommended)
- Import Planning: Generate data models to determine correct import sequence
- Relationship Strategy: Use external IDs or VLOOKUPs to maintain object references
- Complex Object Handling: Implement proxy objects for Notes, Tasks, Events, and Attachments
The key is thorough planning before any data movement begins.
Comprehensive Migration Strategy
1. Choosing the Right Migration Tool
Why Jitterbit Data Loader is Recommended
Key Advantages:
- Cloud-based storage: No local file management required
- Credential management: Save and reuse source and target org credentials
- Scheduling capabilities: Automate recurring migration tasks
- Data transformation: Built-in transformation options for field mapping
- Relationship handling: Advanced features for maintaining object relationships
Alternative Tools Comparison:
# Migration Tool Comparison
jitterbit:
  strengths: ["Cloud storage", "Scheduling", "Transformations"]
  best_for: "Complex migrations with relationships"
data_loader:
  strengths: ["Free", "Salesforce native", "Simple interface"]
  limitations: ["Manual process", "No relationship mapping"]
workbench:
  strengths: ["Web-based", "Quick exports"]
  limitations: ["Limited transformation", "Manual relationships"]
2. Planning Your Migration Order
Data Model Analysis Process
Step 1: Generate Relationship Map
-- Query field metadata to understand object relationships
SELECT
  EntityDefinition.QualifiedApiName,
  QualifiedApiName,
  DataType,
  ReferenceTargetField
FROM FieldDefinition
WHERE EntityDefinition.QualifiedApiName IN ('Account', 'Contact', 'Opportunity')
-- Note: DataType embeds the referenced object for lookups, e.g. 'Lookup(Account)',
-- so identify lookup fields in client code rather than filtering DataType = 'Lookup'
Step 2: Determine Import Sequence
# Recommended Import Order
migration_sequence:
  phase_1_foundation:
    - Users
    - RecordTypes
    - Custom Settings
  phase_2_master_data:
    - Accounts
    - Contacts
    - Products
  phase_3_transactional:
    - Opportunities
    - Cases
    - Orders
  phase_4_complex:
    - Notes
    - Tasks
    - Events
    - Attachments
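The phased sequence above is just a topological ordering of the object dependency graph. As an illustrative, tool-agnostic sketch (the object names and dependency edges below are assumptions for the example, not your org's actual schema), the same ordering can be derived programmatically from a parent-to-child dependency map:

```python
# Derive a safe import order from object dependencies via topological sort.
# depends_on[obj] = set of objects that must be loaded before obj.
from collections import deque

def import_order(depends_on):
    indegree = {obj: len(parents) for obj, parents in depends_on.items()}
    children = {obj: [] for obj in depends_on}
    for obj, parents in depends_on.items():
        for p in parents:
            children[p].append(obj)
    # Start with objects that have no prerequisites
    ready = deque(sorted(o for o, d in indegree.items() if d == 0))
    order = []
    while ready:
        obj = ready.popleft()
        order.append(obj)
        for child in sorted(children[obj]):
            indegree[child] -= 1
            if indegree[child] == 0:
                ready.append(child)
    if len(order) != len(depends_on):
        raise ValueError("Circular dependency detected")
    return order

# Hypothetical dependency edges for illustration only
deps = {
    "User": set(),
    "Account": {"User"},
    "Contact": {"Account"},
    "Opportunity": {"Account"},
    "Task": {"Contact", "Opportunity"},
}
print(import_order(deps))
```

Feeding your real lookup/master-detail map (e.g. from the FieldDefinition query above) into this kind of sort gives an import sequence that never loads a child before its parent.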
3. Maintaining Object Relationships
Strategy 1: External ID Approach (Recommended)
Implementation Steps:
- Add external ID fields to all objects in target org
- Populate external IDs with source org record IDs
- Use external ID references during import
Example External ID Setup:
// Create external ID field for Account object
// Field Name: Source_Org_ID__c
// Type: Text(18)
// External ID: Checked
// Unique: Checked
// During migration, map Source Account.Id -> Target Account.Source_Org_ID__c
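The mapping step above (source `Account.Id` into the target's `Source_Org_ID__c`) is a simple column transformation on the exported data. A minimal Python sketch, using the field names from the example (adjust for your schema):

```python
# Prepare exported source records for an external-ID upsert into the target
# org: drop the source Id column and copy its value into Source_Org_ID__c.
import csv
import io

def to_upsert_rows(source_records):
    rows = []
    for rec in source_records:
        row = {k: v for k, v in rec.items() if k != "Id"}
        row["Source_Org_ID__c"] = rec["Id"]  # preserve the source org record Id
        rows.append(row)
    return rows

# Hypothetical exported record for illustration
source = [{"Id": "0015g00000AbcDeAAB", "Name": "Acme", "Type": "Customer"}]
rows = to_upsert_rows(source)

# Write a CSV ready for an upsert keyed on Source_Org_ID__c
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["Name", "Type", "Source_Org_ID__c"])
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

Because the upsert is keyed on `Source_Org_ID__c`, re-running the same file is idempotent: already-migrated records are updated rather than duplicated.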
Strategy 2: VLOOKUP Approach
When to Use:
- Cannot modify target org schema
- One-time migration without ongoing sync
- Simple parent-child relationships
Implementation Example:
# Excel VLOOKUP for Contact.AccountId
=VLOOKUP(B2,AccountMapping!A:B,2,FALSE)
# Where:
# B2 = Source Account ID
# AccountMapping = Sheet with Source ID -> Target ID mapping
# A:B = Lookup range (Source ID in A, Target ID in B)
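The VLOOKUP above is just a dictionary lookup from source IDs to target IDs. For larger files it can be scripted instead of done in Excel; a sketch with made-up IDs, assuming the same source-to-target mapping as columns A and B:

```python
# Remap each Contact's source AccountId to the target org Id using a
# source-to-target mapping (the dict plays the role of AccountMapping!A:B).
def remap_account_ids(contacts, account_mapping):
    remapped, unmatched = [], []
    for c in contacts:
        target_id = account_mapping.get(c["AccountId"])
        if target_id is None:
            unmatched.append(c)  # equivalent of a VLOOKUP #N/A result
            continue
        remapped.append({**c, "AccountId": target_id})
    return remapped, unmatched

mapping = {"001SRC000001": "001TGT000001"}  # hypothetical Ids
contacts = [
    {"LastName": "Smith", "AccountId": "001SRC000001"},
    {"LastName": "Jones", "AccountId": "001SRC999999"},  # no match
]
ok, missing = remap_account_ids(contacts, mapping)
```

Collecting the unmatched rows separately (rather than letting them error out mid-import) makes it easy to fix mapping gaps before loading.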
4. Handling Complex Objects with Proxy Patterns
Proxy Object Strategy for Attachments
Problem: Direct attachment migration loses parent relationships
Solution: Create proxy objects to maintain references
Proxy Object Implementation:
// Custom object Attachment_Proxy__c holds attachment metadata plus the
// original parent Id so relationships can be re-established after import.
// Note: Body is queried below, so run this with a small batch scope to
// stay within heap limits.
public class AttachmentMigrationBatch implements Database.Batchable<sObject> {
    public Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator([
            SELECT Id, Name, Body, ParentId, ContentType
            FROM Attachment
            WHERE ParentId != null
        ]);
    }

    public void execute(Database.BatchableContext bc, List<Attachment> attachments) {
        List<Attachment_Proxy__c> proxies = new List<Attachment_Proxy__c>();
        for (Attachment att : attachments) {
            proxies.add(new Attachment_Proxy__c(
                Name = att.Name,
                Original_Parent_ID__c = att.ParentId,
                Content_Type__c = att.ContentType,
                File_Size__c = att.Body.size(),
                Migration_Status__c = 'Ready'
            ));
        }
        insert proxies;
    }

    public void finish(Database.BatchableContext bc) {
        // Trigger the attachment recreation process five minutes from now
        System.scheduleBatch(new AttachmentRecreationBatch(),
            'Recreate Attachments', 5);
    }
}
Jitterbit Script for Attachment Export
Binary Data Handling:
// Jitterbit transformation script for attachment export
$fn = root$transaction.response$body$queryResponse$result$records.Attachment$Name$;
WriteFile("Targets/Files/AttachmentFile",
Base64Decode(root$transaction.response$body$queryResponse$result$records.Attachment$Body$),
$fn);
FlushFile("Targets/Files/AttachmentFile");
// This script:
// 1. Extracts attachment filename
// 2. Decodes base64 body content
// 3. Writes to file system for bulk processing
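Outside Jitterbit, the same export step can be scripted against a CSV or JSON dump of queried attachments: decode each base64 `Body` and write it to disk under the original filename. A minimal sketch (the record shape and column names are assumptions for the example):

```python
# Decode base64 attachment bodies from an export and write them to files
# for bulk re-upload into the target org.
import base64
import os
import tempfile

def export_attachments(records, out_dir):
    written = []
    for rec in records:
        # basename() guards against path traversal in attachment names
        safe_name = os.path.basename(rec["Name"])
        path = os.path.join(out_dir, safe_name)
        with open(path, "wb") as f:
            f.write(base64.b64decode(rec["Body"]))
        written.append(path)
    return written

# Hypothetical exported record for illustration
records = [{"Name": "quote.txt", "Body": base64.b64encode(b"hello").decode()}]
out_dir = tempfile.mkdtemp()
paths = export_attachments(records, out_dir)
```

Keeping the returned path list alongside the proxy records makes it straightforward to match each file back to its `Original_Parent_ID__c` during recreation.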
5. Advanced Migration Patterns
Batch Processing for Large Datasets
Memory-Efficient Approach:
public class LargeDataMigrationBatch implements Database.Batchable<sObject> {
    private String query;
    private Integer batchSize; // pass to Database.executeBatch when launching

    public LargeDataMigrationBatch(String soqlQuery, Integer size) {
        this.query = soqlQuery;
        this.batchSize = size;
    }

    public Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator(query);
    }

    public void execute(Database.BatchableContext bc, List<sObject> records) {
        try {
            // Transform data for the target org (transformForTarget holds
            // your org-specific mapping logic)
            List<sObject> transformedRecords = transformForTarget(records);
            // Upsert on the external ID field to handle duplicates;
            // fields.getMap() already returns Schema.SObjectField tokens
            Schema.SObjectField externalIdField = Schema.getGlobalDescribe()
                .get('Target_Object__c')
                .getDescribe()
                .fields.getMap()
                .get('External_ID__c');
            Database.upsert(transformedRecords, externalIdField);
        } catch (Exception e) {
            // Log errors for retry processing (logMigrationError is your logger)
            logMigrationError(records, e);
        }
    }

    public void finish(Database.BatchableContext bc) {}
}
Data Validation and Quality Checks
Pre-Migration Validation:
// ValidationResult is a simple wrapper class holding the four result fields
// below; the private helpers are placeholders for org-specific checks
public class MigrationValidator {
    public static ValidationResult validateMigrationReadiness() {
        ValidationResult result = new ValidationResult();
        // Check required fields
        result.requiredFieldsComplete = checkRequiredFields();
        // Validate relationships
        result.relationshipsValid = validateObjectRelationships();
        // Check data quality
        result.dataQualityScore = calculateDataQuality();
        // Verify external ID uniqueness
        result.externalIdsUnique = checkExternalIdUniqueness();
        return result;
    }

    private static Boolean checkRequiredFields() {
        List<sObject> recordsWithMissingFields = [
            SELECT Id FROM Account
            WHERE Name = null OR Type = null
        ];
        return recordsWithMissingFields.isEmpty();
    }
}
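The same pre-migration checks can also run against the exported dataset itself, before anything touches the target org. A sketch over CSV-style rows, using the field names from the examples above (adjust required fields for your objects):

```python
# Validate exported rows before loading: flag rows missing required fields
# and duplicate external IDs that would break an upsert.
def validate_rows(rows, required_fields, external_id_field):
    missing = [r for r in rows
               if any(not r.get(f) for f in required_fields)]
    seen, dupes = set(), set()
    for r in rows:
        ext = r.get(external_id_field)
        if ext in seen:
            dupes.add(ext)
        seen.add(ext)
    return {"missing_required": missing,
            "duplicate_external_ids": sorted(dupes)}

# Hypothetical exported rows for illustration
rows = [
    {"Name": "Acme", "Type": "Customer", "Source_Org_ID__c": "001A"},
    {"Name": "", "Type": "Partner", "Source_Org_ID__c": "001B"},   # missing Name
    {"Name": "Globex", "Type": "Customer", "Source_Org_ID__c": "001A"},  # dupe
]
report = validate_rows(rows, ["Name", "Type"], "Source_Org_ID__c")
```

Running this on every extract catches the two failure modes that most often surface mid-load: missing required values and non-unique external IDs.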
Best Practices & Common Pitfalls
✅ Migration Best Practices
1. Always Test in Sandbox First
- Create full copy sandbox for migration testing
- Validate all relationships and data integrity
- Test rollback procedures
2. Document Everything
# Migration Documentation Template
migration_plan:
  objects: ["List of objects in order"]
  relationships: ["Mapping of lookup/master-detail fields"]
  transformations: ["Data transformation rules"]
  validation_rules: ["Post-migration validation criteria"]
  rollback_plan: ["Steps to reverse migration if needed"]
3. Monitor and Log Everything
// Comprehensive logging approach
public class MigrationLogger {
    public static void logMigrationStep(String step, Integer recordCount,
                                        String status, String details) {
        Migration_Log__c log = new Migration_Log__c(
            Step_Name__c = step,
            Record_Count__c = recordCount,
            Status__c = status,
            Details__c = details,
            Timestamp__c = DateTime.now()
        );
        insert log;
    }
}
❌ Common Migration Mistakes
1. Ignoring Governor Limits
- Problem: Attempting to migrate too much data at once
- Solution: Implement batch processing with appropriate batch sizes
2. Not Planning for Relationships
- Problem: Importing child records before parents
- Solution: Always create dependency mapping first
3. Skipping Data Validation
- Problem: Migrating corrupt or incomplete data
- Solution: Implement comprehensive validation before and after migration
Performance Optimization Strategies
Large-Scale Migration Optimization
1. Parallel Processing
// Execute multiple batch jobs for different object types
System.enqueueJob(new AccountMigrationQueueable());
System.enqueueJob(new ContactMigrationQueueable());
System.enqueueJob(new OpportunityMigrationQueueable());
2. Bulk API Utilization
// Use the Bulk API for large datasets (Java, via the Salesforce WSC client library)
BulkConnection bulkConnection = getBulkConnection();
JobInfo job = createJob("Account", "insert", bulkConnection);
List<BatchInfo> batchInfoList = createBatches(job, accountData);
3. Selective Field Migration
-- Only migrate necessary fields to reduce payload
SELECT Id, Name, Type, BillingStreet, BillingCity,
External_ID__c, Owner.External_ID__c
FROM Account
WHERE LastModifiedDate >= :migrationDate
Frequently Asked Questions
Q: What's the maximum number of records I can migrate at once?
A: Use batch sizes of 200-2000 records depending on object complexity. For simple objects like Accounts, 2000 is fine. For complex objects with many relationships, use 200-500. Always monitor API limits and processing time.
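The batching described above is a simple chunking step in whatever client drives the migration. A generic sketch (the record count and batch size are illustrative):

```python
# Split a record list into fixed-size batches before sending each to the API.
def batches(records, size):
    for i in range(0, len(records), size):
        yield records[i:i + size]

records = list(range(4500))           # stand-in for 4,500 exported records
chunks = list(batches(records, 2000))
sizes = [len(c) for c in chunks]      # three batches: 2000, 2000, 500
```

Dropping the batch size from 2000 to 200-500 for relationship-heavy objects only changes the `size` argument; the loading loop stays the same.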
Q: How do I handle duplicate records during migration?
A: Use external IDs with upsert operations. Create an external ID field in the target org, populate it with source org record IDs, then use upsert instead of insert. This automatically handles duplicates and allows for incremental migrations.
Q: Can I migrate Salesforce Knowledge articles between orgs?
A: Knowledge articles require special handling. Export articles as data using SOQL, migrate the Knowledge__kav records, then use the Knowledge API to publish articles in the target org. Consider using Salesforce's Data Export feature for Knowledge.
Q: How do I migrate custom metadata types?
A: Custom metadata types can be migrated using the Metadata API or packaged in managed/unmanaged packages. They cannot be migrated using standard data migration tools since they're metadata, not data.
Q: What's the best way to handle large file attachments?
A: For attachments larger than 25MB, consider using Salesforce Files (ContentDocument/ContentVersion) instead of the legacy Attachment object. Use the REST API for large file uploads and implement retry logic for failed transfers.
Key Takeaways
- Tool Selection Matters: Jitterbit provides the best balance of features for complex migrations
- Planning is Critical: Generate data models and relationship maps before starting any migration
- External IDs are Essential: Use external ID fields to maintain relationships across orgs
- Batch Processing Required: Implement proper batch processing for large datasets and governor limit compliance
- Test Everything: Always test in sandbox environments before production migration
What's Next?
Recommended Reading:
- Salesforce Data Import Best Practices - Advanced import strategies
- API Limits and Optimization - Managing governor limits during migration
- Attachment to Files Migration Guide - Modern file storage patterns
Action Items:
- Create a detailed migration plan using the templates provided
- Set up external ID fields in your target org
- Test migration process in sandbox environment
- Implement monitoring and logging for production migration
Resources & References
- Salesforce Data Loader Guide
- Jitterbit Data Loader Documentation
- Salesforce Bulk API Developer Guide
- Migration Best Practices Trailhead
About This Guide: This comprehensive guide provides production-ready strategies for Salesforce org-to-org data migration. Last updated January 2025 with modern migration patterns and API best practices.
Tags: #salesforce #datamigration #jitterbit #enterprise #bestpractices