# SLAM_load_bulk

## Overview
Provides the highest level of control over large data loads, including batch sizes and serial vs. parallel processing. Bulk v1 supports hardDelete. Note that Salesforce has specifically not validated Bulk v1 for use with custom address fields; use Bulk v2 in those situations.
**Performance Tip:** By default, Speediful enables parallel data loading with the Bulk v1 API. This parallelizes the loading operation on the Salesforce server and can dramatically reduce load times compared with serial loading. Use it judiciously, however: parallel loading can be detrimental to overall throughput when loading into objects with lengthy triggered business process flows, or into junction objects. To disable parallel loading, set @bulk_parallel = 0, or consider using SLAM_lockbuster.
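As a sketch of the serial fallback described above, a load into a junction object might disable parallel batches like this (the object and table names here are illustrative, not part of the product):

```sql
-- Serial load into a hypothetical junction object to avoid
-- parallel-batch lock contention on the parent records.
EXEC dbo.SLAM_load_bulk
    @sObject = 'Course_Enrollment__c',   -- illustrative junction object
    @table = 'Course_Enrollment_insert', -- illustrative staging table
    @operation = 'insert',
    @bulk_parallel = 0;                  -- serial mode
```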
## Parameters
| Name | Type | Default | Description |
|---|---|---|---|
| @sObject | NVARCHAR | Required | Salesforce object name to load data into |
| @table | NVARCHAR | Required | Input SQL table containing data to load |
| @operation | NVARCHAR | Required | Operation type ('insert', 'update', 'upsert', 'delete', 'hardDelete') |
| @bulk_parallel | BIT | 1 | Enable parallel batch processing when set to 1; serial mode when 0 |
| @batch_size | INT | 10000 | Number of records per batch (max 10000) |
| @externalId | NVARCHAR | NULL | External ID field name (required for upsert operations) |
| @failure_threshold | DECIMAL(5,2) | 100 | Allowable failure rate (0-100) before throwing a SQL error; when set to 100, no error is thrown |
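For instance, a load that should raise a SQL error once more than 5% of records fail might look like this (the object and table names are illustrative):

```sql
-- Update load that fails fast when the error rate exceeds 5%.
EXEC dbo.SLAM_load_bulk
    @sObject = 'Lead',             -- illustrative target object
    @table = 'Lead_update',        -- illustrative staging table
    @operation = 'update',
    @failure_threshold = 5.0;      -- throw a SQL error above 5% failures
```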
## Usage Examples
Insert new Account records with minimal parameters:
```sql
EXEC dbo.SLAM_load_bulk
    @sObject = 'Account',
    @table = 'Account_insert',
    @operation = 'insert';
```
Upsert Contact records using external ID with parallel processing:
```sql
EXEC dbo.SLAM_load_bulk
    @sObject = 'Contact',
    @table = 'Contact_upsert',
    @operation = 'upsert',
    @externalId = 'Contact_External_Id__c',
    @bulk_parallel = 1,
    @batch_size = 5000;
```
Skip the recycle bin with hardDelete:
```sql
EXEC dbo.SLAM_load_bulk
    @sObject = 'Task',
    @table = 'Task_hardDelete',
    @operation = 'hardDelete',
    @batch_size = 2500;
```