Salesforce Governor Limits and Bulkification: Writing Scalable Apex Code
2025-12-03
Salesforce’s multi-tenant architecture enforces governor limits to ensure fair resource distribution across all organizations. These limits prevent any single process from consuming too many resources, but they can catch developers off guard, leading to runtime exceptions in production.
Understanding governor limits and writing bulkified code is essential for building scalable Salesforce applications.
What Are Governor Limits?
Governor limits are runtime restrictions that Salesforce enforces on Apex code execution. They include:
- SOQL Queries: 100 synchronous, 200 asynchronous
- SOQL Query Rows: 50,000 (synchronous and asynchronous)
- DML Statements: 150 synchronous, 150 asynchronous
- DML Rows: 10,000 (synchronous and asynchronous)
- CPU Time: 10,000 ms synchronous, 60,000 ms asynchronous
- Heap Size: 6 MB synchronous, 12 MB asynchronous
- Callouts: 100 HTTP callouts per transaction
Critical Point: These limits apply per transaction, not per method. All code executed in a single transaction shares the same limit pool.
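You can watch this shared pool drain at runtime with the built-in Limits class; a minimal debugging sketch using only standard methods:
// Log current consumption against the transaction's shared limit pool
System.debug('SOQL queries: ' + Limits.getQueries() + ' / ' + Limits.getLimitQueries());
System.debug('DML statements: ' + Limits.getDmlStatements() + ' / ' + Limits.getLimitDmlStatements());
System.debug('CPU time (ms): ' + Limits.getCpuTime() + ' / ' + Limits.getLimitCpuTime());
System.debug('Heap size (bytes): ' + Limits.getHeapSize() + ' / ' + Limits.getLimitHeapSize());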
The Bulkification Problem
What is Bulkification?
Bulkification means writing code that efficiently handles multiple records at once. The platform delivers records to triggers in chunks of up to 200, so the same code must work whether it receives 1 record or 200 in a single transaction.
The Anti-Pattern: SOQL in Loops
The most common mistake is executing SOQL queries inside loops:
// ❌ BAD: Hits SOQL limit with just 101 records
trigger AccountTrigger on Account (before update) {
    for (Account acc : Trigger.new) {
        // This executes once per record!
        List<Contact> contacts = [SELECT Id, Name FROM Contact WHERE AccountId = :acc.Id];
        // Process contacts...
    }
}
Problem: If 101 accounts are updated in one transaction, this executes 101 SOQL queries, exceeding the 100-query limit.
The Anti-Pattern: DML in Loops
Similarly, DML operations inside loops are problematic:
// ❌ BAD: Hits DML limit with 151 records
trigger AccountTrigger on Account (after insert) {
    for (Account acc : Trigger.new) {
        Contact newContact = new Contact(
            FirstName = 'Default',
            LastName = 'Contact',
            AccountId = acc.Id
        );
        // This executes once per record!
        insert newContact; // 150 DML statement limit exceeded!
    }
}
Bulkification Solutions
1. Collect IDs, Query Once
Instead of querying inside a loop, collect all IDs first, then query once:
// ✅ GOOD: Bulkified - one query regardless of record count
trigger AccountTrigger on Account (before update) {
    Set<Id> accountIds = new Set<Id>();
    for (Account acc : Trigger.new) {
        accountIds.add(acc.Id);
    }

    // Single query for all accounts
    Map<Id, List<Contact>> accountContactsMap = new Map<Id, List<Contact>>();
    for (Contact con : [SELECT Id, Name, AccountId FROM Contact WHERE AccountId IN :accountIds]) {
        if (!accountContactsMap.containsKey(con.AccountId)) {
            accountContactsMap.put(con.AccountId, new List<Contact>());
        }
        accountContactsMap.get(con.AccountId).add(con);
    }

    // Process all records
    for (Account acc : Trigger.new) {
        List<Contact> contacts = accountContactsMap.get(acc.Id);
        if (contacts != null) {
            // Process contacts...
        }
    }
}
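In update and delete contexts the ID-collection loop can be dropped entirely, since the platform already hands you a map keyed by record Id:
// Trigger.newMap is a ready-made Map<Id, Account> in update contexts
Set<Id> accountIds = Trigger.newMap.keySet();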
2. Collect Records, DML Once
Collect all records to insert/update, then perform DML once:
// ✅ GOOD: Bulkified - one DML operation
trigger AccountTrigger on Account (after insert) {
    List<Contact> contactsToInsert = new List<Contact>();
    for (Account acc : Trigger.new) {
        contactsToInsert.add(new Contact(
            FirstName = 'Default',
            LastName = 'Contact',
            AccountId = acc.Id
        ));
    }

    // Single DML operation for all records
    if (!contactsToInsert.isEmpty()) {
        insert contactsToInsert;
    }
}
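When one bad record should not roll back the entire batch, the Database class offers a partial-success variant of DML; a sketch reusing contactsToInsert from the example above:
// allOrNone = false commits the valid records and reports
// failures per record instead of throwing an exception
List<Database.SaveResult> results = Database.insert(contactsToInsert, false);
for (Database.SaveResult sr : results) {
    if (!sr.isSuccess()) {
        for (Database.Error err : sr.getErrors()) {
            System.debug('Insert failed: ' + err.getMessage());
        }
    }
}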
3. Use Maps for Efficient Lookups
Maps provide O(1) lookup time and are essential for bulkification:
// ✅ GOOD: Using Map for efficient lookups
trigger OpportunityTrigger on Opportunity (before insert) {
    Set<Id> accountIds = new Set<Id>();
    for (Opportunity opp : Trigger.new) {
        if (opp.AccountId != null) {
            accountIds.add(opp.AccountId);
        }
    }

    // Query once, store in Map
    Map<Id, Account> accountMap = new Map<Id, Account>([
        SELECT Id, Name, Industry, AnnualRevenue
        FROM Account
        WHERE Id IN :accountIds
    ]);

    // Use Map for lookups (no additional queries)
    for (Opportunity opp : Trigger.new) {
        Account acc = accountMap.get(opp.AccountId);
        if (acc != null) {
            // Use account data without additional queries
            opp.Description = 'Account: ' + acc.Name + ', Industry: ' + acc.Industry;
        }
    }
}
Common Bulkification Patterns
Pattern 1: Parent-Child Relationship Queries
When you need related records for multiple parents:
// ✅ GOOD: Query all related records at once
public static void processAccountsWithContacts(List<Account> accounts) {
    Set<Id> accountIds = new Set<Id>();
    for (Account acc : accounts) {
        accountIds.add(acc.Id);
    }

    // Single query with parent-child relationship
    List<Account> accountsWithContacts = [
        SELECT Id, Name,
            (SELECT Id, Name, Email FROM Contacts)
        FROM Account
        WHERE Id IN :accountIds
    ];

    // Process all accounts and their contacts
    for (Account acc : accountsWithContacts) {
        // Access related contacts via acc.Contacts
        for (Contact con : acc.Contacts) {
            // Process contact...
        }
    }
}
Pattern 2: Aggregate Queries
When you need counts or sums for multiple records:
// ✅ GOOD: Aggregate query instead of counting in loop
public static void updateAccountOpportunityCounts(List<Account> accounts) {
    Set<Id> accountIds = new Set<Id>();
    for (Account acc : accounts) {
        accountIds.add(acc.Id);
    }

    // Single aggregate query
    List<AggregateResult> results = [
        SELECT AccountId, COUNT(Id) oppCount, SUM(Amount) totalAmount
        FROM Opportunity
        WHERE AccountId IN :accountIds
        GROUP BY AccountId
    ];

    // Create Map for easy lookup
    Map<Id, AggregateResult> accountStatsMap = new Map<Id, AggregateResult>();
    for (AggregateResult ar : results) {
        accountStatsMap.put((Id)ar.get('AccountId'), ar);
    }

    // Update accounts
    for (Account acc : accounts) {
        AggregateResult stats = accountStatsMap.get(acc.Id);
        if (stats != null) {
            acc.Number_Of_Opportunities__c = (Integer)stats.get('oppCount');
            acc.Total_Opportunity_Amount__c = (Decimal)stats.get('totalAmount');
        }
    }
}
Pattern 3: Selective SOQL Queries
Make queries selective (use indexed fields) to avoid full table scans:
// ❌ BAD: Non-selective query (full table scan)
List<Account> accounts = [
    SELECT Id, Name
    FROM Account
    WHERE Custom_Field__c = 'Value' // Not indexed!
    LIMIT 10000
];

// ✅ GOOD: Selective query (uses index)
List<Account> accounts = [
    SELECT Id, Name
    FROM Account
    WHERE Id IN :accountIds // Id is always indexed
    AND Custom_Field__c = 'Value'
];
Selective Query Rules:
- Use indexed fields (Id, Name, OwnerId, CreatedDate, lookup and master-detail fields, External IDs)
- For custom fields, mark them Unique or External ID (both get indexes automatically) or request a custom index from Salesforce Support
- Avoid filter patterns the query optimizer cannot index, such as leading-wildcard LIKE ('%value'), negative operators (!=, NOT IN), and date functions such as CALENDAR_YEAR(), when possible
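As an illustration of the first rule, filtering on an External ID field keeps a query selective because such fields are indexed automatically; a minimal sketch, assuming a hypothetical External_Key__c text field marked as an External ID:
// ✅ GOOD: External ID fields are indexed automatically
// External_Key__c is a hypothetical custom field flagged as External ID
Set<String> externalKeys = new Set<String>{'A-1001', 'A-1002'};
List<Account> byKey = [
    SELECT Id, Name
    FROM Account
    WHERE External_Key__c IN :externalKeys
];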
Best Practices for Bulkification
1. Always Assume Bulk Data
Write triggers and methods that handle 1 record or 200 records:
// ✅ GOOD: Handles any number of records
public static void processRecords(List<SObject> records) {
    if (records == null || records.isEmpty()) {
        return;
    }
    // Bulkified logic here...
}
2. Use Collections Efficiently
- Lists: For ordered collections and DML operations
- Sets: For unique values and efficient membership testing
- Maps: For key-value lookups (Id → SObject is most common)
// ✅ GOOD: Efficient collection usage
Set<Id> accountIds = new Set<Id>(); // Unique IDs
Map<Id, Account> accountMap = new Map<Id, Account>(); // Fast lookups
List<Contact> contactsToInsert = new List<Contact>(); // For DML
3. Batch Processing for Large Datasets
For very large datasets, use Batch Apex:
// ✅ GOOD: Batch Apex for large datasets
public class ProcessAccountsBatch implements Database.Batchable<SObject> {
    public Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator('SELECT Id, Name FROM Account');
    }

    public void execute(Database.BatchableContext bc, List<Account> scope) {
        // Process batch of records (up to 200 by default)
        // Each batch gets its own governor limits
        processAccounts(scope); // bulkified helper defined elsewhere
    }

    public void finish(Database.BatchableContext bc) {
        // Cleanup or finalization logic
    }
}
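Launching the job is a single call; the optional second argument to Database.executeBatch sets the scope size per execute invocation (up to 2,000 for QueryLocator-based batches):
// Enqueue the batch job; the returned Id identifies the AsyncApexJob
Id jobId = Database.executeBatch(new ProcessAccountsBatch(), 200);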
4. Monitor Governor Limits
Use the Limits class to check how much headroom remains before performing expensive operations:
// ✅ GOOD: Check limits before operations
if (Limits.getQueries() < Limits.getLimitQueries() - 10) {
    // Safe to perform queries
    List<Account> accounts = [SELECT Id FROM Account LIMIT 10];
} else {
    // Use cached data or skip query
}
Common Mistakes to Avoid
1. SOQL in Loops
// ❌ NEVER DO THIS
for (Account acc : accounts) {
    List<Contact> contacts = [SELECT Id FROM Contact WHERE AccountId = :acc.Id];
}
2. DML in Loops
// ❌ NEVER DO THIS
for (Account acc : accounts) {
    insert new Contact(AccountId = acc.Id, LastName = 'Test');
}
3. Non-Selective Queries
// ❌ AVOID: May cause full table scan
List<Account> accounts = [SELECT Id FROM Account WHERE Custom_Text__c = 'Value'];
4. Forgetting Null Checks
// ❌ BAD: May cause NullPointerException
for (Opportunity opp : opportunities) {
    String accountName = opp.Account.Name; // Account might be null!
}

// ✅ GOOD: Null check
for (Opportunity opp : opportunities) {
    if (opp.Account != null) {
        String accountName = opp.Account.Name;
    }
}
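Apex's safe navigation operator (?.) expresses the same guard more compactly, short-circuiting to null instead of throwing when the left-hand side is null:
// ✅ GOOD: Safe navigation - yields null instead of throwing
for (Opportunity opp : opportunities) {
    String accountName = opp.Account?.Name;
}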
Testing Bulkification
Always test with bulk data (200 records):
@isTest
private class AccountTriggerTest {
    @isTest
    static void testBulkInsert() {
        // Create 200 accounts
        List<Account> accounts = new List<Account>();
        for (Integer i = 0; i < 200; i++) {
            accounts.add(new Account(Name = 'Test Account ' + i));
        }

        Test.startTest();
        insert accounts; // Should not hit governor limits
        Test.stopTest();

        // Verify results
        System.assertEquals(200, [SELECT COUNT() FROM Account WHERE Name LIKE 'Test Account%']);
    }
}
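A test can also guard against bulkification regressions by asserting how many queries the transaction consumed. A sketch of an additional method for the class above; the threshold of 10 is an illustrative bound to tune to your trigger's actual fixed query count:
@isTest
static void testQueryCountStaysFlat() {
    List<Account> accounts = new List<Account>();
    for (Integer i = 0; i < 200; i++) {
        accounts.add(new Account(Name = 'Bulk Account ' + i));
    }

    Test.startTest();
    insert accounts;
    // A bulkified trigger issues a constant number of queries,
    // independent of record volume; 10 is an illustrative bound
    System.assert(Limits.getQueries() < 10,
        'Too many queries: ' + Limits.getQueries());
    Test.stopTest();
}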
Summary
Key takeaways for writing bulkified Apex code:
- Never put SOQL or DML inside loops - Collect data first, then query/DML once
- Use Maps for efficient lookups - O(1) access time vs O(n) for Lists
- Query related records using parent-child queries - More efficient than separate queries
- Always test with 200 records - Ensures your code handles bulk scenarios
- Use selective queries - Leverage indexed fields to avoid full table scans
- Monitor governor limits - Use the Limits class to check remaining resources
- Use Batch Apex for large datasets - Each batch gets its own governor limits
By following these patterns and best practices, you’ll write Apex code that scales efficiently and avoids governor limit exceptions, even when processing hundreds of records in a single transaction.