CPU Timeout
Critical Limit Breach

System.LimitException: Apex CPU time limit exceeded

Your code runs perfectly in Sandbox with 100 test records. Production has 50,000. Those nested loops you wrote? They're O(n²). With 50,000 records, that's 2.5 billion iterations. The CPU timeout hits at 10 seconds, and your transaction dies.

6,200 developers search for this error every month. The worst part? It only manifests under real data volumes, making it nearly impossible to catch in dev.

Jataka catches CPU timeout in real time
Watch: Nested loops burn CPU → Jataka blocks the PR (Loom recording, 1:15)

The Limit: 10 seconds synchronous / 60 seconds async

10s — maximum CPU time for synchronous Apex (triggers, controllers)

60s — maximum CPU time for asynchronous Apex (Batch, Future, Queueable)

6 hrs — average downtime from a CPU timeout in production

CPU time includes: Apex execution, formula evaluation, workflow and process automation, validation rules, and trigger recursion. Database time spent on SOQL queries and DML is not counted. Your 5-second trigger might actually consume 8 seconds once you factor in all the automation that fires after your code runs.
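Apex exposes the running CPU count through the standard Limits class, so you can instrument hot paths yourself. A minimal sketch — the 90% threshold and the deferral idea are illustrative, not a Jataka feature:

```apex
// Check CPU consumption mid-transaction via the standard Limits class
Integer used = Limits.getCpuTime();       // milliseconds consumed so far
Integer budget = Limits.getLimitCpuTime(); // 10,000 sync / 60,000 async

// Illustrative guard: react before the hard limit kills the transaction,
// e.g. by deferring the remaining work to a Queueable
if (used * 10 > budget * 9) {
    System.debug(LoggingLevel.WARN,
        'CPU at ' + used + 'ms of ' + budget + 'ms — consider deferring work');
}
```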

The Bad Code

Nested loops. The silent killer. O(n²) complexity grows quadratically with data volume: double the records, quadruple the work.

CommissionCalculator.cls — ❌ Anti-Pattern
// ❌ BAD: Nested loops with O(n²) complexity
// Works fine with 100 records in Sandbox
// Burns through CPU time with 10,000+ records in Production

public void calculateCommission(List<Opportunity> opps) {
    for (Opportunity opp1 : opps) {
        for (Opportunity opp2 : opps) {
            // O(n²) comparison - CPU grows quadratically with record count
            if (opp1.AccountId == opp2.AccountId) {
                Decimal commission = calculateComplexFormula(opp1, opp2);
                opp1.Commission__c = commission;
            }
        }
    }
    update opps;
}
// With 10,000 opportunities = 100,000,000 iterations
// CPU timeout at 10 seconds

Why this is dangerous: Static analysis can't predict CPU time because it depends on data volume. Your Sandbox has 100 records. Production has 50,000. Only runtime profiling reveals the truth.
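You can approximate this kind of runtime profiling yourself in an Apex test: generate production-scale data, then measure CPU before and after the call. A sketch, assuming the CommissionCalculator class from this page and an org where Opportunity needs only these fields (9,999 rows keeps setup DML within the 10,000-row limit):

```apex
@isTest
private class CommissionCalculatorCpuTest {
    @isTest
    static void staysUnderCpuBudgetAt10kRecords() {
        // Arrange: one account plus 9,999 opportunities = 10,000 DML rows
        Account acc = new Account(Name = 'Big Account');
        insert acc;
        List<Opportunity> opps = new List<Opportunity>();
        for (Integer i = 0; i < 9999; i++) {
            opps.add(new Opportunity(Name = 'Opp ' + i,
                                     StageName = 'Prospecting',
                                     CloseDate = Date.today(),
                                     AccountId = acc.Id));
        }
        insert opps;

        Test.startTest(); // resets governor limits for the measured section
        Integer before = Limits.getCpuTime();
        new CommissionCalculator().calculateCommission(opps);
        Integer elapsed = Limits.getCpuTime() - before;
        Test.stopTest();

        System.assert(elapsed < 10000,
            'CPU budget exceeded: ' + elapsed + 'ms');
    }
}
```

Run against the nested-loop version, this test fails exactly the way production would — which is the point.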

Jataka Report Card

Jataka executes this code with production-scale data in Sandbox. We measure actual CPU consumption.

PR #428 Blocked — 8 minutes ago

CPU Time: 12,847ms / 10,000ms
SOQL Queries: 3 / 100
DML Statements: 1 / 150
Records Tested: 10,000

CPU Timeout Detected

Transaction consumed 12,847ms CPU time. Limit is 10,000ms. Found nested loops with O(n²) complexity at line 5.

The Fix

Use Maps to achieve O(n) complexity. Linear time regardless of data volume.

CommissionCalculator.cls — ✓ Optimized
// ✅ GOOD: Use Maps for O(n) complexity
// Linear time regardless of record count

public void calculateCommission(List<Opportunity> opps) {
    // Group by AccountId using a Map
    Map<Id, List<Opportunity>> oppsByAccount = new Map<Id, List<Opportunity>>();
    
    for (Opportunity opp : opps) {
        if (!oppsByAccount.containsKey(opp.AccountId)) {
            oppsByAccount.put(opp.AccountId, new List<Opportunity>());
        }
        oppsByAccount.get(opp.AccountId).add(opp);
    }
    
    // Process each account's opportunities once - no pairwise scan needed
    for (List<Opportunity> accountOpps : oppsByAccount.values()) {
        for (Opportunity opp : accountOpps) {
            // The formula now runs per opportunity, with its account group in scope
            opp.Commission__c = calculateFormula(opp);
        }
    }
    update opps;
}

Result: O(n) complexity. 10,000 records process in 847ms, a 93% reduction from 12,847ms. Production stays online.
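When even a linear pass is too heavy for a synchronous trigger, the other lever in the numbers above is the async limit: moving the work to a Queueable trades the 10-second CPU budget for 60 seconds. A sketch — CommissionQueueable is illustrative, and the calculation body is the O(n) pattern from this page:

```apex
// Illustrative: defer heavy commission math to an async context (60s CPU budget)
public class CommissionQueueable implements Queueable {
    private List<Id> oppIds;

    public CommissionQueueable(List<Id> oppIds) {
        this.oppIds = oppIds;
    }

    public void execute(QueueableContext ctx) {
        List<Opportunity> opps = [SELECT Id, AccountId, Commission__c
                                  FROM Opportunity WHERE Id IN :oppIds];
        // ... run the O(n) map-based calculation here, then persist:
        update opps;
    }
}

// From the trigger handler, hand off the record Ids:
// System.enqueueJob(new CommissionQueueable(new List<Id>(Trigger.newMap.keySet())));
```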


Stop production timeouts

Jataka catches CPU timeout
before the merge.

Book a demo and watch Jataka profile CPU time with production-scale data. Your transactions stay fast.