Retry with Backoff

Automatic retry for transient failures

Network requests can fail transiently: a dropped connection, a momentarily overloaded server, or a rate limit. Retry logic with exponential backoff makes your application more resilient to these failures.

Exponential backoff means waiting progressively longer between retries (1s, 2s, 4s, 8s...) to give the server time to recover.
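The doubling schedule boils down to one formula. A minimal sketch as a pure helper (the name `backoffDelay` is illustrative, not used in the code below):

```javascript
// Exponential backoff delay: baseDelay doubles with each attempt.
// backoffDelay(0) -> 1000, backoffDelay(1) -> 2000, backoffDelay(2) -> 4000, ...
function backoffDelay(attempt, baseDelay = 1000) {
  return baseDelay * Math.pow(2, attempt);
}
```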

Simple Retry

```javascript
async function fetchWithRetry(url, options = {}, retries = 3) {
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      const response = await fetch(url, options);

      // Only retry on 5xx errors (server issues)
      if (response.status >= 500 && attempt < retries) {
        console.log(`Attempt ${attempt + 1} failed, retrying...`);
        continue;
      }

      return response;
    } catch (error) {
      // Network error - retry if we have attempts left
      if (attempt === retries) throw error;
      console.log(`Network error, retrying...`);
    }
  }
}
```
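The retry decision inside the loop can be factored into a small predicate, which makes it easy to unit-test without a network. `shouldRetryStatus` is a hypothetical helper, not part of the function above:

```javascript
// Hypothetical helper: retry only on 5xx responses while attempts remain.
function shouldRetryStatus(status, attempt, retries) {
  return status >= 500 && attempt < retries;
}
```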

Exponential Backoff

```javascript
function sleep(ms) {
  return new Promise(resolve => setTimeout(resolve, ms));
}

async function fetchWithBackoff(url, options = {}) {
  const maxRetries = 4;
  const baseDelay = 1000; // 1 second

  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      const response = await fetch(url, options);

      // Don't retry client errors (4xx)
      if (response.status >= 400 && response.status < 500) {
        return response;
      }

      // Retry server errors (5xx)
      if (!response.ok && attempt < maxRetries) {
        const delay = baseDelay * Math.pow(2, attempt);
        console.log(`Retry ${attempt + 1} in ${delay}ms...`);
        await sleep(delay);
        continue;
      }

      return response;
    } catch (error) {
      if (attempt === maxRetries) throw error;
      const delay = baseDelay * Math.pow(2, attempt);
      console.log(`Network error. Retry in ${delay}ms...`);
      await sleep(delay);
    }
  }
}
```

Backoff schedule:

  • Attempt 1: immediate
  • Attempt 2: after a 1-second wait
  • Attempt 3: after a 2-second wait
  • Attempt 4: after a 4-second wait
  • Attempt 5: after an 8-second wait (then give up)
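In practice the exponential delay is usually capped so late retries don't wait arbitrarily long. A minimal sketch; the `cappedBackoff` name and the 10-second cap are assumptions, not part of the code above:

```javascript
// Exponential backoff with an upper bound on the wait (hypothetical helper).
// The maxDelay cap is an assumed example value.
function cappedBackoff(attempt, baseDelay = 1000, maxDelay = 10000) {
  return Math.min(baseDelay * Math.pow(2, attempt), maxDelay);
}
```

Without a cap, attempt 10 would wait over 17 minutes; with the cap, every late retry waits at most `maxDelay`.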

With Jitter (Recommended)

```javascript
// Add randomness to prevent thundering herd
function getBackoffDelay(attempt, baseDelay = 1000) {
  const exponentialDelay = baseDelay * Math.pow(2, attempt);
  const jitter = Math.random() * 1000; // 0-1000ms random
  return exponentialDelay + jitter;
}

async function fetchWithJitter(url, options = {}) {
  const config = {
    maxRetries: 4,
    baseDelay: 1000,
    retryOn: [408, 429, 500, 502, 503, 504]
  };

  for (let attempt = 0; attempt <= config.maxRetries; attempt++) {
    try {
      const response = await fetch(url, options);

      if (config.retryOn.includes(response.status) && attempt < config.maxRetries) {
        const delay = getBackoffDelay(attempt, config.baseDelay);
        await sleep(delay);
        continue;
      }

      return response;
    } catch (error) {
      if (attempt === config.maxRetries) throw error;
      const delay = getBackoffDelay(attempt, config.baseDelay);
      await sleep(delay);
    }
  }
}
```

Jitter adds randomness to prevent many clients from retrying at the exact same time (thundering herd problem). This is especially important for rate-limited APIs.
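An alternative to adding a fixed random offset is "full jitter", where the wait is drawn uniformly between zero and the exponential ceiling, spreading clients out even more. A sketch; the `fullJitterDelay` name is assumed:

```javascript
// Full jitter: pick a random delay anywhere in [0, baseDelay * 2^attempt).
function fullJitterDelay(attempt, baseDelay = 1000) {
  const ceiling = baseDelay * Math.pow(2, attempt);
  return Math.random() * ceiling;
}
```

The trade-off is that some retries happen almost immediately, but across many clients the load on the server is spread much more evenly.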
