Making HTTP requests in JavaScript is something you'll do constantly as a developer. Whether you're fetching user data from an API, submitting forms, or syncing with a backend, understanding how to handle these requests properly can make or break your application's reliability.

The good news? JavaScript offers multiple ways to make HTTP requests, from the modern Fetch API to battle-tested libraries like Axios. The bad news? Each approach has quirks, gotchas, and performance implications that most tutorials gloss over. This guide covers everything from basic GET requests to advanced patterns like retry logic and request caching—the stuff you actually need in production.

Understanding HTTP Requests in JavaScript

Before we dive into code, let's clarify what we're actually doing. An HTTP request is your browser (or Node.js app) asking a server for data or telling it to do something. The server processes that request and sends back a response—usually with some data and a status code telling you whether things went well (200 OK) or sideways (404 Not Found, 500 Internal Server Error).

There are several HTTP methods you'll use:

  • GET: Retrieve data from a server
  • POST: Send data to create something new
  • PUT/PATCH: Update existing data
  • DELETE: Remove data

The tricky part isn't making the request—it's handling errors gracefully, managing timeouts, dealing with flaky networks, and making your code maintainable. Let's start with the basics and work our way up to production-grade patterns.

Method 1: The Fetch API (The Modern Standard)

The Fetch API is built into modern browsers and Node.js (v18+). It's promise-based, which means you can use async/await syntax to write cleaner code than the callback hell of older approaches.

Here's the simplest GET request you can make:

fetch('https://api.example.com/users')
  .then(response => response.json())
  .then(data => console.log(data))
  .catch(error => console.error('Error:', error));

This looks clean, but there's a catch (pun intended). The Fetch API only rejects on network failures, not HTTP errors like 404 or 500. That means if your server returns an error, your .then() block will still run. Here's how to fix that:

fetch('https://api.example.com/users')
  .then(response => {
    if (!response.ok) {
      throw new Error(`HTTP error! Status: ${response.status}`);
    }
    return response.json();
  })
  .then(data => console.log(data))
  .catch(error => console.error('Error:', error));

Now we're checking response.ok, which is false for any status code outside the 200-299 range. This is a common gotcha that trips up developers switching from libraries like Axios.

Making POST Requests with Fetch

Sending data to a server requires a bit more configuration:

const userData = {
  name: 'Sarah Chen',
  email: 'sarah@example.com'
};

fetch('https://api.example.com/users', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
  },
  body: JSON.stringify(userData)
})
  .then(response => {
    if (!response.ok) {
      throw new Error(`HTTP error! Status: ${response.status}`);
    }
    return response.json();
  })
  .then(data => console.log('User created:', data))
  .catch(error => console.error('Error:', error));

The key points here:

  • Set the method to 'POST' (or 'PUT', 'PATCH', 'DELETE')
  • Specify the Content-Type header so the server knows you're sending JSON
  • Convert your JavaScript object to a JSON string with JSON.stringify()

Using Async/Await for Cleaner Code

Promise chains can get messy. Here's the same request using async/await:

async function createUser(userData) {
  try {
    const response = await fetch('https://api.example.com/users', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
      },
      body: JSON.stringify(userData)
    });

    if (!response.ok) {
      throw new Error(`HTTP error! Status: ${response.status}`);
    }

    const data = await response.json();
    console.log('User created:', data);
    return data;
  } catch (error) {
    console.error('Error creating user:', error);
    throw error;
  }
}

This is easier to read and reason about, especially when you need to make multiple sequential requests.
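
For example, here's a short sketch of two dependent calls, where the second request needs the result of the first (the /users/:id and /users/:id/posts endpoints are placeholders, not a real API):

async function getUserWithPosts(userId) {
  // First request: the user record
  const userResponse = await fetch(`https://api.example.com/users/${userId}`);
  if (!userResponse.ok) {
    throw new Error(`HTTP error! Status: ${userResponse.status}`);
  }
  const user = await userResponse.json();

  // Second request depends on the first one completing
  const postsResponse = await fetch(`https://api.example.com/users/${user.id}/posts`);
  if (!postsResponse.ok) {
    throw new Error(`HTTP error! Status: ${postsResponse.status}`);
  }
  const posts = await postsResponse.json();

  return { user, posts };
}

If the requests don't depend on each other, run them in parallel with Promise.all instead of awaiting them one at a time.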

Method 2: XMLHttpRequest (Legacy but Still Relevant)

Before Fetch existed, we had XMLHttpRequest. It's more verbose and uses callbacks instead of promises, but you'll still see it in older codebases. Here's what a GET request looks like:

const xhr = new XMLHttpRequest();

xhr.open('GET', 'https://api.example.com/users', true);

xhr.onload = function() {
  if (xhr.status >= 200 && xhr.status < 300) {
    const data = JSON.parse(xhr.responseText);
    console.log(data);
  } else {
    console.error('Request failed with status:', xhr.status);
  }
};

xhr.onerror = function() {
  console.error('Network error occurred');
};

xhr.send();

The third parameter in xhr.open() determines whether the request is asynchronous (true) or synchronous (false). Never use synchronous requests—they'll freeze the entire browser tab until the response comes back.

Why would you use XMLHttpRequest in 2025? You probably wouldn't for new projects, but understanding it helps when maintaining legacy code or working with libraries that haven't fully migrated to Fetch.
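
If you do end up touching XMLHttpRequest in otherwise modern code, a common bridge is wrapping it in a promise so it composes with async/await. Here's a minimal sketch (the xhrGet helper is illustrative, not a drop-in Fetch replacement):

function xhrGet(url) {
  return new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    xhr.open('GET', url, true);

    xhr.onload = () => {
      if (xhr.status >= 200 && xhr.status < 300) {
        resolve(JSON.parse(xhr.responseText));
      } else {
        reject(new Error(`Request failed with status: ${xhr.status}`));
      }
    };

    xhr.onerror = () => reject(new Error('Network error occurred'));
    xhr.send();
  });
}

// Usage: const users = await xhrGet('https://api.example.com/users');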

Method 3: Third-Party Libraries (Axios)

Axios is the most popular HTTP client library in JavaScript, with over 100 million downloads per month on npm. It wraps XMLHttpRequest (and now optionally Fetch) with a cleaner API and better defaults.

First, install it:

npm install axios

Here's a basic GET request:

import axios from 'axios';

axios.get('https://api.example.com/users')
  .then(response => console.log(response.data))
  .catch(error => console.error('Error:', error));

Notice anything different? Axios automatically:

  • Parses JSON responses (no need to call .json())
  • Throws errors for HTTP error statuses (4xx, 5xx), exposing the details on error.response (see the sketch after this list)
  • Has shorter syntax for common operations
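
A POST looks just as compact, and because Axios rejects on HTTP errors, the details land on error.response whenever the server actually responded. A quick sketch using the same placeholder endpoint as before:

axios.post('https://api.example.com/users', {
  name: 'Sarah Chen',
  email: 'sarah@example.com'
})
  .then(response => console.log('User created:', response.data))
  .catch(error => {
    if (error.response) {
      // The server responded with an error status (4xx, 5xx)
      console.error('HTTP error:', error.response.status, error.response.data);
    } else {
      // No response at all (network failure, timeout, cancelled request)
      console.error('Request failed:', error.message);
    }
  });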

Why Developers Still Choose Axios

Despite Fetch being native to the browser, Axios remains popular because of features like:

Request/Response Interceptors: Run code before every request or after every response.

// Add authentication token to all requests
axios.interceptors.request.use(config => {
  const token = localStorage.getItem('authToken');
  if (token) {
    config.headers.Authorization = `Bearer ${token}`;
  }
  return config;
});

// Handle errors globally
axios.interceptors.response.use(
  response => response,
  error => {
    if (error.response?.status === 401) {
      // Redirect to login
      window.location.href = '/login';
    }
    return Promise.reject(error);
  }
);

Automatic Timeouts: Set a timeout easily without extra code.

axios.get('https://api.example.com/users', {
  timeout: 5000 // 5 seconds
})
  .catch(error => {
    if (error.code === 'ECONNABORTED') {
      console.log('Request timeout');
    }
  });

With Fetch, you'd need to implement timeout logic manually using AbortController:

const controller = new AbortController();
const timeoutId = setTimeout(() => controller.abort(), 5000);

fetch('https://api.example.com/users', {
  signal: controller.signal
})
  .then(response => {
    clearTimeout(timeoutId);
    return response.json();
  })
  .catch(error => {
    if (error.name === 'AbortError') {
      console.log('Request timeout');
    }
  });
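
Recent browsers and Node releases also ship AbortSignal.timeout(), which shortens the Fetch version considerably; check support for the environments you target before relying on it:

fetch('https://api.example.com/users', {
  // Aborts the request automatically after 5 seconds
  signal: AbortSignal.timeout(5000)
})
  .then(response => response.json())
  .catch(error => {
    if (error.name === 'TimeoutError' || error.name === 'AbortError') {
      console.log('Request timeout');
    }
  });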

The Trade-off: Axios adds about 13.5KB (gzipped) to your bundle size. For simple apps making a few API calls, Fetch is probably sufficient. For complex applications with dozens of endpoints, authentication, and error handling, Axios can save you from writing a lot of boilerplate.

Advanced: Error Handling That Actually Works

Most tutorials show you the happy path. Here's how to handle errors properly in production.

Differentiating Error Types

Not all errors are created equal. Network failures need different handling than validation errors:

async function fetchUserData(userId) {
  try {
    const response = await fetch(`https://api.example.com/users/${userId}`);
    
    if (!response.ok) {
      // HTTP errors (4xx, 5xx)
      if (response.status === 404) {
        throw new Error('User not found');
      }
      if (response.status === 401) {
        throw new Error('Unauthorized - please log in');
      }
      if (response.status >= 500) {
        throw new Error('Server error - please try again later');
      }
      throw new Error(`HTTP error ${response.status}`);
    }

    return await response.json();
  } catch (error) {
    // Network errors: fetch rejects with a TypeError, and the message text
    // varies by browser, so don't match on specific wording
    if (error instanceof TypeError) {
      console.error('Network error - check your internet connection');
      throw new Error('Network error - please check your connection');
    }
    
    // Re-throw other errors
    throw error;
  }
}

This approach lets you show appropriate error messages to users instead of generic "something went wrong" messages.

Creating Custom Error Classes

For larger applications, custom error classes make error handling more structured:

class HTTPError extends Error {
  constructor(response) {
    super(`HTTP Error ${response.status}`);
    this.name = 'HTTPError';
    this.status = response.status;
    this.response = response;
  }
}

class NetworkError extends Error {
  constructor(message) {
    super(message);
    this.name = 'NetworkError';
  }
}

async function fetchWithErrorHandling(url, options = {}) {
  try {
    const response = await fetch(url, options);
    
    if (!response.ok) {
      throw new HTTPError(response);
    }
    
    return await response.json();
  } catch (error) {
    if (error instanceof TypeError) {
      throw new NetworkError('Failed to connect to server');
    }
    throw error;
  }
}

// Usage
try {
  const data = await fetchWithErrorHandling('https://api.example.com/users');
} catch (error) {
  if (error instanceof NetworkError) {
    showNotification('Check your internet connection');
  } else if (error instanceof HTTPError && error.status === 401) {
    redirectToLogin();
  } else {
    showNotification('Something went wrong. Please try again.');
  }
}

Advanced: Implementing Retry Logic

Network requests fail. Servers hiccup. It happens. Implementing smart retry logic can turn a 500 error into a successful request without bothering the user.

Simple Retry with Fixed Delay

Here's a basic retry function that attempts a request up to 3 times with a 1-second delay:

async function fetchWithRetry(url, options = {}, retries = 3, delay = 1000) {
  for (let i = 0; i < retries; i++) {
    try {
      const response = await fetch(url, options);
      
      if (!response.ok) {
        throw new Error(`HTTP error ${response.status}`);
      }
      
      return await response.json();
    } catch (error) {
      const isLastAttempt = i === retries - 1;
      
      if (isLastAttempt) {
        throw error;
      }
      
      console.log(`Attempt ${i + 1} failed. Retrying in ${delay}ms...`);
      await new Promise(resolve => setTimeout(resolve, delay));
    }
  }
}

This works, but there's a better approach.

Exponential Backoff (The Professional Way)

Exponential backoff increases the delay between retries. This prevents overwhelming a struggling server with retry requests:

async function fetchWithExponentialBackoff(url, options = {}, maxRetries = 3) {
  let retries = 0;
  
  while (retries < maxRetries) {
    try {
      const response = await fetch(url, options);
      
      if (!response.ok) {
        throw new Error(`HTTP error ${response.status}`);
      }
      
      return await response.json();
    } catch (error) {
      retries++;
      
      if (retries === maxRetries) {
        throw new Error(`Request failed after ${maxRetries} retries: ${error.message}`);
      }
      
      // Calculate delay: 1s, 2s, 4s, ... capped at 10s
      const delayMs = Math.min(1000 * Math.pow(2, retries - 1), 10000);
      
      // Add jitter to prevent thundering herd
      const jitter = Math.random() * 1000;
      const finalDelay = delayMs + jitter;
      
      console.log(`Retry ${retries}/${maxRetries} after ${Math.round(finalDelay)}ms`);
      
      await new Promise(resolve => setTimeout(resolve, finalDelay));
    }
  }
}

The jitter (random delay) is important. Without it, if 100 clients all fail simultaneously, they'll all retry at exactly the same time, potentially overwhelming the server again.

When NOT to Retry

Don't blindly retry everything. Some errors shouldn't be retried:

function shouldRetry(error, response) {
  // Don't retry client errors (4xx) except 429 (rate limit)
  if (response?.status >= 400 && response?.status < 500) {
    return response.status === 429;
  }
  
  // Retry server errors (5xx)
  if (response?.status >= 500) {
    return true;
  }
  
  // Retry network errors
  return error instanceof TypeError;
}

async function smartFetchWithRetry(url, options = {}, maxRetries = 3) {
  let retries = 0;
  
  while (retries <= maxRetries) {
    try {
      const response = await fetch(url, options);
      
      if (!response.ok) {
        // Attach the response so the catch block can decide whether to retry
        const error = new Error(`HTTP error ${response.status}`);
        error.response = response;
        throw error;
      }
      
      return await response.json();
    } catch (error) {
      // error.response is set for HTTP errors above; it's undefined for network errors
      if (retries === maxRetries || !shouldRetry(error, error.response)) {
        throw error;
      }
      
      retries++;
      const delay = Math.min(1000 * Math.pow(2, retries - 1), 10000);
      await new Promise(resolve => setTimeout(resolve, delay + Math.random() * 1000));
    }
  }
}
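
For 429 responses specifically, many APIs send a Retry-After header telling you exactly how long to wait. Here's a small helper you could plug into the retry loop, honoring the header when present and falling back to exponential backoff otherwise (getRetryDelay is an illustrative name, and some servers send an HTTP date instead of seconds):

function getRetryDelay(response, attempt) {
  // Retry-After is usually a number of seconds
  const retryAfter = response?.headers?.get('Retry-After');
  const seconds = Number(retryAfter);

  if (retryAfter && !Number.isNaN(seconds)) {
    return seconds * 1000;
  }

  // Fall back to exponential backoff with jitter
  return Math.min(1000 * Math.pow(2, attempt), 10000) + Math.random() * 1000;
}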

Advanced: Request Caching Strategies

Making the same API call repeatedly wastes bandwidth and slows your app. Smart caching can eliminate unnecessary requests entirely.

Simple In-Memory Cache

Here's a basic cache that stores responses for 5 minutes:

const cache = new Map();
const CACHE_DURATION = 5 * 60 * 1000; // 5 minutes

async function fetchWithCache(url, options = {}) {
  const cacheKey = url + JSON.stringify(options);
  const cached = cache.get(cacheKey);
  
  if (cached && Date.now() - cached.timestamp < CACHE_DURATION) {
    console.log('Returning cached response');
    return cached.data;
  }
  
  const response = await fetch(url, options);
  
  if (!response.ok) {
    throw new Error(`HTTP error ${response.status}`);
  }
  
  const data = await response.json();
  
  cache.set(cacheKey, {
    data,
    timestamp: Date.now()
  });
  
  return data;
}

This works great for read-heavy applications where data doesn't change frequently.

Invalidating Cache on Mutations

When you POST, PUT, or DELETE, you need to invalidate related cached data:

const cache = new Map();

async function fetchWithCache(url, options = {}) {
  const method = options.method || 'GET';
  const cacheKey = url;
  
  // For mutations, invalidate any related cached GETs (keys that share a URL
  // prefix with the mutated resource) and don't cache the response
  if (method !== 'GET') {
    for (const key of cache.keys()) {
      if (key.startsWith(url) || url.startsWith(key)) {
        cache.delete(key);
      }
    }
    const response = await fetch(url, options);
    if (!response.ok) throw new Error(`HTTP error ${response.status}`);
    return await response.json();
  }
  
  // For GET requests, check cache
  const cached = cache.get(cacheKey);
  if (cached && Date.now() - cached.timestamp < 5 * 60 * 1000) {
    return cached.data;
  }
  
  const response = await fetch(url, options);
  if (!response.ok) throw new Error(`HTTP error ${response.status}`);
  
  const data = await response.json();
  cache.set(cacheKey, { data, timestamp: Date.now() });
  
  return data;
}

// Usage
await fetchWithCache('https://api.example.com/users'); // Fetches and caches
await fetchWithCache('https://api.example.com/users'); // Returns from cache

// Update a user
await fetchWithCache('https://api.example.com/users/123', {
  method: 'PUT',
  body: JSON.stringify({ name: 'Updated Name' })
}); // Clears the cache

await fetchWithCache('https://api.example.com/users'); // Fresh fetch, not cached

Stale-While-Revalidate Pattern

This advanced pattern returns cached data immediately while fetching fresh data in the background:

async function staleWhileRevalidate(url, options = {}) {
  const cacheKey = url;
  const cached = cache.get(cacheKey);
  
  if (cached) {
    // Return cached data immediately
    setTimeout(() => {
      // Fetch fresh data in background
      fetch(url, options)
        .then(response => response.json())
        .then(data => {
          cache.set(cacheKey, {
            data,
            timestamp: Date.now()
          });
        })
        .catch(error => console.error('Background refresh failed:', error));
    }, 0);
    
    return cached.data;
  }
  
  // No cache, fetch normally
  const response = await fetch(url, options);
  const data = await response.json();
  
  cache.set(cacheKey, { data, timestamp: Date.now() });
  
  return data;
}

This gives users instant responses while keeping data fresh. It's perfect for dashboards or feeds where you want snappy performance but can tolerate slightly stale data.

Performance Tips and Best Practices

1. Use Request Deduplication

If multiple parts of your app request the same data simultaneously, you're making redundant requests:

const inflightRequests = new Map();

async function fetchWithDeduplication(url, options = {}) {
  const key = url + JSON.stringify(options);
  
  // Return existing promise if request is in flight
  if (inflightRequests.has(key)) {
    console.log('Returning in-flight request');
    return inflightRequests.get(key);
  }
  
  const promise = fetch(url, options)
    .then(response => response.json())
    .finally(() => {
      inflightRequests.delete(key);
    });
  
  inflightRequests.set(key, promise);
  return promise;
}
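
Usage: when two parts of the UI ask for the same resource at the same time, they share one network request (the endpoint is the same placeholder as before):

// Both calls resolve from a single fetch; only one request hits the network
const [profile, sidebar] = await Promise.all([
  fetchWithDeduplication('https://api.example.com/users/123'),
  fetchWithDeduplication('https://api.example.com/users/123')
]);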

2. Cancel Unnecessary Requests

If a user navigates away before a request completes, cancel it:

const controller = new AbortController();

fetch('https://api.example.com/large-dataset', {
  signal: controller.signal
})
  .then(response => response.json())
  .catch(error => {
    if (error.name === 'AbortError') {
      console.log('Request cancelled');
    }
  });

// Cancel the request if user navigates away
window.addEventListener('beforeunload', () => {
  controller.abort();
});
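
The same idea applies when a newer request supersedes an older one, as in search-as-you-type. Here's a sketch that aborts the previous search before issuing the next (the /search endpoint is a placeholder):

let searchController = null;

async function search(query) {
  // Cancel the previous search if it's still in flight
  if (searchController) {
    searchController.abort();
  }
  searchController = new AbortController();

  try {
    const response = await fetch(
      `https://api.example.com/search?q=${encodeURIComponent(query)}`,
      { signal: searchController.signal }
    );
    return await response.json();
  } catch (error) {
    if (error.name === 'AbortError') {
      return null; // Superseded by a newer search, so ignore it
    }
    throw error;
  }
}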

3. Batch Multiple Requests

Instead of making 10 separate API calls, batch them:

async function fetchMultipleUsers(userIds) {
  // Bad: 10 separate requests
  // const users = await Promise.all(
  //   userIds.map(id => fetch(`/api/users/${id}`))
  // );
  
  // Good: Single batched request
  const response = await fetch('/api/users/batch', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ ids: userIds })
  });
  
  return response.json();
}

This requires server support, but it dramatically reduces network overhead.

4. Compress Request Payloads

For large POST requests, consider compressing the data. Keep in mind this only helps if the server is set up to decompress gzip-encoded request bodies:

async function fetchWithCompression(url, data) {
  const blob = new Blob([JSON.stringify(data)], {
    type: 'application/json'
  });
  
  // Use CompressionStream API (modern browsers)
  const compressedStream = blob.stream().pipeThrough(
    new CompressionStream('gzip')
  );
  
  const compressedBlob = await new Response(compressedStream).blob();
  
  return fetch(url, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Content-Encoding': 'gzip'
    },
    body: compressedBlob
  });
}

Choosing the Right Approach for Your Project

After all this, which method should you use? Here's my take:

Use Fetch if:

  • You're building a small to medium-sized app
  • You don't need complex features like interceptors
  • Bundle size is a primary concern
  • You're working in an environment where third-party dependencies are restricted

Use Axios if:

  • You're building a large application with many API endpoints
  • You need request/response interceptors for authentication or logging
  • You want automatic request timeout handling
  • You need better TypeScript support out of the box
  • Your team is already familiar with Axios

Use XMLHttpRequest if:

  • You're maintaining legacy code
  • You need to support very old browsers without polyfills (no version of Internet Explorer ships the Fetch API)
  • You're working with specific features XMLHttpRequest offers, like upload progress tracking with xhr.upload.onprogress (see the sketch below)
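
For reference, here's roughly what that upload progress tracking looks like; a minimal sketch with a placeholder /upload endpoint:

function uploadWithProgress(file) {
  return new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    xhr.open('POST', 'https://api.example.com/upload', true);

    // Fires periodically as the request body is sent
    xhr.upload.onprogress = event => {
      if (event.lengthComputable) {
        const percent = Math.round((event.loaded / event.total) * 100);
        console.log(`Uploaded ${percent}%`);
      }
    };

    xhr.onload = () => resolve(xhr.responseText);
    xhr.onerror = () => reject(new Error('Upload failed'));

    const formData = new FormData();
    formData.append('file', file);
    xhr.send(formData);
  });
}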

Wrapping Up

Making HTTP requests in JavaScript starts simple but gets complex fast when you consider production requirements. The Fetch API gives you a solid foundation, but understanding error handling, retry logic, and caching strategies is what separates hobby projects from professional applications.

The patterns we've covered—exponential backoff, stale-while-revalidate, request deduplication—aren't just theoretical exercises. They're battle-tested solutions that improve reliability and user experience. Start with the basics, but don't be afraid to implement these advanced patterns when your app needs them.

And remember: the best HTTP client is the one that fits your specific needs. Don't cargo-cult Axios into every project because it's popular, but don't avoid it just because Fetch is native. Make informed decisions based on your requirements, and your users (and future maintainers) will thank you.