Making HTTPS requests in Node.js is fundamental for any backend developer. Whether you're calling third-party APIs, scraping data, or building microservices, you need a reliable way to handle encrypted HTTP traffic.

In this guide, we'll walk through 12 different methods to make HTTPS requests—from Node's native modules to third-party libraries, and even some edge cases that most developers miss.

Why HTTPS Request Methods Matter

The method you choose affects performance, bundle size, and how you handle errors. Native modules give you maximum control but require more boilerplate. Libraries like Axios make life easier but add dependencies. Understanding your options helps you pick the right tool for each scenario.

Quick comparison: The native https module offers maximum control with zero dependencies. Undici delivers blazing speed (3x faster than Axios in some benchmarks). The native Fetch API (Node 18+) provides cross-platform compatibility. Each has its place.

1. Native HTTPS Module (Built-in)

The https module ships with Node.js and gives you complete control over requests. It's event-driven, which means you handle chunks of data as they arrive.

const https = require('https');

const options = {
  hostname: 'api.github.com',
  port: 443,
  path: '/users/github',
  method: 'GET',
  headers: {
    'User-Agent': 'Node.js Client'
  }
};

Here we're setting up the basic configuration. GitHub's API requires a User-Agent header, so we include it. The port is 443 for HTTPS.

const req = https.request(options, (res) => {
  let data = '';
  
  res.on('data', (chunk) => {
    data += chunk;
  });
  
  res.on('end', () => {
    console.log(JSON.parse(data));
  });
});

The response comes in chunks through the data event. We accumulate them into a string, then parse the JSON once the stream ends. This streaming approach is memory-efficient for large responses.

req.on('error', (error) => {
  console.error('Request failed:', error.message);
});

req.end();

Always call req.end() to actually fire the request. The error handler catches network issues, DNS failures, or connection timeouts.

When to use: When you need zero dependencies and full control over the request lifecycle. Perfect for libraries or when bundle size matters.

2. Native Fetch API (Node 18+)

Starting with Node 18, fetch is available globally—no imports needed. It's the same API you use in browsers, powered by Undici under the hood.

async function fetchUser() {
  try {
    const response = await fetch('https://api.github.com/users/github', {
      method: 'GET',
      headers: {
        'User-Agent': 'Node.js Client'
      }
    });
    
    if (!response.ok) {
      throw new Error(`HTTP error: ${response.status}`);
    }
    
    const data = await response.json();
    console.log(data);
  } catch (error) {
    console.error('Fetch failed:', error.message);
  }
}

fetchUser();

The fetch API is promise-based, making it cleaner than the callback-style https module. The response.ok check catches HTTP errors (4xx, 5xx), which fetch doesn't automatically treat as exceptions.
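
A POST follows the same shape; note that fetch never stringifies the body for you. A sketch against a stand-in httpbin.org endpoint:

```javascript
// Node 18+: fetch is available globally. httpbin.org is a stand-in endpoint.
async function createPost() {
  try {
    const response = await fetch('https://httpbin.org/post', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ title: 'Test', body: 'Content' })  // stringify manually
    });

    if (!response.ok) {
      throw new Error(`HTTP error: ${response.status}`);
    }

    console.log(await response.json());
  } catch (error) {
    console.error('POST failed:', error.message);
  }
}

createPost();
```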

When to use: Default choice for Node 18+. Great for cross-platform code that runs in both browser and server environments.

3. Axios (Most Popular Library)

Axios dominated the Node.js HTTP landscape for years. It automatically transforms JSON, handles errors better than fetch, and supports request/response interceptors.

const axios = require('axios');

axios.get('https://api.github.com/users/github', {
  headers: {
    'User-Agent': 'Node.js Client'
  }
})
.then(response => {
  console.log(response.data);
})
.catch(error => {
  if (error.response) {
    console.error(`Error: ${error.response.status}`);
  } else {
    console.error('Request failed:', error.message);
  }
});

Axios automatically parses JSON responses—no need to call .json(). The error handling distinguishes between network failures and HTTP errors, giving you more context.

To add retry logic via the axios-retry plugin (shown here with a POST request):

const axiosRetry = require('axios-retry');

axiosRetry(axios, { 
  retries: 3,
  retryDelay: axiosRetry.exponentialDelay
});

axios.post('https://api.example.com/data', {
  title: 'Test',
  body: 'Content'
})
.then(response => console.log(response.data));

When to use: When you need interceptors, automatic transforms, or widespread community support. Not the fastest option, but the most feature-rich.

4. Undici (Fastest Native Option)

Undici is maintained by the Node.js team and powers the native fetch implementation. It's significantly faster than Axios—up to 3x in some benchmarks.

const { request } = require('undici');

async function makeRequest() {
  const { statusCode, body } = await request('https://api.github.com/users/github', {
    method: 'GET',
    headers: {
      'User-Agent': 'Node.js Client'
    }
  });
  
  const data = await body.json();
  console.log(data);
}

makeRequest();

The API returns both the status code and body separately. You still need to call .json() on the body, similar to fetch.

For connection pooling (performance boost):

const { Pool } = require('undici');

const pool = new Pool('https://api.github.com', {
  connections: 10
});

async function makePooledRequest() {
  const { body } = await pool.request({
    path: '/users/github',
    method: 'GET'
  });
  
  const data = await body.json();
  console.log(data);
}

The pool reuses connections, reducing handshake overhead. This is massive for high-throughput applications hitting the same host repeatedly.

When to use: Performance-critical applications, especially when making many requests to the same host. Connection pooling makes it shine.

5. Got (Feature-Rich Alternative)

Got is a powerful HTTP client built specifically for Node.js. It supports retry logic, hooks, pagination helpers, and HTTP/2 out of the box. Note that Got v12 and later are ESM-only; the require() calls below assume got@11.

const got = require('got');

(async () => {
  try {
    const response = await got('https://api.github.com/users/github', {
      headers: {
        'User-Agent': 'Node.js Client'
      },
      responseType: 'json'
    });
    
    console.log(response.body);
  } catch (error) {
    console.error(error.response?.statusCode, error.message);
  }
})();

Setting responseType: 'json' automatically parses the response. Got throws on HTTP errors by default, unlike fetch.

With automatic retries and timeout:

const response = await got('https://api.example.com/data', {
  retry: {
    limit: 3,
    statusCodes: [408, 413, 429, 500, 502, 503, 504]
  },
  timeout: {
    request: 10000
  }
});

When to use: Node.js-only projects needing advanced features like streams, pagination, or fine-grained retry control. Heavier than Undici but more batteries-included.

6. Native HTTP/2 Module

HTTP/2 allows multiplexing—sending multiple requests over a single TCP connection. Node's http2 module lets you leverage this.

const http2 = require('http2');

const client = http2.connect('https://nghttp2.org');

const req = client.request({ 
  ':path': '/' 
});

HTTP/2 uses pseudo-headers (prefixed with :) for method, path, and scheme; :method defaults to GET when omitted, as it is here. The syntax differs from HTTP/1.1.

req.on('response', (headers) => {
  console.log(headers[':status']);
});

let data = '';
req.on('data', (chunk) => {
  data += chunk;
});

req.on('end', () => {
  console.log(data);
  client.close();
});

req.end();

Always close the client when done to free up resources. HTTP/2 connections are persistent by design.

When to use: High-performance scenarios with many requests to HTTP/2-enabled servers. Great for gRPC clients or modern API gateways.

7. Superagent (Browser + Node.js)

Superagent works in both browsers and Node.js. Its chaining API feels intuitive, and it supports plugins for things like caching or mocking.

const superagent = require('superagent');

superagent
  .get('https://api.github.com/users/github')
  .set('User-Agent', 'Node.js Client')
  .then(res => {
    console.log(res.body);
  })
  .catch(err => {
    console.error(err.message);
  });

The .set() method chains header configuration. Superagent automatically parses JSON responses.

For POST with form data:

superagent
  .post('https://api.example.com/users')
  .send({ name: 'John', email: 'john@example.com' })
  .set('Content-Type', 'application/json')
  .then(res => console.log(res.body));

When to use: Isomorphic applications sharing code between browser and server. Good if you like jQuery-style chaining syntax.

8. Node-fetch (Legacy Polyfill)

Before Node 18, node-fetch was the go-to polyfill for the Fetch API. If you're stuck on older Node versions, this is your best bet.

const fetch = require('node-fetch');

async function getData() {
  const response = await fetch('https://api.github.com/users/github');
  const data = await response.json();
  console.log(data);
}

The API is identical to browser fetch. However, node-fetch v3+ is ESM-only, which can cause issues in CommonJS projects.
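
If you do need v3 from CommonJS, the node-fetch README suggests a lazy dynamic-import wrapper:

```javascript
// Loads the ESM-only node-fetch v3 on first use, from CommonJS
const fetch = (...args) =>
  import('node-fetch').then(({ default: f }) => f(...args));

async function getUser() {
  const response = await fetch('https://api.github.com/users/github');
  const data = await response.json();
  console.log(data.login);
}
```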

For CommonJS compatibility, stick with v2:

// package.json
{
  "dependencies": {
    "node-fetch": "^2.6.7"
  }
}

When to use: Node versions before 18, or when you need a fetch polyfill for consistent API across environments.

9. Request with Custom Agent (Connection Pooling)

The request library is deprecated, but its pattern of using custom agents is still relevant. You can apply it with the native https module.

const https = require('https');

const agent = new https.Agent({
  keepAlive: true,
  maxSockets: 50,
  maxFreeSockets: 10,
  timeout: 60000
});

This agent maintains a pool of sockets. maxSockets limits concurrent connections per host. keepAlive reuses connections, dramatically reducing latency.

const options = {
  hostname: 'api.github.com',
  path: '/users/github',
  method: 'GET',
  agent: agent
};

https.request(options, (res) => {
  // Handle response
}).end();

When to use: High-frequency API calls where connection reuse matters. This pattern works with Axios too—just pass the agent in the config.

10. Handling Self-Signed Certificates

In dev environments, you often hit APIs with self-signed SSL certificates. Node rejects these by default. Here's how to bypass verification.

const https = require('https');

const options = {
  hostname: 'localhost',
  port: 8443,
  path: '/api/data',
  method: 'GET',
  rejectUnauthorized: false  // Skip certificate validation
};

https.request(options, (res) => {
  // Process response
}).end();

Setting rejectUnauthorized: false disables certificate checks. Never use this in production—it exposes you to man-in-the-middle attacks.

For a safer approach with custom CA:

const fs = require('fs');

const ca = fs.readFileSync('./custom-ca.pem');

const options = {
  hostname: 'localhost',
  port: 8443,
  path: '/api/data',
  method: 'GET',
  ca: ca  // Trust this specific CA
};

This trusts only your custom certificate authority, maintaining security while supporting self-signed certs.

When to use: Local development against HTTPS services. Absolutely avoid rejectUnauthorized: false in production code.

11. Streaming Large Responses

For massive responses (like file downloads), streaming prevents memory bloat. Pipe the response directly to a writable stream.

const https = require('https');
const fs = require('fs');

const file = fs.createWriteStream('downloaded.json');

https.get('https://api.example.com/large-dataset', (response) => {
  response.pipe(file);
  
  file.on('finish', () => {
    file.close();
    console.log('Download complete');
  });
});

The pipe() method streams data in chunks without loading everything into memory. This scales to multi-gigabyte responses.

With progress tracking:

// Inside the https.get callback, before calling pipe():
let receivedBytes = 0;
const totalBytes = parseInt(response.headers['content-length'], 10);

response.on('data', (chunk) => {
  receivedBytes += chunk.length;
  const percent = ((receivedBytes / totalBytes) * 100).toFixed(2);
  console.log(`Downloaded: ${percent}%`);
});

response.pipe(file);

When to use: File downloads, large API responses, or any scenario where memory usage is a concern.

12. HTTP/2 with Got (Unified Interface)

Got supports HTTP/2 while maintaining the same API as HTTP/1.1. No need to rewrite your code.

const got = require('got');

(async () => {
  const response = await got('https://nghttp2.org/', {
    http2: true
  });

  console.log('Protocol:', response.httpVersion);
  console.log('Body length:', response.body.length);
})();

Enabling http2: true negotiates HTTP/2 if the server supports it, falling back to HTTP/1.1 otherwise. Check response.httpVersion to confirm which protocol was used.

For multiplexing multiple requests:

const requests = [
  got('https://nghttp2.org/page1', { http2: true }),
  got('https://nghttp2.org/page2', { http2: true }),
  got('https://nghttp2.org/page3', { http2: true })
];

const responses = await Promise.all(requests);

All three requests use the same TCP connection, reducing handshake overhead.

When to use: Modern APIs that support HTTP/2. Got makes the transition seamless without changing your request logic.

Wrapping Up

You've now got 12 ways to make HTTPS requests in Node.js, from bare-metal native modules to high-level libraries. Here's a quick decision tree:

  • Need maximum performance? Use Undici with connection pooling
  • Want cross-platform compatibility? Native fetch (Node 18+)
  • Building a library? Stick with the native https module
  • Need interceptors and transforms? Axios remains king
  • Working with HTTP/2? Got or native http2 module
  • Streaming large files? Native https with pipes

The fastest way to level up? Pick one method and master it. Then experiment with others as your needs evolve. Most projects only need 2-3 of these patterns.