Web scraping at scale requires proxies to avoid IP bans and access geo-restricted content. Without proxies, a Selenium scraper can trip anti-bot systems after only a handful of requests from the same IP.
This guide shows you how to configure proxies in Selenium across Chrome, Firefox, and Edge browsers, including authenticated proxies and rotation strategies.
What Are Proxies in Selenium?
Proxies in Selenium act as intermediaries between your browser automation script and target websites. They route your requests through different IP addresses, making your scraper appear as multiple users from various locations. This helps you avoid rate limiting and bypass geographic restrictions without getting blocked.
Selenium supports HTTP, HTTPS, and SOCKS5 proxies through ChromeOptions, FirefoxOptions, and EdgeOptions. The configuration varies slightly between browsers but follows the same core principle.
Why Use Proxies With Selenium
Proxies solve three critical problems in browser automation.
Avoid IP Bans
Websites track request frequency per IP address. Make too many requests and you'll get blocked.
Proxies distribute your requests across multiple IPs. This makes your automation look like organic traffic from different users.
Bypass Geo-Restrictions
Many websites serve different content based on location: e-commerce sites show different prices, and streaming platforms restrict content by region.
Proxies let you test how your application behaves for users in different countries. Essential for localization testing and competitive analysis.
Increase Scraping Speed
Using multiple proxies simultaneously means you can run parallel Selenium instances. Each instance uses a different proxy.
This multiplies your scraping throughput without triggering anti-bot measures.
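As a minimal sketch of that pattern (the proxy addresses and URLs below are placeholders), each worker thread launches its own Chrome instance bound to its own proxy:
# Sketch: parallel scraping with one proxy per worker thread.
# The proxy addresses and URLs are placeholders.
from concurrent.futures import ThreadPoolExecutor
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
PROXIES = ["185.199.229.156:7492", "194.126.37.94:8080"]
URLS = ["https://httpbin.org/ip", "https://httpbin.org/headers"]
def scrape(url, proxy):
    # Each worker gets its own browser bound to its own proxy
    options = Options()
    options.add_argument(f'--proxy-server={proxy}')
    driver = webdriver.Chrome(options=options)
    try:
        driver.get(url)
        return driver.page_source
    finally:
        driver.quit()
with ThreadPoolExecutor(max_workers=len(PROXIES)) as pool:
    for page in pool.map(scrape, URLS, PROXIES):
        print(page[:80])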
Set Up an Unauthenticated Proxy in Selenium Chrome
Unauthenticated proxies don't require a username and password. They're the simplest to configure, but access is typically controlled only by IP whitelisting on the provider's side.
Here's how to add a proxy to Chrome in Selenium:
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
PROXY = "185.199.229.156:7492"
chrome_options = Options()
chrome_options.add_argument(f'--proxy-server={PROXY}')
driver = webdriver.Chrome(options=chrome_options)
driver.get("https://httpbin.org/ip")
print(driver.page_source)
driver.quit()
The --proxy-server argument tells Chrome to route all traffic through the specified proxy.
You can verify it's working by visiting httpbin.org/ip, which returns your current IP address in JSON format.
Important: Replace the proxy IP with a working proxy from your provider. Free proxies often fail or get blocked quickly.
Set Up an Unauthenticated Proxy in Selenium Firefox
Firefox requires a different configuration approach using the Proxy class.
from selenium import webdriver
from selenium.webdriver.common.proxy import Proxy, ProxyType
from selenium.webdriver.firefox.options import Options
PROXY = "185.199.229.156:7492"
proxy = Proxy({
    'proxyType': ProxyType.MANUAL,
    'httpProxy': PROXY,
    'ftpProxy': PROXY,
    'sslProxy': PROXY
})
firefox_options = Options()
firefox_options.proxy = proxy  # Selenium 4 attaches the Proxy object via options
driver = webdriver.Firefox(options=firefox_options)
driver.get("https://httpbin.org/ip")
print(driver.page_source)
driver.quit()
The Proxy class lets you specify separate proxy servers for HTTP, FTP, and SSL (HTTPS) connections.
Setting all three to the same proxy ensures consistent routing regardless of protocol.
ProxyType.MANUAL tells Selenium you're manually configuring the proxy rather than using system settings.
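If you prefer setting fields individually instead of passing a dictionary, the Proxy class also exposes attributes. A minimal equivalent sketch, using the same placeholder address:
# Sketch: equivalent attribute-style configuration (placeholder proxy address).
from selenium import webdriver
from selenium.webdriver.common.proxy import Proxy, ProxyType
from selenium.webdriver.firefox.options import Options
PROXY = "185.199.229.156:7492"
proxy = Proxy()
proxy.proxy_type = ProxyType.MANUAL
proxy.http_proxy = PROXY
proxy.ssl_proxy = PROXY
firefox_options = Options()
firefox_options.proxy = proxy
driver = webdriver.Firefox(options=firefox_options)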
Set Up an Authenticated Proxy in Selenium
Most commercial proxy providers require authentication with a username and password. Standard Selenium can't inject those credentials for you.
One option is Selenium Wire, a Python library that extends Selenium and routes browser traffic through its own local proxy so it can add the credentials.
Install Selenium Wire:
pip install selenium-wire
Configure authenticated proxy:
from seleniumwire import webdriver
PROXY_USER = "username"
PROXY_PASS = "password"
PROXY_HOST = "proxy.provider.com"
PROXY_PORT = "8080"
proxy_options = {
    'proxy': {
        'http': f'http://{PROXY_USER}:{PROXY_PASS}@{PROXY_HOST}:{PROXY_PORT}',
        'https': f'https://{PROXY_USER}:{PROXY_PASS}@{PROXY_HOST}:{PROXY_PORT}',
        'no_proxy': 'localhost,127.0.0.1'
    }
}
driver = webdriver.Chrome(seleniumwire_options=proxy_options)
driver.get("https://httpbin.org/ip")
print(driver.page_source)
driver.quit()
Selenium Wire handles the authentication automatically by intercepting requests and adding credentials.
The no_proxy setting ensures local requests don't go through the proxy. This speeds up localhost debugging.
Note: Selenium Wire adds some performance overhead (often around 10-15%) because every request passes through its local proxy. That's usually a fair trade for authenticated proxy support.
Set Up a Proxy in Selenium Edge
Edge is Chromium-based, so the configuration mirrors Chrome; only the Options import and the driver class change.
from selenium import webdriver
from selenium.webdriver.edge.options import Options
PROXY = "185.199.229.156:7492"
edge_options = Options()
edge_options.add_argument(f'--proxy-server={PROXY}')
driver = webdriver.Edge(options=edge_options)
driver.get("https://httpbin.org/ip")
print(driver.page_source)
driver.quit()
For authenticated proxies in Edge, use the Selenium Wire method shown earlier. Just replace webdriver.Chrome with webdriver.Edge.
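A sketch of that swap, assuming your installed Selenium Wire version ships the Edge wrapper and using placeholder credentials:
# Sketch: authenticated proxy in Edge via Selenium Wire.
# Assumes seleniumwire.webdriver exposes Edge in your installed version;
# host, port, and credentials are placeholders.
from seleniumwire import webdriver
proxy_options = {
    'proxy': {
        'http': 'http://username:password@proxy.provider.com:8080',
        'https': 'https://username:password@proxy.provider.com:8080',
        'no_proxy': 'localhost,127.0.0.1'
    }
}
driver = webdriver.Edge(seleniumwire_options=proxy_options)
driver.get("https://httpbin.org/ip")
print(driver.page_source)
driver.quit()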
Configure SOCKS5 Proxy in Selenium
SOCKS5 proxies handle more protocols than HTTP proxies. They're useful for applications beyond web scraping.
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
SOCKS_PROXY = "185.199.229.156:7492"
chrome_options = Options()
chrome_options.add_argument(f'--proxy-server=socks5://{SOCKS_PROXY}')
driver = webdriver.Chrome(options=chrome_options)
driver.get("https://httpbin.org/ip")
print(driver.page_source)
driver.quit()
The only difference from the earlier Chrome example is the socks5:// prefix on the proxy address.
SOCKS5 proxies support UDP traffic and offer better performance for certain use cases. HTTP/HTTPS proxies work fine for most web scraping.
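Firefox handles SOCKS through the same Proxy class, using the SOCKS-specific fields. A minimal sketch with the same placeholder address:
# Sketch: SOCKS5 proxy in Firefox via the Proxy class (placeholder address).
from selenium import webdriver
from selenium.webdriver.common.proxy import Proxy, ProxyType
from selenium.webdriver.firefox.options import Options
SOCKS_PROXY = "185.199.229.156:7492"
proxy = Proxy()
proxy.proxy_type = ProxyType.MANUAL
proxy.socks_proxy = SOCKS_PROXY
proxy.socks_version = 5  # SOCKS5 rather than SOCKS4
firefox_options = Options()
firefox_options.proxy = proxy
driver = webdriver.Firefox(options=firefox_options)
driver.get("https://httpbin.org/ip")
print(driver.page_source)
driver.quit()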
Implement Proxy Rotation in Selenium
Using a single proxy for multiple requests increases detection risk. Proxy rotation switches IPs between requests.
Here's a simple rotation strategy:
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
import random
PROXY_LIST = [
    "185.199.229.156:7492",
    "194.126.37.94:8080",
    "178.79.172.11:3128",
    "165.232.73.180:8080"
]
def create_driver_with_random_proxy():
    proxy = random.choice(PROXY_LIST)
    chrome_options = Options()
    chrome_options.add_argument(f'--proxy-server={proxy}')
    return webdriver.Chrome(options=chrome_options)
# Scrape multiple pages with rotating proxies
urls = [
    "https://example.com/page1",
    "https://example.com/page2",
    "https://example.com/page3"
]
for url in urls:
    driver = create_driver_with_random_proxy()
    driver.get(url)
    # Extract data here
    print(f"Scraped {url}")
    driver.quit()
This approach creates a new browser instance with a different proxy for each request.
The downside is performance overhead from starting new browsers repeatedly.
Alternative: keep the browser open and rotate server-side
For better performance with authenticated rotating proxies, use a proxy service that handles rotation server-side. You configure one endpoint and the provider switches the exit IP automatically, as sketched below.
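A sketch of that setup with Selenium Wire; the rotating endpoint, port, and credentials below are placeholders for whatever your provider gives you:
# Sketch: one rotating endpoint, with IP rotation handled by the provider.
# Host, port, and credentials are placeholders.
from seleniumwire import webdriver
proxy_options = {
    'proxy': {
        'http': 'http://username:password@rotating.provider.com:8000',
        'https': 'https://username:password@rotating.provider.com:8000',
        'no_proxy': 'localhost,127.0.0.1'
    }
}
driver = webdriver.Chrome(seleniumwire_options=proxy_options)
for url in ["https://httpbin.org/ip", "https://httpbin.org/ip"]:
    driver.get(url)  # each request can exit from a different IP
    print(driver.page_source)
driver.quit()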
Handle Proxy Authentication With Chrome Extension
Another method for authenticated proxies is creating a Chrome extension. It's more complex, but it avoids the Selenium Wire dependency.
Create manifest.json:
{
    "version": "1.0.0",
    "manifest_version": 3,
    "name": "Chrome Proxy Auth",
    "permissions": [
        "proxy",
        "tabs",
        "webRequest",
        "webRequestAuthProvider"
    ],
    "host_permissions": [
        "<all_urls>"
    ],
    "background": {
        "service_worker": "background.js"
    }
}
Create background.js:
const config = {
    mode: "fixed_servers",
    rules: {
        singleProxy: {
            scheme: "http",
            host: "proxy.provider.com",
            port: 8080
        }
    }
};
chrome.proxy.settings.set({value: config, scope: "regular"});
chrome.webRequest.onAuthRequired.addListener(
    (details) => {
        return {
            authCredentials: {
                username: "your_username",
                password: "your_password"
            }
        };
    },
    {urls: ["<all_urls>"]},
    ["blocking"]
);
Load extension in Selenium:
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
chrome_options = Options()
chrome_options.add_extension("proxy_auth_extension.zip")
driver = webdriver.Chrome(options=chrome_options)
driver.get("https://httpbin.org/ip")
Zip the manifest.json and background.js into proxy_auth_extension.zip before running.
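If you'd rather build the archive from your script, a small helper does it; the file names match the ones created above:
# Sketch: package manifest.json and background.js into the extension zip.
import zipfile
with zipfile.ZipFile("proxy_auth_extension.zip", "w") as zf:
    zf.write("manifest.json")
    zf.write("background.js")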
This method requires more setup but gives you full control over authentication logic.
Debug Common Proxy Issues in Selenium
407 Proxy Authentication Required
This error means your credentials are incorrect or the proxy doesn't recognize them.
Verify username and password are correct. Check if your IP needs whitelisting with the proxy provider.
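To rule out Selenium entirely, you can also test the credentials with a quick Python check (a sketch using the requests library; host, port, and credentials are placeholders):
# Sketch: verify authenticated proxy credentials outside the browser.
# Requires the requests library; values below are placeholders.
import requests
proxies = {
    "http": "http://username:password@proxy.provider.com:8080",
    "https": "http://username:password@proxy.provider.com:8080",
}
print(requests.get("https://httpbin.org/ip", proxies=proxies, timeout=10).text)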
ERR_PROXY_CONNECTION_FAILED
The proxy server is unreachable. Either the IP is wrong or the proxy is offline.
Test the proxy with curl before using it in Selenium:
curl -x http://185.199.229.156:7492 https://httpbin.org/ip
If curl fails, the proxy itself has issues.
ERR_TUNNEL_CONNECTION_FAILED
This happens with HTTPS sites when the proxy doesn't support CONNECT tunneling.
Switch to a different proxy provider or use HTTP-only scraping if possible.
Proxy Rotation Not Working
Make sure you're creating a new driver instance for each proxy change. Selenium doesn't support changing proxies mid-session.
# Wrong - the proxy is fixed at startup and won't change
driver = webdriver.Chrome(options=chrome_options)
for url in urls:
    # Changing the proxy here has no effect on the running browser
    driver.get(url)
# Correct - fresh options and a new driver for each proxy
for url in urls:
    chrome_options = Options()
    chrome_options.add_argument(f'--proxy-server={random.choice(PROXY_LIST)}')
    driver = webdriver.Chrome(options=chrome_options)
    driver.get(url)
    driver.quit()
Compare Proxy Configuration Methods
| Method | Authentication | Setup Complexity | Performance | Best For |
|---|---|---|---|---|
| ChromeOptions | No | Simple | Fast | Free proxies |
| Selenium Wire | Yes | Medium | Moderate | Most use cases |
| Chrome Extension | Yes | Complex | Fast | Production systems |
| Server-side Rotation | Yes | Simple | Fastest | High-volume scraping |
Selenium Wire offers the best balance of features and ease of use. Choose Chrome extension for production environments where performance matters.
Verify Your Proxy Setup
Always test your proxy configuration before running large scraping jobs.
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
def test_proxy(proxy_address):
    chrome_options = Options()
    chrome_options.add_argument(f'--proxy-server={proxy_address}')
    chrome_options.add_argument('--headless')
    driver = webdriver.Chrome(options=chrome_options)
    try:
        driver.get("https://httpbin.org/ip")
        response = driver.page_source
        if proxy_address.split(':')[0] in response:
            print(f"✓ Proxy {proxy_address} working")
            return True
        else:
            print(f"✗ Proxy {proxy_address} failed")
            return False
    except Exception as e:
        print(f"✗ Proxy {proxy_address} error: {e}")
        return False
    finally:
        driver.quit()
# Test list of proxies
proxies = [
    "185.199.229.156:7492",
    "194.126.37.94:8080"
]
working_proxies = [p for p in proxies if test_proxy(p)]
print(f"\nWorking proxies: {len(working_proxies)}/{len(proxies)}")
This script checks each proxy and returns only working ones. Run this before starting your scraping job to avoid wasting time on dead proxies.
Conclusion
Setting up proxies in Selenium requires different approaches for Chrome, Firefox, and Edge. Unauthenticated proxies work with standard ChromeOptions, while authenticated proxies need Selenium Wire or Chrome extensions.
Proxy rotation prevents IP bans by distributing requests across multiple IPs. Test your proxies before running production scrapes to avoid failures.
For high-volume scraping, use commercial residential proxies with server-side rotation. They handle the complexity while you focus on extracting data.
FAQ
Can I use free proxies with Selenium?
Yes, but free proxies are unreliable and often blocked. They work for testing but fail quickly under load. Commercial residential proxies typically advertise 99%+ uptime.
How many proxies do I need for web scraping?
Start with 10-20 proxies for small projects. Scale to 100+ for large-scale scraping. More proxies mean lower request frequency per IP.
Does Selenium support proxy authentication natively?
No. You need a helper such as Selenium Wire (a Python library) or a custom Chrome extension to handle authenticated proxies in Selenium.
What's the difference between HTTP and SOCKS5 proxies?
HTTP proxies only handle web traffic. SOCKS5 supports all protocols including FTP and email. Use HTTP/HTTPS for web scraping unless you need protocol flexibility.
How do I rotate proxies without restarting the browser?
Not with standard Selenium options: proxy settings are fixed when the browser starts. Use server-side rotating proxies or accept the overhead of restarting browsers.