wget Commands Reference

Essential wget commands for downloading files, testing HTTPS, and inspecting certificates

📥 Basic Downloads

Download a file

wget https://example.com/file.zip

Downloads file with original name to current directory.

Save with different filename

wget -O myfile.zip https://example.com/file.zip

Use -O - to output to stdout.

Download to specific directory

wget -P /path/to/directory https://example.com/file.zip

Continue partial download

wget -c https://example.com/largefile.zip

Resumes interrupted downloads. Very useful for large files.

Download in background

wget -b https://example.com/largefile.zip

Runs in background, logs to wget-log.

Quiet mode (suppress output)

wget -q https://example.com/file.zip

Use -nv for non-verbose (shows errors only).

🔒 Certificate & HTTPS Testing

Test HTTPS connection (verbose)

wget --spider --server-response https://example.com

--spider checks without downloading, --server-response shows headers.

Show certificate information

wget --debug --spider https://example.com 2>&1 | grep -i certificate

Surfaces certificate details exchanged during the TLS handshake. Note that --debug only produces output if wget was compiled with debug support.

Ignore certificate errors (UNSAFE - testing only)

wget --no-check-certificate https://self-signed.example.com

⚠️ Warning: Never use in production. Only for testing self-signed certificates.

Use custom CA certificate

wget --ca-certificate=/path/to/ca-cert.pem https://example.com

Validate server certificate against specific CA bundle.

Use client certificate (mTLS)

wget --certificate=/path/to/client.pem \
  --private-key=/path/to/client-key.pem \
  https://example.com

For mutual TLS authentication.

Specify TLS version

wget --secure-protocol=TLSv1_2 https://example.com

Options: auto, TLSv1, TLSv1_1, TLSv1_2, TLSv1_3, and PFS (restricts the connection to Perfect Forward Secrecy cipher suites).

📋 Multiple & Batch Downloads

Download list of URLs from file

wget -i urls.txt

File contains one URL per line. Useful for batch downloads.
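A minimal sketch of preparing such a list before running `wget -i` (the URLs and filenames are made-up examples):

```shell
# Create a URL list, one URL per line, for wget -i.
cat > urls.txt <<'EOF'
https://example.com/file1.zip
https://example.com/file2.zip
https://example.com/docs/manual.pdf
EOF

# Then download the whole batch into ./downloads:
# wget -i urls.txt -P ./downloads
wc -l < urls.txt
```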

Mirror entire website

wget --mirror --convert-links --page-requisites https://example.com

Creates local browsable copy with all assets and converted links.

Recursive download with depth limit

wget -r -l 2 https://example.com/docs/

Download recursively up to 2 levels deep. Use -l 0 for unlimited depth.

Download specific file types only

wget -r -A pdf,zip https://example.com/downloads/

Accept only PDF and ZIP files. Use -R to reject file types.

Download with wait between requests

wget -w 2 -r https://example.com

Wait 2 seconds between requests. Polite scraping to avoid overloading servers.

Random wait between requests

wget --random-wait -r https://example.com

Random delay between 0.5 and 1.5 times the wait value. More human-like behavior.

🔑 Authentication

HTTP Basic authentication

wget --user=username --password=password https://example.com

Use --ask-password to prompt securely.

FTP authentication

wget --ftp-user=username --ftp-password=password ftp://ftp.example.com/file.zip

Custom header authentication

wget --header="Authorization: Bearer TOKEN" https://api.example.com/data
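To avoid hard-coding the token on the command line, it can be passed through a variable; a small sketch (API_TOKEN and its value are placeholders):

```shell
# API_TOKEN would normally come from the environment or a secrets
# store; the literal value here is only for illustration.
API_TOKEN="example-token"
AUTH_HEADER="Authorization: Bearer ${API_TOKEN}"
echo "$AUTH_HEADER"
# prints: Authorization: Bearer example-token

# wget --header="$AUTH_HEADER" https://api.example.com/data
```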

Load cookies from file

wget --load-cookies cookies.txt https://example.com/protected

Save and load cookies (session)

wget --save-cookies cookies.txt --keep-session-cookies https://example.com/login

🎛️ Request Customization

Custom User-Agent

wget --user-agent="Mozilla/5.0 (Custom Agent)" https://example.com

Some servers block default wget user agent.

Set HTTP referer

wget --referer="https://example.com" https://example.com/download

Add custom headers

wget --header="X-Custom-Header: value" \
  --header="Accept: application/json" \
  https://api.example.com

POST request with data

wget --post-data="username=user&password=pass" https://example.com/login

Use --post-file to send file contents.
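Note that wget sends the --post-data string verbatim, without URL-encoding it. A sketch of encoding a value first, shelling out to python3 (the credentials are placeholders):

```shell
# URL-encode a form value before handing it to --post-data;
# wget itself performs no encoding.
RAW_PASSWORD='p@ss word'
ENCODED=$(python3 -c 'import urllib.parse, sys; print(urllib.parse.quote(sys.argv[1], safe=""))' "$RAW_PASSWORD")
echo "username=user&password=${ENCODED}"
# prints: username=user&password=p%40ss%20word

# wget --post-data="username=user&password=${ENCODED}" https://example.com/login
```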

HTTP method override

wget --method=PUT --body-data='{"key":"value"}' https://api.example.com/resource

Supports GET, HEAD, POST, PUT, DELETE. Requires wget 1.15+.

⏱️ Speed & Performance

Limit download speed

wget --limit-rate=200k https://example.com/largefile.zip

Use the k (kilobytes) or m (megabytes) suffix; decimal values such as 2.5k are also accepted. Prevents bandwidth saturation.

Set timeout values

wget --timeout=30 --tries=3 https://example.com

Applies a 30-second timeout to DNS lookup, connect, and read; makes up to 3 attempts on failure.

Show progress bar

wget --progress=bar:force https://example.com/file.zip

Types: bar (default on a terminal) and dot (default when logging to a file). Use bar:force to keep the bar when output is redirected, or bar:noscroll to stop long filenames from scrolling.

Report transfer speed in bits

wget --report-speed=bits https://example.com/file.zip

Displays transfer speed in bits per second instead of bytes.

Retry on connection failure

wget --retry-connrefused --waitretry=5 https://example.com

Retry even if connection refused, wait 5 seconds between retries.
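wget's exit status distinguishes failure classes (documented under EXIT STATUS in the manual), which makes scripted retries easier to reason about. A sketch with a hypothetical helper, classify_wget_exit:

```shell
# Map wget's documented exit codes to readable outcomes.
# (0 success, 4 network failure, 5 SSL verification failure,
#  6 authentication failure, 8 server error response.)
classify_wget_exit() {
  case "$1" in
    0) echo "success" ;;
    4) echo "network failure" ;;
    5) echo "SSL verification failure" ;;
    6) echo "authentication failure" ;;
    8) echo "server error response" ;;
    *) echo "other error (code $1)" ;;
  esac
}

# Real usage would be:
#   wget --retry-connrefused --waitretry=5 --tries=3 https://example.com
#   classify_wget_exit $?
classify_wget_exit 8
# prints: server error response
```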

⚙️ Advanced Options

Use proxy server

wget -e use_proxy=yes -e https_proxy=proxy.example.com:8080 https://example.com

Use https_proxy for HTTPS URLs and http_proxy for plain-HTTP URLs; wget also honors the standard http_proxy and https_proxy environment variables.

Download only if newer than local file

wget -N https://example.com/file.zip

Timestamping mode - only downloads if server file is newer.

Timestamping without If-Modified-Since

wget --timestamping --no-if-modified-since https://example.com/file.zip

Compares timestamps via a preliminary HEAD request instead of sending an If-Modified-Since header. Useful when a server mishandles conditional requests.

Exclude specific domains

wget -r --exclude-domains ads.example.com,tracker.example.com https://example.com

Follow only relative links

wget -r --relative https://example.com

Follows only relative links, skipping absolute links even when they point to the same host. Keeps recursive downloads within one section of a site.

Create log file

wget -o download.log https://example.com/file.zip

Use -a to append to existing log.

Read configuration from file

wget --config=/path/to/wgetrc https://example.com

Default locations: /etc/wgetrc, ~/.wgetrc.
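A minimal ~/.wgetrc sketch, with illustrative values, for making the options above persistent (each line is a wgetrc `command = value` pair):

```
# Sample ~/.wgetrc -- values are illustrative
tries = 3
timeout = 30
continue = on
check_certificate = on
user_agent = Mozilla/5.0 (Custom Agent)
```

Command-line flags override wgetrc settings, so the file is a safe place for defaults.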

🔄 wget vs curl

When to use wget:

  • Recursive downloads and website mirroring
  • Resuming interrupted downloads (-c)
  • Background downloads with automatic retry
  • Batch downloads from URL list
  • Timestamping and conditional downloads

When to use curl:

  • API testing and development
  • Complex HTTP requests (PUT, DELETE, PATCH)
  • More detailed certificate inspection
  • Support for more protocols (SCP, SFTP, IMAP, etc.)
  • Fine-grained control over HTTP/2 and HTTP/3

Quick comparison:

Feature              wget               curl
------------------   ----------------   -------------------
Recursive download   ✅ Built-in        ❌ Not supported
Resume downloads     -c                 -C -
Output to stdout     -O -               ✅ Default behavior
API testing          ⚠️ Limited         ✅ Excellent
Multiple protocols   HTTP(S), FTP(S)    20+ protocols