You’re staring at a terminal—on your Linux server or MacBook—needing to download a file or hit an API endpoint. Two tools come to mind: wget and curl. You’ve seen both used interchangeably in Stack Overflow answers, but they aren’t the same thing. Pick the wrong one and you’ll waste time fighting flags that don’t exist or re-downloading files you thought you had.
Here’s exactly how they differ, when to use each, and why your operating system might already be nudging you toward one of them.
The Personalities: “Downloader” vs “Conversation Starter”
wget: The Relentless File Hunter
wget’s entire personality is: “Give me stuff and I’ll save it to disk. End of story.”
- It’s non-interactive: you start it, it goes off, it finishes
- It handles retries, network hiccups, and recursive downloads really well
- It assumes you want a file, not a conversation
Very often, a wget command is “fire and forget”:
wget https://example.com/big-dataset.csv
Start it in a tmux session on the server (or in the background, as below), close your laptop lid, catch a train, and wget just… keeps going.
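Backgrounding is built in, too. A small sketch, reusing the example URL from above:
# -b detaches wget and writes progress to wget-log in the current directory
wget -b https://example.com/big-dataset.csv
# check on it later
tail -f wget-log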
curl: The Network Multi-Tool
curl’s vibe is: “Tell me exactly what you want to say to that server.”
- It talks to many protocols: HTTP(S), FTP, SFTP, etc.
- It’s great for APIs, auth, custom headers, debugging HTTP
- It can download files but doesn’t assume that’s the only goal
A “curl conversation” might look like:
curl -X POST https://api.example.com/login \
-H "Content-Type: application/json" \
-d '{"user": "alice", "password": "secret"}'
Here you’re not just “downloading a page.” You’re logging in, exchanging JSON, and reading the response.
How They Behave by Default
Here’s how they act the moment you run them:
wget behavior:
- Input: URL
- Action: Fetch content
- Output: Save directly as file
curl behavior:
- Input: URL
- Action: Fetch content
- Output: Print to terminal (stdout)
Side by side:
# wget: saves file directly (e.g., index.html)
wget https://example.com
# curl: prints HTML in your terminal
curl https://example.com
# curl "wget-style": save to file with original name
curl -O https://example.com/index.html
# curl follow redirects and save:
curl -L -O https://example.com/file.zip
So one default is “write to disk,” and the other is “show me the raw data.”
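Because curl writes to stdout, plain shell redirection also works as a save mechanism, with no flags at all:
# the "save it" behavior, done by the shell instead of a curl flag
curl https://example.com > index.html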
Real-Life Scenario 1: You’re Downloading a Huge Backup
You’re on a server, you need last night’s backup from object storage or a static server.
Goal: Download backup-2026-04-30.tar.gz safely, even if the connection is flaky.
Using wget
wget https://downloads.example.com/backups/backup-2026-04-30.tar.gz
Why this shines:
- Automatic retries: wget will try again when things fail (tunable, as sketched below)
- Resuming is baked in:
wget -c https://downloads.example.com/backups/backup-2026-04-30.tar.gz
The -c flag (“continue”) will resume a partially downloaded file.
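Putting retries and resuming together, a small sketch with example values for the retry knobs:
# resume a partial download, retry up to 10 times, waiting up to 30s between attempts
wget -c --tries=10 --waitretry=30 \
  https://downloads.example.com/backups/backup-2026-04-30.tar.gz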
Using curl
curl can also resume:
curl -C - -O https://downloads.example.com/backups/backup-2026-04-30.tar.gz
Here:
- -C - tells curl to auto-detect where to resume from
- -O saves the file with the original name
But you had to know about these flags; wget’s mental model is more “I’m a downloader; resuming/retrying is my thing.”
Real-Life Scenario 2: You’re Debugging a Failing API Call
Now imagine you’re building a frontend or mobile app, and a login request keeps failing. You want to reproduce it from the terminal, tweak headers, and inspect responses.
This is curl’s home turf.
Seeing the Raw HTTP “Conversation”
curl -v https://api.example.com/login
The -v (verbose) flag shows:
- Request headers sent
- Response headers received
- Status code, redirects, etc.
A lighter option than -v is -i:
curl -i https://api.example.com/login
-i includes the HTTP response headers in the output, so you see things like:
HTTP/1.1 401 Unauthorized
Date: Thu, 30 Apr 2026 08:45:00 GMT
Content-Type: application/json
...
You can’t comfortably do this kind of deep HTTP debugging with wget. It’s possible to see some headers, but curl is basically the de facto standard tool for this.
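For completeness, wget can at least show you the server's response headers with -S (--server-response), often paired with --spider so nothing gets downloaded:
# print the response headers wget receives, without saving the body
wget -S --spider https://api.example.com/login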
Making a Realistic POST Request
Let’s say your app sends JSON with a bearer token:
curl -X POST https://api.example.com/items \
-H "Authorization: Bearer YOUR_TOKEN_HERE" \
-H "Content-Type: application/json" \
-d '{"name": "New item", "priority": "high"}'
You can tweak headers and body over and over until it matches exactly what your app should send.
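Once the body grows beyond a one-liner, it is usually nicer to keep it in a file; curl reads it with the @ syntax (payload.json here is just a stand-in name):
# send the contents of payload.json as the request body
curl -X POST https://api.example.com/items \
  -H "Authorization: Bearer YOUR_TOKEN_HERE" \
  -H "Content-Type: application/json" \
  -d @payload.json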
Real-Life Scenario 3: Mirroring a Documentation Site
Imagine you’re going on a flight with no Wi-Fi, but you need your company’s internal docs website offline.
Goal: Download the website recursively and rewrite links so it works locally.
This is exactly what wget was born for:
wget \
--mirror \
--convert-links \
--adjust-extension \
--page-requisites \
--no-parent \
https://docs.internal.example.com/
What each flag does:
- --mirror: Turn on recursion with sensible defaults
- --convert-links: Rewrite links for local browsing
- --adjust-extension: Add .html where needed
- --page-requisites: Grab images, CSS, JS
- --no-parent: Don't go above the start directory
After this finishes, wget leaves behind a directory named after the host (docs.internal.example.com/); open the index.html inside it in your browser and you have a near-complete offline copy.
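If the site is large or shared with colleagues, it is worth throttling the mirror a little; the values below are just reasonable-sounding examples:
# same mirror, but pause between requests and cap bandwidth
wget --mirror --convert-links --adjust-extension --page-requisites --no-parent \
  --wait=1 --limit-rate=500k \
  https://docs.internal.example.com/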
curl has no native concept of recursion or site mirroring. You’d need to:
- Parse HTML
- Extract links
- Follow them manually in a script
- Download assets individually
That’s pain. For site mirroring, wget is the tool.
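To make the gap concrete, even step one (extracting links) is something you would have to hand-roll around curl, roughly like this:
# fetch one page and list the URLs it links to; recursion, assets,
# and link rewriting would all still be up to you
curl -s https://docs.internal.example.com/ | grep -oE 'href="[^"]*"'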
A Mental Model: “Tree vs Pipe”
A helpful way to remember the difference:
- wget thinks in trees of files and links
- curl thinks in streams of data (pipes)
You point wget at a URL and it can say: “I see references to CSS, JS, and images. I’ll go get those, too, forming a little tree of related files.”
You point curl at a URL and it says: “Here is the raw stuff that came back from the server; I’ll push it through stdout, and you decide what to do.”
For example:
# Using curl in a Unix pipeline:
curl https://example.com/data.json | jq '.items[] | select(.active == true)'
Here curl is part of a data pipeline rather than a “download manager.”
Authentication, Cookies, and Headers
Both tools can handle logins, cookies, and headers, but curl's syntax makes it feel more like an HTTP lab bench.
Example: Download a file requiring a login cookie
Step 1: Log in and save cookies (curl):
curl -c cookies.txt -d "user=alice&password=secret" https://example.com/login
-c cookies.txt saves cookies to a file
Step 2: Use the cookie to download a protected file:
curl -b cookies.txt -O https://example.com/secret-report.pdf
-b cookies.txt sends cookies from that file
wget can also do cookies:
# --keep-session-cookies is needed when the login cookie is a session cookie (no expiry)
wget --save-cookies cookies.txt --keep-session-cookies --post-data="user=alice&password=secret" https://example.com/login
wget --load-cookies cookies.txt https://example.com/secret-report.pdf
But when things get more exotic—custom auth schemes, tricky headers, specific HTTP methods—curl generally feels clearer and more flexible.
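For instance, combining basic auth, a custom header, and a non-GET method is one unremarkable curl line (the credentials, header, and endpoint are placeholders):
# PUT with basic auth, a custom header, and a JSON body
curl -X PUT -u alice:secret \
  -H "X-Request-Id: debug-123" \
  -H "Content-Type: application/json" \
  -d '{"status": "archived"}' \
  https://api.example.com/items/42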
Exit Codes and Error Handling
Both tools can fail loudly, but the defaults differ: wget exits non-zero when the server returns an HTTP error such as a 404, while curl treats a completed transfer as success even if the response was an error page, unless you add -f (--fail).
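In a script, that difference matters; a minimal sketch, assuming a hypothetical nightly-build URL:
# -f makes curl exit non-zero on HTTP errors; -sS hides the progress bar but keeps error messages
if ! curl -fsS -O https://example.com/nightly-build.tar.gz; then
  echo "download failed" >&2
  exit 1
fi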
Protocols and Flexibility
curl supports a larger basket of protocols out of the box: HTTP, HTTPS, FTP, FTPS, SCP, SFTP, IMAP, POP3, SMTP, and more (depending on how it’s built).
wget focuses on fewer protocols but covers the big ones very well: HTTP, HTTPS, FTP, and FTPS.
If you think “I need to talk to this weird old FTP or mail server or SFTP host,” curl is very often the more modern, flexible choice.
Example: Upload a file over SFTP with curl
curl -T ./report.pdf sftp://user@server.example.com/home/user/reports/
wget is mostly about downloading; curl treats uploading as a first-class operation.
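Plain HTTPS uploads are just as direct, for example a multipart form upload (the form field name and endpoint are assumptions):
# multipart/form-data upload, like submitting an HTML file-upload form
curl -F "file=@./report.pdf" https://uploads.example.com/api/files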
Quick Translation Cheat Sheet
| Task | wget | curl |
|---|---|---|
| Download single file | wget URL | curl -O URL |
| Download with follow-redirects | wget URL | curl -L -O URL |
| Download with custom name | wget -O custom.txt URL | curl -L URL -o custom.txt |
| Resume interrupted download | wget -c URL | curl -C - -O URL |
| Mirror a website for offline use | wget --mirror --convert-links --page-requisites --no-parent URL | No short equivalent; you’d script it |
| Send POST request with form data | wget --post-data="user=alice&password=secret" URL | curl -d "user=alice&password=secret" URL |
| Send JSON to an API | wget possible but awkward | curl -X POST -H "Content-Type: application/json" -d '{"a":1}' URL |
Which Should You Use Day-to-Day?
Here’s a realistic rule of thumb that aligns with how people actually work:
- If your task is "download a file or a whole site, maybe resume, maybe run in the background" → reach for wget
- If your task is "interact with web APIs, debug HTTP, send/inspect headers, do POST/PUT/PATCH, upload data, or work with many protocols" → reach for curl
- On macOS: curl is built in; wget often requires Homebrew. So for many Mac users, curl becomes the default even for plain downloading.
In other words:
- wget is like a dedicated download manager for the terminal
- curl is like a network Swiss army knife that also happens to download things
A Real-World Story to Cement It
Picture this:
You’re the DevOps person on your team. A new developer joins and asks:
“How do I grab yesterday’s production logs from that storage bucket?”
You might answer:
wget https://logs.example.com/2026-04-29/app.log.gz
“This just saves the file. If your connection dies, run the same command with -c to resume.”
Later the same day, your backend API starts returning 500 errors.
Now your task is different:
“Why is the /checkout endpoint failing only for certain payloads?”
You immediately reach for curl:
curl -v -X POST https://api.example.com/checkout \
-H "Content-Type: application/json" \
-d '{"userId": 42, "items": [{"sku": "SHIRT-XL", "qty": 1}]}'
You inspect headers, see response bodies, compare working vs failing payloads. Here, wget would feel like using a hammer to type an email.
Summary
Both tools are excellent; they just occupy different “mental slots”:
- Think “files and sites” → wget
- Think “requests and responses” → curl
If you keep that distinction in your head, it becomes much easier to decide which one to grab, and the flags and features will start to make sense around that core idea.
Remember: Both can do most tasks. These recommendations are about ergonomics and typical usage, not strict capability limits.
Frequently Asked Questions
Can curl do everything wget can do?
Mostly, yes—but not always as conveniently. curl can download files, resume transfers, and handle cookies. However, wget’s built-in recursive downloading and site mirroring (--mirror, --convert-links) have no direct curl equivalent. You’d need to script around curl to replicate that functionality.
Does macOS come with wget?
No. macOS ships with curl pre-installed. To use wget on macOS, you typically install it via Homebrew: brew install wget. This is why many Mac developers default to curl even for simple downloads.
How do I resume a failed download with curl?
Use the -C - flag combined with -O:
curl -C - -O https://example.com/large-file.zip
The -C - tells curl to auto-detect the resume point. Note: The server must support partial content (HTTP Range requests) for this to work.
Can wget send POST requests?
Yes, but the syntax is less intuitive than curl:
wget --post-data='{"key":"value"}' --header='Content-Type: application/json' https://api.example.com/endpoint
For API work, curl’s -d and -H flags feel more natural.
Why does curl print to the terminal instead of saving?
This is by design. curl outputs to stdout so you can pipe it to other tools (like jq, grep, or less). To save to a file, use -O (save with original filename) or -o filename (save with custom name).
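A quick side-by-side of the two habits:
# use curl as part of a pipe...
curl -s https://api.example.com/items | jq '.'
# ...or save the response to a file instead
curl -o items.json https://api.example.com/items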
What to Read Next
- curl vs wcurl: When to Use the Wrapper vs Raw curl — Ubuntu replaced wget with wcurl. Learn when to use the wrapper vs raw curl
- How to Set Up SSH Keys for GitHub — Secure your Git workflow with key-based authentication
- Tmux Cheatsheet — Master terminal multiplexing for better CLI productivity