How to download a webpage from the Linux terminal?

The Linux command line provides powerful tools for downloading web content for offline browsing. The two most popular are wget and cURL, which can fetch individual webpages or entire websites over a variety of protocols.

Using wget

wget is a widely used download utility that supports the HTTP, HTTPS, and FTP protocols. It can mirror entire websites recursively and can download through proxies.

Check if wget is Available

ubuntu@ubuntu:~$ which wget ; echo $?

Running the above command gives the following output:

/usr/bin/wget
0

If the exit code ($?) is 1, wget is not installed; install it with the following command:

ubuntu@ubuntu:~$ sudo apt-get install wget
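A script can make the same availability check without parsing the output of which; command -v is the POSIX way to test for a tool. A minimal sketch (it only prints a hint rather than installing anything):

```shell
# report whether wget is on the PATH; command -v exits non-zero if it is not
if command -v wget >/dev/null 2>&1; then
    echo "wget found at: $(command -v wget)"
else
    echo "wget missing - install it with: sudo apt-get install wget"
fi
```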

Download a Webpage

Use wget to download a single webpage or an entire website:

# Download a single webpage
wget https://en.wikipedia.org/wiki/Linux_distribution

# Download an entire website: -r recurses into links, -np stays out of
# parent directories, -k rewrites links for offline viewing
wget -r -np -k abc.com

The output shows the download progress; the file is saved in the current directory:

ubuntu@ubuntu:~$ wget https://en.wikipedia.org/wiki/Linux_distribution
--2019-12-29 23:31:41-- https://en.wikipedia.org/wiki/Linux_distribution
Resolving en.wikipedia.org (en.wikipedia.org)... 103.102.166.224
Connecting to en.wikipedia.org (en.wikipedia.org)|103.102.166.224|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 216878 (212K) [text/html]
Saving to: 'Linux_distribution'

Linux_distribution 100%[==================>] 211.79K 1.00MB/s in 0.2s

2019-12-29 23:31:42 (1.00 MB/s) - 'Linux_distribution' saved [216878/216878]
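Note that wget saved the page as 'Linux_distribution' with no .html extension: by default the output name is simply the last component of the URL path. That rule can be sketched with shell parameter expansion:

```shell
url="https://en.wikipedia.org/wiki/Linux_distribution"
# strip everything up to the last '/' - this is the name wget saves under
echo "${url##*/}"   # prints: Linux_distribution
```

To choose your own file name instead, pass it with wget's -O option (e.g. wget -O page.html <url>).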

Using cURL

cURL is a command-line data-transfer tool that supports HTTP, HTTPS, FTP, FTPS, Telnet, IMAP, and many other protocols. It supports a wider range of protocols than wget, but it cannot download recursively.

Check if cURL is Available

ubuntu@ubuntu:~$ which curl ; echo $?

Running the above command gives the following output:

1

An exit code of 1 means cURL is not installed. Install it using the following command:

ubuntu@ubuntu:~$ sudo apt-get install curl

The installation output shows the package being downloaded and configured:

[sudo] password for ubuntu:
Reading package lists... Done
...
Get:1 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 curl amd64 7.47.0-1ubuntu2.14 [139 kB]
Fetched 139 kB in 21s (6,518 B/s)
...
Setting up curl (7.47.0-1ubuntu2.14) ...

Download with cURL

Use cURL with the -O flag to download the page and save it under its remote file name:

curl -O https://en.wikipedia.org/wiki/Linux_distribution

The output shows the download progress and transfer statistics:

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  211k  100  211k    0     0   312k      0 --:--:-- --:--:-- --:--:--  311k
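Without -O, curl writes the response body to stdout; -o lets you pick an output name of your choice, and -L makes curl follow redirects. A small sketch of -o using a file:// URL, so it runs without network access (curl also speaks file://):

```shell
# create a small local page to act as the download source
printf '<html>hello</html>\n' > /tmp/demo.html

# -s silences the progress meter, -o names the output file
curl -s -o /tmp/copy.html file:///tmp/demo.html

cat /tmp/copy.html   # prints: <html>hello</html>
```

For a real webpage you would typically combine the flags, e.g. curl -sSL -o page.html <url>.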

Key Differences

Feature               wget                          cURL
Recursive download    Yes (-r flag)                 No
Protocol support      HTTP, HTTPS, FTP              HTTP, HTTPS, FTP, FTPS, Telnet, IMAP
Best for              Downloading websites          API testing and data transfer
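When a script must run on machines where either tool may be missing, it can fall back from one to the other. A minimal sketch, where the helper name `fetch` is our own, not a standard command:

```shell
# fetch URL FILE - download URL to FILE with wget if available, else curl
# (helper name `fetch` is illustrative, not a standard command)
fetch() {
    url="$1"
    out="$2"
    if command -v wget >/dev/null 2>&1; then
        wget -q -O "$out" "$url"
    elif command -v curl >/dev/null 2>&1; then
        curl -fsSL -o "$out" "$url"
    else
        echo "fetch: neither wget nor curl is installed" >&2
        return 1
    fi
}

# Usage:
# fetch https://en.wikipedia.org/wiki/Linux_distribution Linux_distribution
```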

Conclusion

Both wget and cURL are powerful tools for downloading web content on Linux. Use wget for downloading entire websites recursively, and cURL for more complex data transfer scenarios and API interactions.

Updated on: 2026-03-15T17:23:33+05:30
