The *nix commands curl and wget are useful for accessing URLs without resorting to a browser. Both commands allow you to transfer data from a network server, with curl being the more robust of the two. You could use either of them to automate downloads from various servers.

The curl command

As mentioned, the curl command allows you to transfer data from a network server, but it also enables you to send data to a network server. In addition to HTTP, curl supports other protocols, including HTTPS, FTP, POP3, SMTP, and Telnet. Administrators commonly rely on curl to interact with APIs using the DELETE, GET, POST, and PUT methods.
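As a minimal sketch of such API calls, the function below combines curl's -X (request method), -H (header), and -d (request body) options. The endpoint, paths, and JSON payloads are hypothetical placeholders, not a real service:

```shell
#!/bin/sh
# Sketch: the four common REST methods with curl.
# The api.example.com endpoint is a placeholder, not a real service.
# Set CURL=echo for a dry run that prints the arguments instead of connecting.
api_call() {
    method=$1; path=$2; body=${3:-}
    if [ -n "$body" ]; then
        # Requests with a body send JSON and declare the content type
        "${CURL:-curl}" -s -X "$method" -H "Content-Type: application/json" \
            -d "$body" "https://api.example.com/v1${path}"
    else
        "${CURL:-curl}" -s -X "$method" "https://api.example.com/v1${path}"
    fi
}

# api_call GET    /items/42                              # read a resource
# api_call POST   /items '{"name":"widget","qty":3}'     # create a resource
# api_call PUT    /items/42 '{"name":"widget","qty":5}'  # replace it
# api_call DELETE /items/42                              # delete it
```

Running with CURL=echo first is an easy way to inspect the exact arguments before pointing the function at a real API.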

The syntax for curl is fairly straightforward at first glance. Here is an example:

$ curl http://www.example.com/help.txt

curl Options

You can supply various options to your command syntax:

curl [options] [url]

It is the options that make curl so robust. The following are some of the available options, along with examples of their use.

-a, --append

When uploading a file with -T (--upload-file), this option appends to the target file instead of overwriting it (FTP, SFTP).

$ curl -T file.txt --append ftp://ftp.example.com/file.txt

--connect-timeout

The --connect-timeout option sets the maximum time, in seconds, that curl may spend establishing its connection to the remote server. It is handy for keeping curl from hanging indefinitely on an unreachable host while still allowing slow connections up to the limit you choose.

$ curl --connect-timeout 600 http://www.example.com/

--dns-servers

This option allows you to list DNS servers curl should use instead of the system default. This list can be handy when troubleshooting DNS issues or if you need to resolve an address against a specific nameserver.

$ curl --dns-servers 8.8.8.8 http://www.example.com/

--http3

You can specifically tell curl to use the HTTP/3 protocol to connect to the host and port given in an HTTPS URL (HTTP/3 only works over HTTPS, and your curl build must include HTTP/3 support). The --http2 and --http1.1 options work the same way and are useful for verifying which protocol versions a web server supports.

$ curl --http3 https://www.example.com:8080/

--output

If you need to retrieve a file from a remote server via a URL, --output is an easy way to save the file locally.

$ curl http://www.example.com/help.txt --output file.txt

--progress-bar

This option replaces curl's default progress meter with a simple progress bar. Combine it with --output so the downloaded data goes to a file and the progress display stays readable in the terminal.

$ curl --progress-bar http://www.example.com/help.txt --output file.txt

--sslv2

As with the HTTP versions, you can tell curl to use a specific SSL protocol version for the connection; in this case, we are specifying version 2. --ssl requests that SSL/TLS be used, and --sslv3 specifies SSL version 3. Note: SSLv2 and SSLv3 are considered legacy by the maintainers; the options remain, but modern curl builds typically no longer support these protocol versions.

$ curl --sslv2 https://www.example.com/

--verbose

The --verbose option with curl is useful for debugging because it displays the details of what happens during the call to the URL, including the request and response headers.

$ curl --verbose http://www.example.com

The wget command

Unlike curl, the wget command is solely for the retrieval of information from a remote server. By default, the information received is saved with the same name as in the provided URL.

Here is an example of the basic wget syntax:

$ wget http://www.example.com/help.txt

wget Options

Like curl, you can supply various options to your wget command syntax:

wget [option] [url]

--dns-servers=ADDRESSES

You can specify one or more DNS servers for wget to use when accessing a remote server. The syntax differs from curl's, however: the option and the nameserver addresses are joined with an =.

$ wget --dns-servers=8.8.8.8 http://www.example.com

-O

To save a file under a different name when using wget, use the --output-document option, or more simply, -O.

$ wget http://www.example.com/help.txt -O file.txt

--progress=type

With wget, you can supply a type (dot or bar) to determine the ASCII visual of the progress indicator. If no type is specified, wget defaults to bar when output goes to a terminal and to dot when output is redirected, such as to a log file.

$ wget --progress=dot http://www.example.com

Wrap up

The curl and wget commands can be very useful when added to scripts to automatically download RPM packages or other files. This post touches on only some of the most common things these commands can do. Check the man pages for a complete list of options available for both curl and wget.
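As a sketch of that kind of automation, the function below downloads every URL listed in a text file, one per line. The list file name and its contents are hypothetical; curl's -f, -s, -S, and -O flags make it fail on HTTP errors, silence the progress meter, still report errors, and save each file under its remote name:

```shell
#!/bin/sh
# Download every URL listed in a file (one URL per line) with curl.
# Set CURL=echo for a dry run that prints what would be fetched.
download_all() {
    list=$1
    while IFS= read -r url; do
        [ -z "$url" ] && continue                  # skip blank lines
        "${CURL:-curl}" -fsSO "$url" || return 1   # stop on the first failure
    done < "$list"
}

# Example, assuming packages.txt holds one URL per line:
# download_all packages.txt
```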



About the author

Amy Marrich is a Principal Technical Marketing Manager at Red Hat. She previously worked at a small open source e-assessment company in Luxembourg, where she was the Open Source Community and Global Training Manager.  Previously, she was the OpenStack Instructor at Linux Academy and a Linux System Engineer on the Platform Engineering Cloud Operations team at Rackspace. She currently serves on the OpenStack Board, is an active member of the Openstack Ansible project, and was previously the chair of the OpenStack User Committee. Amy spends her free time competing in performance events (agility, FASt Cat, and dock diving) with her Dalmatians and competing in dressage with her Connemara pony.
