The *nix commands curl and wget are useful for accessing URLs without resorting to a browser. Both commands allow you to transfer data from a network server, with curl being the more robust of the two. You could use either of them to automate downloads from various servers.
The curl command
As mentioned, the curl command allows you to transfer data from a network server, but it also enables you to move data to a network server. In addition to HTTP, you can use other protocols, including HTTPS, FTP, POP3, SMTP, and Telnet. Administrators commonly rely on curl to interact with APIs using the DELETE, GET, POST, and PUT methods.
The syntax for curl is fairly straightforward at first glance. Here is an example:
$ curl http://www.example.com/help.txt
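As mentioned above, curl is also handy for interacting with APIs. As a simple illustration (the endpoint and JSON payload here are made up for the example), you can combine the --request (-X) method switch with --header and --data to send a POST request:
$ curl --request POST --header "Content-Type: application/json" --data '{"name":"server01"}' https://www.example.com/api/hosts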
curl Options
You can supply various options to your command syntax:
curl [options] [url]
It is the options that make curl so robust. The following are some of the available curl options, along with examples of their use.
-a, --append
When uploading a file with --upload-file (-T), this option tells curl to append to the target file instead of overwriting it (FTP, SFTP).
$ curl --append --upload-file file.txt ftp://ftp.example.com/file.txt
--connect-timeout
The --connect-timeout option sets the maximum time, in seconds, that curl is allowed to spend establishing its connection to the remote server. This option is handy for keeping curl from giving up too quickly on a slow connection, or for capping how long you are willing to let it keep trying.
$ curl --connect-timeout 600 http://www.example.com/
--dns-servers
This option allows you to list DNS servers curl should use instead of the system default. This list can be handy when troubleshooting DNS issues or if you need to resolve an address against a specific nameserver.
$ curl --dns-servers 8.8.8.8 http://www.example.com/
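The option also accepts a comma-separated list if you want to supply more than one nameserver:
$ curl --dns-servers 8.8.8.8,8.8.4.4 http://www.example.com/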
--http3
You can specifically tell curl to use the HTTP/3 protocol to connect to the host and port in the provided URL; note that --http3 only works with HTTPS URLs. --http2 and --http1.1 function in the same way and can be used to verify which protocol versions a web server supports.
$ curl --http3 https://www.example.com:8080/
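For example, to confirm that a server can negotiate HTTP/2:
$ curl --http2 https://www.example.com/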
--output
If you need to retrieve a file from a remote server via a URL, --output is an easy way to save the file locally.
$ curl http://www.example.com/help.txt --output file.txt
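If you would rather keep the name the file has in the URL, similar to wget's default behavior described below, --remote-name (-O) saves it under that name in the current directory:
$ curl --remote-name http://www.example.com/help.txt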
--progress-bar
This option displays a simple progress bar instead of curl's default progress meter during the transfer, for example when saving a file with the --output option.
$ curl --progress-bar http://www.example.com/help.txt --output file.txt
--sslv2
As with the HTTP version options, you can tell curl to use a specific SSL version when it connects; in this case, version 2. --ssl tells curl to attempt SSL/TLS, and --sslv3 specifies SSL version 3. Note: SSLv2 and SSLv3 are considered legacy by the maintainer, though the options are still available.
$ curl --sslv2 https://www.example.com/
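On current systems, you will usually want to require a modern TLS version instead. As one example, the --tlsv1.2 option tells curl to use TLS 1.2 or later for the connection:
$ curl --tlsv1.2 https://www.example.com/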
--verbose
The --verbose option is useful for debugging because it displays what curl is doing during the call to the URL, including connection details and the request and response headers.
$ curl --verbose http://www.example.com
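Because the verbose details go to stderr while the response body goes to stdout, you can discard the body when you only care about the debugging output, for example by sending it to /dev/null:
$ curl --verbose --output /dev/null http://www.example.com/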
The wget command
Unlike curl, the wget command is solely for the retrieval of information from a remote server. By default, the information received is saved with the same name as in the provided URL.
Here is an example of the basic wget syntax:
$ wget http://www.example.com/help.txt
wget Options
Like curl, you can supply various options to your wget command syntax:
wget [option] [url]
--dns-servers=ADDRESSES
You can specify one or more specific DNS servers to use when utilizing wget to access a remote server. The syntax differs from curl's, however: the option and the nameserver addresses are joined with an =.
$ wget --dns-servers=8.8.8.8 http://www.example.com
-O
To save a file with a new name when using wget, utilize the --output-document option, or more simply -O.
$ wget http://www.example.com/help.txt -O file.txt
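You can also pass - as the file name to send the download to standard output, which is useful in pipelines; adding -q suppresses wget's own status messages:
$ wget -qO - http://www.example.com/help.txt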
--progress=type
With wget, you can supply a type (dot or bar) to determine the ASCII visual of the progress indicator. If a type is not specified, wget defaults to bar when writing to a terminal and falls back to dot otherwise, such as when output is redirected to a log file.
$ wget --progress=dot http://www.example.com
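If you want the bar display even when wget's output is redirected, you can force it:
$ wget --progress=bar:force http://www.example.com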
Wrap up
The curl and wget commands can be very useful when added to scripts to automatically download RPM packages or other files. This post only touches on some of the most common features of these commands. Check the related man pages for a complete list of options available for both curl and wget.
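As a minimal sketch of that idea (the package URLs below are placeholders), a short Bash loop can fetch a list of files and report any download that fails:
for url in http://www.example.com/pkg1.rpm http://www.example.com/pkg2.rpm; do
    curl --fail --silent --show-error --remote-name "$url" || echo "download failed: $url" >&2
done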
About the author
Amy Marrich is a Principal Technical Marketing Manager at Red Hat. She previously worked at a small open source e-assessment company in Luxembourg, where she was the Open Source Community and Global Training Manager. Before that, she was the OpenStack Instructor at Linux Academy and a Linux System Engineer on the Platform Engineering Cloud Operations team at Rackspace. She currently serves on the OpenStack Board, is an active member of the OpenStack-Ansible project, and was previously the chair of the OpenStack User Committee. Amy spends her free time competing in performance events (agility, Fast CAT, and dock diving) with her Dalmatians and competing in dressage with her Connemara pony.