The *nix commands curl and wget are useful for accessing URLs without resorting to a browser. Both commands let you transfer data from a network server, with curl being the more capable of the two. You can use either of them to automate downloads from various servers.

The curl command

As mentioned, the curl command allows you to transfer data from a network server, but it also enables you to move data to a network server. In addition to HTTP, curl supports other protocols, including HTTPS, FTP, POP3, SMTP, and Telnet. Administrators commonly rely on curl to interact with APIs using the DELETE, GET, POST, and PUT methods.
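For example, a typical API interaction might look like the following. The api.example.com host, the /users path, and the JSON body are hypothetical stand-ins; substitute the details of your own API:

```shell
# -X selects the HTTP method (GET is the default), -H sets a request
# header, and -d supplies a request body.
payload='{"name": "jdoe"}'

# Create a resource:
curl -X POST -H 'Content-Type: application/json' -d "$payload" \
     https://api.example.com/users

# Read it back:
curl https://api.example.com/users/42

# Replace it:
curl -X PUT -H 'Content-Type: application/json' -d "$payload" \
     https://api.example.com/users/42

# Delete it (placeholder host, so this fails unless a real API is substituted):
curl -X DELETE https://api.example.com/users/42 || true
```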

The syntax for curl is fairly straightforward at first glance. Here is an example:

$ curl http://www.example.com/help.txt

curl Options

You can supply various options to your command syntax:

curl [options] [url]

It is the options that make curl so robust. The following are some of the available options and examples of their use.

-a, --append

When uploading a file (FTP, SFTP), this option allows you to append to the target file instead of overwriting it. It is used together with -T (--upload-file), which performs the upload:

$ curl -T file.txt --append ftp://ftp.example.com/file.txt

--connect-timeout

The --connect-timeout option sets the maximum time, in seconds, that curl spends establishing its connection to the remote server. If the connection is not made within that window, curl gives up. Raise the value to tolerate slow servers, or lower it to fail fast instead of waiting.

$ curl --connect-timeout 600 http://www.example.com/

--dns-servers

This option allows you to list DNS servers curl should use instead of the system default. This list can be handy when troubleshooting DNS issues or if you need to resolve an address against a specific nameserver.

$ curl --dns-servers 8.8.8.8 http://www.example.com/

--http3

You can specifically tell curl to use the HTTP/3 protocol to connect to the host and port provided in an https URL (HTTP/3 requires TLS, so a plain http URL won't work). --http2 and --http1.1 function in the same way and can be used to verify which protocol versions a web server supports.

$ curl --http3 https://www.example.com:8080/

--output

If you need to retrieve a file from a remote server via a URL, --output is an easy way to save the file locally.

$ curl http://www.example.com/help.txt --output file.txt

--progress-bar

This option displays the progress of the file transfer when combined with the --output option.

$ curl --progress-bar http://www.example.com/help.txt --output file.txt

--sslv2

As with the HTTP version options, you can tell curl which SSL/TLS option to use for the connection; in this case, we are specifying SSL version 2. --ssl requests that SSL/TLS be attempted (useful with protocols such as FTP and POP3), and --sslv3 specifies SSL version 3. Note: SSLv2 and SSLv3 are considered insecure legacy protocols by the maintainer, and modern curl builds reject them, so these options are mainly useful for testing old servers.

$ curl --sslv2 https://www.example.com/

--verbose

The --verbose option is useful for debugging because it displays what is going on during the call to the URL, including connection details, request headers, and response headers.

$ curl --verbose http://www.example.com

The wget command

Unlike curl, the wget command is solely for retrieving information from a remote server. By default, the downloaded file is saved under the same name it has in the provided URL.

Here is an example of the basic wget syntax:

$ wget http://www.example.com/help.txt

wget Options

Like curl, you can supply various options to your wget command syntax:

wget [option] [url]

--dns-servers=ADDRESSES

You can specify one or more DNS servers for wget to use when accessing a remote server. The syntax differs from curl's, however: the option and the nameserver addresses are joined with an =.

$ wget --dns-servers=8.8.8.8 http://www.example.com

-O

To save a file with a new name when using wget, utilize the --output-document option, or more simply -O.

$ wget http://www.example.com/help.txt -O file.txt

--progress=type

With wget, you can supply a type (dot or bar) to determine the ASCII visual of the progress indicator. If a type is not specified, wget defaults to bar when writing to a terminal and dot otherwise (for example, when logging to a file).

$ wget --progress=dot http://www.example.com

Wrap up

The curl and wget commands can be very useful when added to scripts to automatically download RPM packages or other files. This post touches on only some of the most common features of these commands. Check the related man pages for a complete list of options available for both curl and wget.
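As a sketch of that idea, the following script downloads every URL listed in a file. The repo.example.com URLs and package names are placeholders, so the transfers themselves will fail outside a real repository:

```shell
#!/bin/sh
# Batch-download every URL listed in urls.txt (placeholder URLs).
cat > urls.txt <<'EOF'
https://repo.example.com/pkg-1.0.rpm
https://repo.example.com/pkg-2.0.rpm
EOF

while read -r url; do
    [ -n "$url" ] || continue        # skip blank lines
    file=${url##*/}                  # keep only the filename part of the URL
    # --fail turns HTTP errors (404, 500, ...) into a nonzero exit status
    curl --fail --silent --connect-timeout 10 --output "$file" "$url" \
        || echo "failed: $url" >&2
done < urls.txt
```

The `${url##*/}` expansion mimics wget's default behavior of saving each file under the name it has in the URL.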



About the author

Amy Marrich is a Principal Technical Marketing Manager at Red Hat. She previously worked at a small open source e-assessment company in Luxembourg, where she was the Open Source Community and Global Training Manager. Before that, she was the OpenStack Instructor at Linux Academy and a Linux System Engineer on the Platform Engineering Cloud Operations team at Rackspace. She currently serves on the OpenStack Board, is an active member of the OpenStack-Ansible project, and was previously the chair of the OpenStack User Committee. Amy spends her free time competing in performance events (agility, FastCAT, and dock diving) with her Dalmatians and competing in dressage with her Connemara pony.
