Procedure
The curl and wget commands are essential tools for transferring and downloading data via the web terminal. Used to interact with servers over various protocols such as HTTP, HTTPS and FTP, these commands allow users to efficiently download files, manage API sessions and automate network interactions directly from the command line.
Before reading this documentation, we invite you to access your hosting's Web Terminal.
The curl (Client URL) command is a powerful and versatile command-line tool used to transfer data to or from a server over various network protocols. curl is frequently used to automate interactions with web servers and other systems accessed via protocols such as HTTP, HTTPS, FTP, SCP and many others.
Basic functions
curl can download data (such as web pages or files) from a specified URL.
Basic syntax
curl [options] [URL...]
Options modify curl's behaviour, for example to set HTTP headers, choose HTTP methods, manage cookies, and so on.
Common options
-o [file]: Saves the result of the request to the specified file instead of displaying it in the console.
-X [command]: Specifies the HTTP method to use (for example GET, POST, PUT or DELETE).
-H [header]: Adds a specific header to the HTTP request.
-d [data]: Sends data (such as form parameters) as the body of a POST request.
-u [user:password]: Transmits authentication credentials.
--cookie [name=value]: Sends a cookie with the request.
--insecure: Tells curl to ignore SSL certificate errors.
Examples of use
Download a file and save it locally:
curl -o example.html http://example.com
This downloads the example.com home page and saves it as example.html.
Sending a POST request with data:
curl -X POST -d "login=example&password=secret" https://example.com/login
Sends a POST request to example.com/login with form data.
Use HTTP authentication:
curl -u username:password http://example.com
Sends a request to example.com with the necessary HTTP authentication.
Send a custom header in a request:
curl -H "X-My-Header: 123" http://example.com
Sends a request to example.com with a custom HTTP header.
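Send a cookie and ignore certificate errors (a minimal sketch combining the --cookie and --insecure options listed above; the cookie name and value are placeholders):
curl --insecure --cookie "session=abc123" https://example.com
Sends the session cookie with the request and skips SSL certificate verification, which can help when testing a site that uses a self-signed certificate.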
Typical use
curl is extremely useful for testing and interacting with web APIs, automating data downloads, and scripting complex network interactions. It is also often used in deployment and continuous-integration scripts to automate interactions with online services.
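As an illustration of that kind of automation, here is a minimal sketch, assuming a hypothetical API endpoint at https://example.com/api/status; the check fails if the server does not answer with HTTP 200:
# Hypothetical health check for a CI script: fetch only the HTTP status code
status=$(curl -s -o /dev/null -w "%{http_code}" https://example.com/api/status)
if [ "$status" != "200" ]; then
  echo "API check failed with status $status"
  exit 1
fi
echo "API is up"
Here the -s option silences progress output, -o /dev/null discards the response body, and -w "%{http_code}" prints only the status code, which makes the result easy to test in a script.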
The wget command is a command-line utility for downloading files from the web. It supports the HTTP, HTTPS and FTP protocols, and is known for its ability to download files recursively and resume interrupted downloads.
Basic functions
wget lets you download files from the Internet directly to a server or local computer.
wget can resume unfinished downloads, which is particularly useful for large files.
Basic syntax
wget [options] [URL]
Options modify wget's behaviour.
Common options
-O [file]: Specifies the name under which the downloaded file will be saved.
-c: Resumes the download of a partially downloaded file.
-r: Recursive download, often used to mirror complete sites.
--no-check-certificate: Ignores SSL/TLS certificate verification errors, useful for HTTPS connections with self-signed certificates.
-P [directory]: Specifies the download directory for files.
--http-user=[user] and --http-password=[password]: For sites that require authentication.
Examples of use
Downloading a single file:
wget http://example.com/fichier.pdf
Downloads the .pdf file from example.com and saves it in the current directory.
Download and save with a specific name:
wget -O myfile.pdf http://example.com/fichier.pdf
Downloads the .pdf file and saves it as myfile.pdf.
Resume an interrupted download:
wget -c http://example.com/fichier.pdf
Attempts to resume the download of the .pdf file if it was previously interrupted.
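Download into a specific directory over HTTPS with a self-signed certificate (a minimal sketch combining the -P and --no-check-certificate options listed above; the downloads directory is only an example):
wget --no-check-certificate -P downloads https://example.com/fichier.pdf
Saves the .pdf file in the downloads directory and skips SSL/TLS certificate verification.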
Recursive downloading of a website:
wget -r -l 3 http://example.com
Recursively downloads the content of example.com, following links to a depth of 3.
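Download from a site that requires HTTP authentication (a minimal sketch using the --http-user and --http-password options listed above, with placeholder credentials):
wget --http-user=username --http-password=password http://example.com/fichier.pdf
Downloads the .pdf file after authenticating with the supplied username and password.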
Typical use
wget is frequently used for automatic downloads in scripts, for mirroring websites, or for downloading large volumes of data from online servers. Thanks to its power and simplicity, wget is a popular tool for managing downloads in environments without a graphical interface.
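As a sketch of that kind of scripted use, assuming a hypothetical urls.txt file containing one URL per line, the loop below downloads (or resumes) each file into a downloads directory:
# Hypothetical batch download: resume each file into the downloads directory
while read -r url; do
  wget -c -P downloads "$url"
done < urls.txt
Thanks to the -c option, an interrupted run can simply be launched again and will pick up where it left off.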
You now know how to:
Use the curl command to perform a variety of data transfers and automate interactions with APIs or web services.
Master the main features of curl, including downloading and uploading data, managing HTTP headers and using options to refine your requests.
Use the wget command to download files from the Internet, manage recursive downloads and resume interrupted downloads.
Take advantage of wget for maximum efficiency in environments without a graphical interface.
Whether you need to retrieve data, manage API sessions or set up automation systems, the curl and wget commands are real assets in the toolbox of any developer or system administrator 🛠️. Thanks to this article, you'll now be well equipped to take advantage of these powerful command-line tools and make your interactions with the web more efficient and fluid 💻.
We hope this guide has been useful and has inspired you to explore the possibilities of curl and wget further. If you have any questions, tips to share, or simply want to tell us about your experience with these tools, please don't hesitate to leave a comment below 👇. Thanks for reading and see you soon for more learning!