- GNU Wget README
-
-GNU Wget is a free network utility to retrieve files from the World
-Wide Web using HTTP and FTP, the two most widely used Internet
-protocols. It works non-interactively, thus enabling work in the
-background, after having logged off.
-
-The recursive retrieval of HTML pages, as well as FTP sites is
-supported -- you can use Wget to make mirrors of archives and home
-pages, or traverse the web like a WWW robot (Wget understands
-/robots.txt).
-
-Wget works exceedingly well on slow or unstable connections, keeping
-getting the document until it is fully retrieved. Re-getting files
-from where it left off works on servers (both HTTP and FTP) that
-support it. Matching of wildcards and recursive mirroring of
-directories are available when retrieving via FTP. Both HTTP and FTP
-retrievals can be time-stamped, thus Wget can see if the remote file
-has changed since last retrieval and automatically retrieve the new
-version if it has.
-
-Wget supports proxy servers, which can lighten the network load, speed
-up retrieval and provide access behind firewalls. If you are behind a
-firewall that requires the use of a socks style gateway, you can get
-the socks library and compile wget with support for socks.
+GNU Wget
+========
+ Current Web home: http://www.gnu.org/software/wget/
+
+GNU Wget is a free utility for non-interactive download of files from
+the Web. It supports HTTP, HTTPS, and FTP protocols, as well as
+retrieval through HTTP proxies.
+
+It can follow links in HTML pages and create local versions of remote
+web sites, fully recreating the directory structure of the original
+site. This is sometimes referred to as "recursive downloading."
+While doing that, Wget respects the Robot Exclusion Standard
+(/robots.txt). Wget can be instructed to convert the links in
+downloaded HTML files to point to the local files, for offline
+viewing.
+
+Recursive downloading also works with FTP, where Wget can retrieve a
+hierarchy of directories and files.
+
+With both HTTP and FTP, Wget can check whether a remote file has
+changed on the server since the previous run, and only download the
+newer files.
+
+Wget has been designed for robustness over slow or unstable network
+connections; if a download fails due to a network problem, it will
+keep retrying until the whole file has been retrieved. If the server
+supports regetting, Wget will instruct the server to continue the
+download from where it left off.
+
+If you are behind a firewall that requires the use of a SOCKS-style
+gateway, you can get the SOCKS library and compile Wget with SOCKS
+support.
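
As an illustrative sketch of the features described above (the URLs and
filenames are hypothetical; the flags shown are standard Wget options),
typical invocations look like:

```shell
# Recursively download a site and convert links for offline viewing
# (-r = recursive, -k = convert links; hypothetical URL):
wget -r -k http://www.example.com/

# Download only if the remote file is newer than the local copy
# (timestamping, -N):
wget -N ftp://ftp.example.org/pub/file.tar.gz

# Continue a partially retrieved file from where it left off (-c),
# on servers that support regetting:
wget -c http://www.example.com/big-archive.tar.gz
```

These are sketches, not a substitute for the full option reference in
the Wget manual.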