@c %**start of header
@setfilename wget.info
@include version.texi
-@set UPDATED Jul 2006
@settitle GNU Wget @value{VERSION} Manual
@c Disable the monstrous rectangles beside overfull hbox-es.
@finalout
* Wget: (wget). The non-interactive network downloader.
@end direntry
-@ifnottex
-This file documents the the GNU Wget utility for downloading network
+@copying
+This file documents the GNU Wget utility for downloading network
data.
@c man begin COPYRIGHT
-Copyright @copyright{} 1996--2006 Free Software Foundation, Inc.
+Copyright @copyright{} 1996, 1997, 1998, 1999, 2000, 2001, 2002,
+2003, 2004, 2005, 2006, 2007, 2008 Free Software Foundation, Inc.
+@iftex
Permission is granted to make and distribute verbatim copies of
this manual provided the copyright notice and this permission notice
are preserved on all copies.
+@end iftex
@ignore
Permission is granted to process this file through TeX and print the
copy of the license is included in the section entitled ``GNU Free
Documentation License''.
@c man end
-@end ifnottex
+@end copying
@titlepage
@title GNU Wget @value{VERSION}
Currently maintained by Micah Cowan <micah@@cowan.name>.
@c man end
@c man begin SEEALSO
-GNU Info entry for @file{wget}.
+This is @strong{not} the complete manual for GNU Wget.
+For more complete information, including more detailed explanations of
+some of the options, and a number of commands available
+for use with @file{.wgetrc} files and the @samp{-e} option, see the GNU
+Info entry for @file{wget}.
@c man end
@end ignore
@page
@vskip 0pt plus 1filll
-Copyright @copyright{} 1996--2006, Free Software Foundation, Inc.
-
-Permission is granted to copy, distribute and/or modify this document
-under the terms of the GNU Free Documentation License, Version 1.2 or
-any later version published by the Free Software Foundation; with no
-Invariant Sections, no Front-Cover Texts, and no Back-Cover Texts. A
-copy of the license is included in the section entitled ``GNU Free
-Documentation License''.
+@insertcopying
@end titlepage
+@contents
+
@ifnottex
@node Top
@top Wget @value{VERSION}
-This manual documents version @value{VERSION} of GNU Wget, the freely
-available utility for network downloads.
-
-Copyright @copyright{} 1996--2006 Free Software Foundation, Inc.
+@insertcopying
+@end ifnottex
@menu
* Overview:: Features of Wget.
* Copying this manual:: You may give out copies of Wget and of this manual.
* Concept Index:: Topics covered by this manual.
@end menu
-@end ifnottex
@node Overview
@chapter Overview
@c man end
@end ignore
@c man begin DESCRIPTION
-Wget can follow links in @sc{html} and @sc{xhtml} pages and create local
-versions of remote web sites, fully recreating the directory structure of
-the original site. This is sometimes referred to as ``recursive
-downloading.'' While doing that, Wget respects the Robot Exclusion
-Standard (@file{/robots.txt}). Wget can be instructed to convert the
-links in downloaded @sc{html} files to the local files for offline
-viewing.
+Wget can follow links in @sc{html}, @sc{xhtml}, and @sc{css} pages, to
+create local versions of remote web sites, fully recreating the
+directory structure of the original site. This is sometimes referred to
+as ``recursive downloading.'' While doing that, Wget respects the Robot
+Exclusion Standard (@file{/robots.txt}). Wget can be instructed to
+convert the links in downloaded files to point at the local files, for
+offline viewing.
@c man end
@item
@item
Wget supports proxy servers, which can lighten the network load, speed
-up retrieval and provide access behind firewalls. However, if you are
-behind a firewall that requires that you use a socks style gateway,
-you can get the socks library and build Wget with support for socks.
-Wget uses the passive @sc{ftp} downloading by default, active @sc{ftp}
-being an option.
+up retrieval and provide access behind firewalls. Wget uses passive
+@sc{ftp} downloading by default, active @sc{ftp} being an option.
@item
Wget supports IP version 6, the next generation of IP. IPv6 is
@end example
The space between the option accepting an argument and the argument may
-be omitted. Instead @samp{-o log} you can write @samp{-olog}.
+be omitted. Instead of @samp{-o log} you can write @samp{-olog}.
You may put several options that do not require arguments together,
like:
@cindex input-file
@item -i @var{file}
@itemx --input-file=@var{file}
-Read @sc{url}s from @var{file}. If @samp{-} is specified as
-@var{file}, @sc{url}s are read from the standard input. (Use
-@samp{./-} to read from a file literally named @samp{-}.)
+Read @sc{url}s from a local or external @var{file}. If @samp{-} is
+specified as @var{file}, @sc{url}s are read from the standard input.
+(Use @samp{./-} to read from a file literally named @samp{-}.)
If this function is used, no @sc{url}s need be present on the command
line. If there are @sc{url}s both on the command line and in an input
href="@var{url}">} to the documents or by specifying
@samp{--base=@var{url}} on the command line.
+If the @var{file} is an external one, the document will be automatically
+treated as @samp{html} if the Content-Type matches @samp{text/html}.
+Furthermore, the @var{file}'s location will be implicitly used as base
+href if none was specified.
+
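+For example, either of these (with a hypothetical @file{urls.txt} and
+host) reads a list of @sc{url}s and fetches each in turn:
+
+@example
+wget -i urls.txt
+wget -i http://www.example.com/urls.html
+@end example
+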
@cindex force html
@item -F
@itemx --force-html
@samp{wget -O - http://foo > file}; @file{file} will be truncated
immediately, and @emph{all} downloaded content will be written there.
+For this reason, @samp{-N} (for timestamp-checking) is not supported
+in combination with @samp{-O}: since @var{file} is always newly
+created, it will always have a very new timestamp. A warning will be
+issued if this combination is used.
+
+Similarly, using @samp{-r} or @samp{-p} with @samp{-O} may not work as
+you expect: Wget won't just download the first file to @var{file} and
+then download the rest to their normal names: @emph{all} downloaded
+content will be placed in @var{file}. This was disabled in version
+1.11, but has been reinstated (with a warning) in 1.11.2, as there are
+cases where this behavior can actually be useful.
+
Note that a combination with @samp{-k} is only permitted when
-downloading a single document, and combination with any of @samp{-r},
-@samp{-p}, or @samp{-N} is not allowed.
+downloading a single document, as in that case it will just convert
+all relative URIs to external ones; @samp{-k} makes no sense for
+multiple URIs when they're all being downloaded to a single file.
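+For example, this (with a placeholder @sc{url}) fetches a single page
+into one file and rewrites its links for local viewing:
+
+@example
+wget -O page.html -k http://www.example.com/index.html
+@end example
+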
@cindex clobbering, file
@cindex downloading multiple times
@cindex timeout, read
@item --read-timeout=@var{seconds}
Set the read (and write) timeout to @var{seconds} seconds. The
-``time'' of this timeout refers @dfn{idle time}: if, at any point in
+``time'' of this timeout refers to @dfn{idle time}: if, at any point in
the download, no data is received for more than the specified number
of seconds, reading fails and the download is restarted. This option
does not directly affect the duration of the entire download.
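+For example, the following (placeholder @sc{url}) restarts the
+download whenever 20 seconds pass without any data arriving:
+
+@example
+wget --read-timeout=20 http://www.example.com/big.iso
+@end example
+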
same time. Neither option is available in Wget compiled without IPv6
support.
-@item --prefer-family=IPv4/IPv6/none
+@item --prefer-family=none/IPv4/IPv6
When given a choice of several addresses, connect to the addresses
-with specified address family first. IPv4 addresses are preferred by
-default.
+with specified address family first. The address order returned by
+DNS is used without change by default.
This avoids spurious errors and connect attempts when accessing hosts
that resolve to both IPv6 and IPv4 addresses from IPv4 networks. For
using the @samp{--ftp-user} and @samp{--ftp-password} options for
@sc{ftp} connections and the @samp{--http-user} and @samp{--http-password}
options for @sc{http} connections.
+
+@item --ask-password
+Prompt for a password for each connection established. Cannot be specified
+when @samp{--password} is being used, because they are mutually exclusive.
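+For example, this prompts at the terminal rather than exposing the
+password on the command line (user name and host are placeholders):
+
+@example
+wget --user=alice --ask-password ftp://ftp.example.com/archive.tar.gz
+@end example
+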
@end table
@node Directory Options
@section Directory Options
-@table @samp
+@table @samp
@item -nd
@itemx --no-directories
Do not create a hierarchy of directories when retrieving recursively.
@section HTTP Options
@table @samp
+@cindex default page name
+@cindex index.html
+@item --default-page=@var{name}
+Use @var{name} as the default file name when it isn't known (i.e., for
+URLs that end in a slash), instead of @file{index.html}.
+
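+For example, this saves the root page of a placeholder host as
+@file{main.html} rather than @file{index.html}:
+
+@example
+wget --default-page=main.html http://www.example.com/
+@end example
+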
@cindex .html extension
@item -E
@itemx --html-extension
version of the file will be saved as @file{@var{X}.orig} (@pxref{Recursive
Retrieval Options}).
+As of version 1.12, Wget will also ensure that any downloaded files of
+type @samp{text/css} end in the suffix @samp{.css}. Obviously, this
+makes the name @samp{--html-extension} misleading; a better name is
+expected to be offered as an alternative in the near future.
+
@cindex http user
@cindex http password
@cindex authentication
@itemx --http-password=@var{password}
Specify the username @var{user} and password @var{password} on an
@sc{http} server. According to the type of the challenge, Wget will
-encode them using either the @code{basic} (insecure) or the
-@code{digest} authentication scheme.
+encode them using either the @code{basic} (insecure),
+the @code{digest}, or the Windows @code{NTLM} authentication scheme.
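+For example (the credentials and host below are placeholders):
+
+@example
+wget --http-user=alice --http-password=secret http://www.example.com/private/
+@end example
+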
Another way to specify username and password is in the @sc{url} itself
(@pxref{URL Format}). Either method reveals your password to anyone who
them (and neither will browsers) and the @file{cookies.txt} file will
be empty. In that case use @samp{--keep-session-cookies} along with
@samp{--save-cookies} to force saving of session cookies.
+
+@cindex Content-Disposition
+@item --content-disposition
+
+If this is set to on, experimental (not fully-functional) support for
+@code{Content-Disposition} headers is enabled. This can currently result in
+extra round-trips to the server for a @code{HEAD} request, and is known
+to suffer from a few bugs, which is why it is not currently enabled by default.
+
+This option is useful for some file-downloading CGI programs that use
+@code{Content-Disposition} headers to describe what the name of a
+downloaded file should be.
+
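+For example, with a placeholder CGI script such as the following, the
+file name suggested by the server is honored instead of
+@file{download.cgi?id=42}:
+
+@example
+wget --content-disposition "http://www.example.com/download.cgi?id=42"
+@end example
+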
+@cindex authentication
+@item --auth-no-challenge
+
+If this option is given, Wget will send Basic HTTP authentication
+information (plaintext username and password) for all requests, just
+like Wget 1.10.2 and prior did by default.
+
+Use of this option is not recommended, and is intended only to support
+some few obscure servers, which never send HTTP authentication
+challenges, but accept unsolicited auth info, say, in addition to
+form-based authentication.
+
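+For example (placeholder credentials and host):
+
+@example
+wget --auth-no-challenge --http-user=alice --http-password=secret \
+     http://www.example.com/
+@end example
+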
@end table
@node HTTPS (SSL/TLS) Options
@item -A @var{acclist} --accept @var{acclist}
@itemx -R @var{rejlist} --reject @var{rejlist}
Specify comma-separated lists of file name suffixes or patterns to
-accept or reject (@pxref{Types of Files} for more details). Note that if
+accept or reject (@pxref{Types of Files}). Note that if
any of the wildcard characters, @samp{*}, @samp{?}, @samp{[} or
@samp{]}, appear in an element of @var{acclist} or @var{rejlist},
it will be treated as a pattern, rather than a suffix.
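+For example, this (against a placeholder host) retrieves only PDF
+files, skipping any whose names contain @samp{draft}:
+
+@example
+wget -r -A "*.pdf" -R "*draft*" http://www.example.com/papers/
+@end example
+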
@item -I @var{list}
@itemx --include-directories=@var{list}
Specify a comma-separated list of directories you wish to follow when
-downloading (@pxref{Directory-Based Limits} for more details.) Elements
+downloading (@pxref{Directory-Based Limits}). Elements
of @var{list} may contain wildcards.
@item -X @var{list}
@itemx --exclude-directories=@var{list}
Specify a comma-separated list of directories you wish to exclude from
-download (@pxref{Directory-Based Limits} for more details.) Elements of
+download (@pxref{Directory-Based Limits}). Elements of
@var{list} may contain wildcards.
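+For example, this restricts a recursive retrieval of a placeholder
+host to the @file{/people} and @file{/cgi-bin} directories:
+
+@example
+wget -r -I /people,/cgi-bin http://www.example.com/
+@end example
+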
@item -np
@sc{http} or @sc{ftp} server), following links and directory structure.
-We refer to this as to @dfn{recursive retrieval}, or @dfn{recursion}.
+We refer to this as @dfn{recursive retrieval}, or @dfn{recursion}.
-With @sc{http} @sc{url}s, Wget retrieves and parses the @sc{html} from
-the given @sc{url}, documents, retrieving the files the @sc{html}
-document was referring to, through markup like @code{href}, or
-@code{src}. If the freshly downloaded file is also of type
-@code{text/html} or @code{application/xhtml+xml}, it will be parsed and
-followed further.
+With @sc{http} @sc{url}s, Wget retrieves and parses the @sc{html} or
+@sc{css} from the given @sc{url}, retrieving the files the document
+refers to, through markup like @code{href} or @code{src}, or @sc{css}
+@sc{uri} values specified using the @samp{url()} functional notation.
+If the freshly downloaded file is also of type @code{text/html},
+@code{application/xhtml+xml}, or @code{text/css}, it will be parsed
+and followed further.
-Recursive retrieval of @sc{http} and @sc{html} content is
+Recursive retrieval of @sc{http} and @sc{html}/@sc{css} content is
@dfn{breadth-first}. This means that Wget first downloads the requested
-@sc{html} document, then the documents linked from that document, then the
+document, then the documents linked from that document, then the
documents linked by them, and so on. In other words, Wget first
downloads the documents at depth 1, then those at depth 2, and so on
until the specified maximum depth.
expansion by the shell.
@end table
+@noindent
The @samp{-A} and @samp{-R} options may be combined to achieve even
better fine-tuning of which files to retrieve. E.g. @samp{wget -A
"*zelazny*" -R .ps} will download all the files having @samp{zelazny} as
a part of their name, but @emph{not} the PostScript files.
Note that these two options do not affect the downloading of @sc{html}
-files; Wget must load all the @sc{html}s to know where to go at
-all---recursive retrieval would make no sense otherwise.
+files (as determined by a @samp{.htm} or @samp{.html} filename
+suffix). This behavior may not be desirable for all users, and may be
+changed in future versions of Wget.
+
+Note, too, that query strings (strings at the end of a URL beginning
+with a question mark (@samp{?})) are not included as part of the
+filename for accept/reject rules, even though these will actually
+contribute to the name chosen for the local file. It is expected that
+a future version of Wget will provide an option to allow matching
+against query strings.
+
+Finally, it's worth noting that the accept/reject lists are matched
+@emph{twice} against downloaded files: once against the URL's filename
+portion, to determine if the file should be downloaded in the first
+place; then, after it has been accepted and successfully downloaded,
+the local file's name is also checked against the accept/reject lists
+to see if it should be removed. The rationale was that, since
+@samp{.htm} and @samp{.html} files are always downloaded regardless of
+accept/reject rules, they should be removed @emph{after} being
+downloaded and scanned for links, if they did match the accept/reject
+lists. However, this can lead to unexpected results, since the local
+filenames can differ from the original URL filenames in the following
+ways, all of which can change whether an accept/reject rule matches:
+
+@itemize @bullet
+@item
+If the local file already exists and @samp{--no-directories} was
+specified, a numeric suffix will be appended to the original name.
+@item
+If @samp{--html-extension} was specified, the local filename will have
+@samp{.html} appended to it. If Wget is invoked with @samp{-E -A.php},
+a filename such as @samp{index.php} will be accepted, but upon
+download will be named @samp{index.php.html}, which no longer matches,
+and so the file will be deleted.
+@item
+Query strings do not contribute to URL matching, but are included in
+local filenames, and so @emph{do} contribute to filename matching.
+@end itemize
+
+@noindent
+This behavior, too, is considered less-than-desirable, and may change
+in a future version of Wget.
@node Directory-Based Limits
@section Directory-Based Limits
Essentially, @samp{--no-parent} is similar to
@samp{-I/~luzer/my-archive}, only it handles redirections in a more
intelligent fashion.
+
+@strong{Note} that, for HTTP (and HTTPS), the trailing slash is very
+important to @samp{--no-parent}. HTTP has no concept of a ``directory''---Wget
+relies on you to indicate what's a directory and what isn't. In
+@samp{http://foo/bar/}, Wget will consider @samp{bar} to be a
+directory, while in @samp{http://foo/bar} (no trailing slash),
+@samp{bar} will be considered a filename (so @samp{--no-parent} would be
+meaningless, as its parent is @samp{/}).
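+For example, the trailing slash below makes @file{my-archive} the
+lowest directory Wget will descend into:
+
+@example
+wget -r --no-parent http://foo/~luzer/my-archive/
+@end example
+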
@end table
@node Relative Links
@item add_hostdir = on/off
Enable/disable host-prefixed file names. @samp{-nH} disables it.
-@item continue = on/off
-If set to on, force continuation of preexistent partially retrieved
-files. See @samp{-c} before setting it.
-
@item background = on/off
Enable/disable going to background---the same as @samp{-b} (which
enables it).
the specified client authorities. The default is ``on''. The same as
@samp{--check-certificate}.
+@item connect_timeout = @var{n}
+Set the connect timeout---the same as @samp{--connect-timeout}.
+
+@item content_disposition = on/off
+Turn on recognition of the (non-standard) @samp{Content-Disposition}
+HTTP header---if set to @samp{on}, the same as @samp{--content-disposition}.
+
+@item continue = on/off
+If set to on, force continuation of preexistent partially retrieved
+files. See @samp{-c} before setting it.
+
@item convert_links = on/off
Convert non-relative links locally. The same as @samp{-k}.
@item cookies = on/off
When set to off, disallow cookies. See the @samp{--cookies} option.
-@item connect_timeout = @var{n}
-Set the connect timeout---the same as @samp{--connect-timeout}.
-
@item cut_dirs = @var{n}
Ignore @var{n} remote directory components. Equivalent to
@samp{--cut-dirs=@var{n}}.
@item debug = on/off
Debug mode, same as @samp{-d}.
+@item default_page = @var{string}
+Default page name---the same as @samp{--default-page=@var{string}}.
+
@item delete_after = on/off
Delete after download---the same as @samp{--delete-after}.
suit your needs, or you can use the predefined @dfn{styles}
(@pxref{Download Options}).
+@item dot_spacing = @var{n}
+Specify the number of dots in a single cluster (10 by default).
+
@item dots_in_line = @var{n}
Specify the number of dots that will be printed in each line throughout
the retrieval (50 by default).
-@item dot_spacing = @var{n}
-Specify the number of dots in a single cluster (10 by default).
-
@item egd_file = @var{file}
-Use @var{string} as the EGD socket file name. The same as
+Use @var{file} as the EGD socket file name. The same as
@samp{--egd-file=@var{file}}.
Turn globbing on/off---the same as @samp{--glob} and @samp{--no-glob}.
@item header = @var{string}
-Define a header for HTTP doewnloads, like using
+Define a header for HTTP downloads, like using
@samp{--header=@var{string}}.
@item html_extension = on/off
Add a @samp{.html} extension to @samp{text/html} or
-@samp{application/xhtml+xml} files without it, like @samp{-E}.
+@samp{application/xhtml+xml} files without it, or a @samp{.css}
+extension to @samp{text/css} files without it, like @samp{-E}.
@item http_keep_alive = on/off
Turn the keep-alive feature on or off (defaults to on). Turning it
@item logfile = @var{file}
Set logfile to @var{file}, the same as @samp{-o @var{file}}.
+@item max_redirect = @var{number}
+Specifies the maximum number of redirections to follow for a resource.
+See @samp{--max-redirect=@var{number}}.
+
@item mirror = on/off
Turn mirroring on/off. The same as @samp{-m}.
@item netrc = on/off
Turn reading netrc on or off.
-@item noclobber = on/off
+@item no_clobber = on/off
Same as @samp{-nc}.
@item no_parent = on/off
@var{file} in the request body. The same as
@samp{--post-file=@var{file}}.
-@item prefer_family = IPv4/IPv6/none
+@item prefer_family = none/IPv4/IPv6
When given a choice of several addresses, connect to the addresses
-with specified address family first. IPv4 addresses are preferred by
-default. The same as @samp{--prefer-family}, which see for a detailed
-discussion of why this is useful.
+with specified address family first. The address order returned by
+DNS is used without change by default. The same as @samp{--prefer-family},
+which see for a detailed discussion of why this is useful.
@item private_key = @var{file}
Set the private key file to @var{file}. The same as
When set, use the protocol name as a directory component of local file
names. The same as @samp{--protocol-directories}.
-@item proxy_user = @var{string}
-Set proxy authentication user name to @var{string}, like
-@samp{--proxy-user=@var{string}}.
-
@item proxy_password = @var{string}
Set proxy authentication password to @var{string}, like
@samp{--proxy-password=@var{string}}.
+@item proxy_user = @var{string}
+Set proxy authentication user name to @var{string}, like
+@samp{--proxy-user=@var{string}}.
+
@item quiet = on/off
Quiet mode---the same as @samp{-q}.
@item referer = @var{string}
Set HTTP @samp{Referer:} header just like
-@samp{--referer=@var{string}}. (Note it was the folks who wrote the
-@sc{http} spec who got the spelling of ``referrer'' wrong.)
+@samp{--referer=@var{string}}. (Note that it was the folks who wrote
+the @sc{http} spec who got the spelling of ``referrer'' wrong.)
@item relative_only = on/off
Follow only relative links---the same as @samp{-L} (@pxref{Relative
Save cookies to @var{file}. The same as @samp{--save-cookies
@var{file}}.
+@item save_headers = on/off
+Same as @samp{--save-headers}.
+
@item secure_protocol = @var{string}
Choose the secure protocol to be used. Legal values are @samp{auto}
(the default), @samp{SSLv2}, @samp{SSLv3}, and @samp{TLSv1}. The same
@item span_hosts = on/off
Same as @samp{-H}.
+@item spider = on/off
+Same as @samp{--spider}.
+
@item strict_comments = on/off
Same as @samp{--strict-comments}.
This command can be overridden using the @samp{ftp_user} and
-@samp{http_user} command for @sc{ftp} and @sc{http} respectively.
+@samp{http_user} commands for @sc{ftp} and @sc{http}, respectively.
+@item user_agent = @var{string}
+User agent identification sent to the HTTP Server---the same as
+@samp{--user-agent=@var{string}}.
+
@item verbose = on/off
Turn verbose on/off---the same as @samp{-v}/@samp{-nv}.
Wait @var{n} seconds between retrievals---the same as @samp{-w
@var{n}}.
-@item waitretry = @var{n}
+@item wait_retry = @var{n}
Wait up to @var{n} seconds between retries of failed retrievals
only---the same as @samp{--waitretry=@var{n}}. Note that this is
turned on by default in the global @file{wgetrc}.
@end example
@item
-The same as the above, but convert the links in the @sc{html} files to
+The same as the above, but convert the links in the downloaded files to
point to local files, so you can view the documents off-line:
@example
This chapter contains all the stuff that could not fit anywhere else.
@menu
-* Proxies:: Support for proxy servers
+* Proxies:: Support for proxy servers.
* Distribution:: Getting the latest version.
+* Web Site:: GNU Wget's presence on the World Wide Web.
* Mailing List:: Wget mailing list for announcements and discussion.
+* Internet Relay Chat:: Wget's presence on IRC.
* Reporting Bugs:: How and where to report bugs.
* Portability:: The systems Wget works on.
* Signals:: Signal-handling performed by Wget.
Wget @value{VERSION} can be found at
@url{ftp://ftp.gnu.org/pub/gnu/wget/wget-@value{VERSION}.tar.gz}
+@node Web Site
+@section Web Site
+@cindex web site
+
+The official web site for GNU Wget is at
+@url{http://www.gnu.org/software/wget/}. However, most useful
+information resides at ``The Wget Wgiki'',
+@url{http://wget.addictivecode.org/}.
+
@node Mailing List
@section Mailing List
@cindex mailing list
@cindex list
-There are several Wget-related mailing lists, all hosted by
-SunSITE.dk. The general discussion list is at
-@email{wget@@sunsite.dk}. It is the preferred place for bug reports
-and suggestions, as well as for discussion of development. You are
-invited to subscribe.
+There are several Wget-related mailing lists. The general discussion
+list is at @email{wget@@sunsite.dk}. It is the preferred place for
+support requests and suggestions, as well as for discussion of
+development. You are invited to subscribe.
To subscribe, simply send mail to @email{wget-subscribe@@sunsite.dk}
and follow the instructions. Unsubscribe by mailing to
@url{http://www.mail-archive.com/wget%40sunsite.dk/} and at
@url{http://news.gmane.org/gmane.comp.web.wget.general}.
-The second mailing list is at @email{wget-patches@@sunsite.dk}, and is
+Another mailing list is at @email{wget-patches@@sunsite.dk}, and is
used to submit patches for review by Wget developers. A ``patch'' is
a textual representation of change to source code, readable by both
-humans and programs. The file @file{PATCHES} that comes with Wget
+humans and programs. The
+@url{http://wget.addictivecode.org/PatchGuidelines} page
covers the creation and submitting of patches in detail. Please don't
send general suggestions or bug reports to @samp{wget-patches}; use it
only for patch submissions.
-To subscribe, simply send mail to @email{wget-subscribe@@sunsite.dk}
-and follow the instructions. Unsubscribe by mailing to
-@email{wget-unsubscribe@@sunsite.dk}. The mailing list is archived at
+Subscription is the same as above for @email{wget@@sunsite.dk}, except
+that you send to @email{wget-patches-subscribe@@sunsite.dk}, instead.
+The mailing list is archived at
@url{http://news.gmane.org/gmane.comp.web.wget.patches}.
+Finally, there is the @email{wget-notify@@addictivecode.org} mailing
+list. This is a non-discussion list that receives notifications of
+bug-report changes from the bug tracker. Unlike the other mailing lists,
+subscription is through the @code{mailman} interface at
+@url{http://addictivecode.org/mailman/listinfo/wget-notify}.
+
+@node Internet Relay Chat
+@section Internet Relay Chat
+@cindex Internet Relay Chat
+@cindex IRC
+@cindex #wget
+
+In addition to the mailing lists, we also have a support channel set up
+via IRC at @code{irc.freenode.org}, @code{#wget}. Come check it out!
+
@node Reporting Bugs
@section Reporting Bugs
@cindex bugs
@cindex bug reports
@c man begin BUGS
-You are welcome to send bug reports about GNU Wget to
-@email{bug-wget@@gnu.org}.
+You are welcome to submit bug reports via the GNU Wget bug tracker (see
+@url{http://wget.addictivecode.org/BugTracker}).
Before actually submitting a bug report, please try to follow a few
simple guidelines.
Please try to ascertain that the behavior you see really is a bug. If
Wget crashes, it's a bug. If Wget does not behave as documented,
it's a bug. If things work strange, but you are not sure about the way
-they are supposed to work, it might well be a bug.
+they are supposed to work, it might well be a bug, but you might want to
+double-check the documentation and the mailing lists (@pxref{Mailing
+List}).
@item
Try to repeat the bug in as simple circumstances as possible. E.g. if
-Wget crashes while downloading @samp{wget -rl0 -kKE -t5 -Y0
+Wget crashes while downloading @samp{wget -rl0 -kKE -t5 --no-proxy
http://yoyodyne.com -o /tmp/log}, you should try to see if the crash is
-repeatable, and if will occur with a simpler set of options. You might
+repeatable, and if it will occur with a simpler set of options. You might
even try to start the download at the page where the crash occurred to
``special'' features of any particular Unix, it should compile (and
work) on all common Unix flavors.
-Various Wget versions have been compiled and tested under many kinds
-of Unix systems, including GNU/Linux, Solaris, SunOS 4.x, OSF (aka
-Digital Unix or Tru64), Ultrix, *BSD, IRIX, AIX, and others. Some of
-those systems are no longer in widespread use and may not be able to
+Various Wget versions have been compiled and tested under many kinds of
+Unix systems, including GNU/Linux, Solaris, SunOS 4.x, Mac OS X, OSF
+(aka Digital Unix or Tru64), Ultrix, *BSD, IRIX, AIX, and others. Some
+of those systems are no longer in widespread use and may not be able to
support recent versions of Wget. If Wget fails to compile on your
system, we would like to know about it.
@email{wget@@sunsite.dk} where the volunteers who maintain the
Windows-related features might look at them.
+Support for building on MS-DOS via DJGPP has been contributed by Gisle
+Vanem; a port to VMS is maintained by Steven Schweda, and is available
+at @url{http://antinode.org/}.
+
@node Signals
@section Signals
@cindex signal handling
download and parse.
Although Wget is not a web robot in the strictest sense of the word, it
-can downloads large parts of the site without the user's intervention to
+can download large parts of the site without the user's intervention to
download an individual page. Because of that, Wget honors RES when
downloading recursively. For instance, when you issue:
authentication.
@item
-Mauro Tortonesi---Improved IPv6 support, adding support for dual
+Mauro Tortonesi---improved IPv6 support, adding support for dual
family systems. Refactored and enhanced FTP IPv6 code. Maintained GNU
Wget from 2004--2007.
@item
-Christopher G.@: Lewis---Maintenance of the Windows version of GNU WGet.
+Christopher G.@: Lewis---maintenance of the Windows version of GNU Wget.
@item
-Gisle Vanem---Many helpful patches and improvements, especially for
+Gisle Vanem---many helpful patches and improvements, especially for
Windows and MS-DOS support.
+@item
+Ralf Wildenhues---contributed patches to convert Wget to use Automake as
+part of its build process, and various bugfixes.
+
+@item
+Steven Schubiger---many helpful patches, bugfixes and improvements.
+Notably, conversion of Wget to use the Gnulib quotes and quoteargs
+modules, and the addition of password prompts at the console, via the
+Gnulib getpasswd-gnu module.
+
+@item
+Ted Mielczarek---donated support for CSS.
+
@item
People who provided donations for development---including Brian Gough.
@end itemize
Daniel Bodea,
Mark Boyns,
John Burden,
+Julien Buty,
Wanderlei Cavassin,
Gilles Cedoc,
Tim Charron,
Ahmon Dancy,
Andrew Davison,
Bertrand Demiddelaer,
+Alexander Dergachev,
Andrew Deryabin,
Ulrich Drepper,
Marc Duponcheel,
Aleksandar Erkalovic,
@end ifnottex
Andy Eskilsson,
+@iftex
+Jo@~{a}o Ferreira,
+@end iftex
+@ifnottex
+Joao Ferreira,
+@end ifnottex
Christian Fraenkel,
David Fritz,
+Mike Frysinger,
Charles C.@: Fu,
FUJISHIMA Satsuki,
Masashi Fujita,
Marcel Gerrits,
Lemble Gregory,
Hans Grobler,
+Alain Guibert,
Mathieu Guillaume,
Aaron Hawley,
Jochen Hein,
Karl Heuer,
+Madhusudan Hosaagrahara,
HIROSE Masaaki,
Ulf Harnhammar,
Gregor Hoffleit,
Aurelien Marchand,
Matthew J.@: Mellon,
Jordan Mendelson,
+Ted Mielczarek,
Lin Zhe Min,
Jan Minar,
Tim Mooney,
Simon Munton,
Charlie Negyesi,
R.@: K.@: Owen,
+Jim Paris,
+Kenny Parnell,
Leonid Petrov,
Simone Piunno,
Andrew Pollock,
Heinz Salzmann,
Robert Schmidt,
Nicolas Schodet,
+Benno Schulenberg,
Andreas Schwab,
Steven M.@: Schweda,
Chris Seawood,
+Pranab Shenoy,
Dennis Smit,
Toomas Soome,
Tage Stabell-Kulo,
Mauro Tortonesi,
Dave Turner,
Gisle Vanem,
+Rabin Vincent,
Russell Vincent,
@iftex
@v{Z}eljko Vrba,