+2008-11-10 Micah Cowan <micah@cowan.name>
+
+ * MAILING-LIST: Mention Gmane, introduce subsections.
+
+2008-11-05 Micah Cowan <micah@cowan.name>
+
+ * MAILING-LIST: Mention moderation for unsubscribed posts, and
+ archive location.
+
+2008-10-31 Micah Cowan <micah@cowan.name>
+
+ * MAILING-LIST: Update information.
+
+ * NEWS: Add mention of mailing list move.
+
2008-08-01 Joao Ferreira <joao@joaoff.com>
* NEWS: Added option --default-page to support alternative
-Mailing List
-================
-
-There are several Wget-related mailing lists. The general discussion
-list is at <wget@sunsite.dk>. It is the preferred place for support
-requests and suggestions, as well as for discussion of development.
-You are invited to subscribe.
-
- To subscribe, simply send mail to <wget-subscribe@sunsite.dk> and
-follow the instructions. Unsubscribe by mailing to
-<wget-unsubscribe@sunsite.dk>. The mailing list is archived at
-`http://www.mail-archive.com/wget%40sunsite.dk/' and at
-`http://news.gmane.org/gmane.comp.web.wget.general'.
-
- Another mailing list is at <wget-patches@sunsite.dk>, and is used to
-submit patches for review by Wget developers. A "patch" is a textual
-representation of change to source code, readable by both humans and
-programs. The file `PATCHES' that comes with Wget covers the creation
-and submitting of patches in detail. Please don't send general
-suggestions or bug reports to `wget-patches'; use it only for patch
-submissions.
-
- Subscription is the same as above for <wget@sunsite.dk>, except that
-you send to <wget-patches-subscribe@sunsite.dk>, instead. The mailing
-list is archived at `http://news.gmane.org/gmane.comp.web.wget.patches'.
-
- Finally, there is the <wget-notify@addictivecode.org> mailing list.
-This is a non-discussion list that receives commit notifications from
-the source repository, and also bug report-change notifications. This
-is the highest-traffic list for Wget, and is recommended only for
-people who are seriously interested in ongoing Wget development.
-Subscription is through the `mailman' interface at
+Mailing Lists
+=============
+
+Primary List
+------------
+
+The primary mailing list for discussion, bug reports, or questions about
+GNU Wget is at <bug-wget@gnu.org>. To subscribe, send an email to
+<bug-wget-join@gnu.org>, or visit
+`http://lists.gnu.org/mailman/listinfo/bug-wget'.
+
+ You do not need to subscribe to send a message to the list; however,
+please note that unsubscribed messages are moderated, and may take a
+while before they hit the list--*usually around a day*. If you want
+your message to show up immediately, please subscribe to the list
+before posting. Archives for the list may be found at
+`http://lists.gnu.org/pipermail/bug-wget/'.
+
+ An NNTP/Usenettish gateway is also available via Gmane
+(http://gmane.org/about.php). You can see the Gmane archives at
+`http://news.gmane.org/gmane.comp.web.wget.general'. Note that the
+Gmane archives conveniently include messages from both the current
+list, and the previous one. Messages also show up in the Gmane archives
+sooner than they do at `lists.gnu.org'.
+
+Bug Notices List
+----------------
+
+Additionally, there is the <wget-notify@addictivecode.org> mailing
+list. This is a non-discussion list that receives bug report
+notifications from the bug-tracker. To subscribe to this list, send an
+email to <wget-notify-join@addictivecode.org>, or visit
`http://addictivecode.org/mailman/listinfo/wget-notify'.
+
+Obsolete Lists
+--------------
+
+Previously, the mailing list <wget@sunsite.dk> was used as the main
+discussion list, and another list, <wget-patches@sunsite.dk>, was
+used for submitting and discussing patches to GNU Wget.
+
+ Messages from <wget@sunsite.dk> are archived at
+ `http://www.mail-archive.com/wget%40sunsite.dk/' and at
+ `http://news.gmane.org/gmane.comp.web.wget.general' (which also
+ continues to archive the current list, <bug-wget@gnu.org>).
+
+ Messages from <wget-patches@sunsite.dk> are archived at
+ `http://news.gmane.org/gmane.comp.web.wget.patches'.
+
\f
* Changes in Wget 1.12 (MAINLINE)
+** Mailing list MOVED to bug-wget@gnu.org
+
** --default-page option added to support alternative default names for
index.html.
** The --input-file option now also handles retrieving links from
an external file.
+
+** Several previously existing, but undocumented .wgetrc options
+are now documented: save_headers, spider, and user_agent.
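
The newly documented commands go in `~/.wgetrc' (or the system-wide
wgetrc) like any other option. A minimal illustrative fragment -- the
values shown are examples, not defaults:

```
# Illustrative ~/.wgetrc fragment using the newly documented commands
save_headers = off
spider = off
user_agent = Wget/1.12
```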
\f
* Changes in Wget 1.11.4
+2008-11-10 Micah Cowan <micah@cowan.name>
+
+ * Makefile.am (EXTRA_DIST): Removed no-longer-present
+ README.maint (shouldn't have been there in the first place).
+
+ * wget.texi (Mailing Lists): Added information about Gmane portal,
+ added subsection headings.
+
+ Update node pointers.
+
+2008-11-05 Micah Cowan <micah@cowan.name>
+
+ * wget.texi: Move --no-http-keep-alive from FTP Options to HTTP
+ Options.
+ (Mailing List): Mention moderation for unsubscribed posts, and
+ archive location.
+
+2008-11-04 Micah Cowan <micah@cowan.name>
+
+ * wget.texi, fdl.texi: Updated to FDL version 1.3.
+
+2008-10-31 Micah Cowan <micah@cowan.name>
+
+ * wget.texi (Mailing List): Update info to reflect change to
+ bug-wget@gnu.org.
+
+2008-09-30 Steven Schubiger <stsc@members.fsf.org>
+
+ * wget.texi (Wgetrc Commands): Add default_page, save_headers,
+ spider and user_agent to the list of recognized commands.
+
+2008-09-10 Michael Kessler <kessler.michael@aon.at>
+
+ * wget.texi (Robot Exclusion): Fixed typo "downloads" ->
+ "download"
+
2008-08-03 Xavier Saint <wget@sxav.eu>
* wget.texi : Add option descriptions for the three new
info_TEXINFOS = wget.texi
wget_TEXINFOS = fdl.texi sample.wgetrc.munged_for_texi_inclusion
-EXTRA_DIST = README.maint sample.wgetrc $(SAMPLERCTEXI) \
+EXTRA_DIST = sample.wgetrc \
+ $(SAMPLERCTEXI) \
texi2pod.pl
wget.pod: $(srcdir)/wget.texi $(srcdir)/version.texi
+++ /dev/null
-
-TO RELEASE WGET X.Y.Z:
-
-1) update PO files from the TP
-
-cd po
-../util/update_po_files.sh
-
-
-2) generate tarball
-
-from the trunk:
-
-cd ~/tmp
-~/code/svn/wget/trunk/util/dist-wget --force-version X.Y.Z
-
-from a branch:
-
-cd ~/tmp
-~/code/svn/wget/branches/X.Y/util/dist-wget --force-version X.Y.Z -b branches/X.Y
-
-
-3) test the tarball
-
-
-4) set new version number "X.Y.Z" on the repository
-
-
-5) tag the sources in subversion
-
-from the trunk:
-
-svn copy -m "Tagging release X.Y.Z" http://svn.dotsrc.org/repo/wget/trunk http://svn.dotsrc.org/repo/wget/tags/WGET_X_Y_Z/
-
-from a branch:
-
-svn copy -m "Tagging release X.Y.Z" http://svn.dotsrc.org/repo/wget/branches/X.Y/ http://svn.dotsrc.org/repo/wget/tags/WGET_X_Y_Z/
-
-
-6) upload the tarball on gnu.org
-
-RELEASE=X.Y.Z
-TARBALL=wget-${RELEASE}.tar.gz
-gpg --default-key 7B2FD4B0 --detach-sign -b --output ${TARBALL}.sig $TARBALL
-echo -e "version: 1.1\ndirectory: wget\nfilename: $TARBALL\ncomment: Wget release ${RELEASE}" > ${TARBALL}.directive
-gpg --default-key 7B2FD4B0 --clearsign ${TARBALL}.directive
-
-lftp ftp://ftp-upload.gnu.org/incoming/ftp
-(use ftp://ftp-upload.gnu.org/incoming/alpha for pre-releases)
-
-put wget-X.Y.Z.tar.gz
-put wget-X.Y.Z.tar.gz.sig
-put wget-X.Y.Z.tar.gz.directive.asc
-
-
-
-7) update wget.sunsite.dk and gnu.org/software/wget
-
-
-8) send announcement on wget@sunsite.dk:
-
-hi to everybody,
-
-i have just uploaded the wget X.Y.Z tarball on ftp.gnu.org:
-
-ftp://ftp.gnu.org/gnu/wget/wget-X.Y.Z.tar.gz
-
-you can find the GPG signature of the tarball at these URLs:
-
-ftp://ftp.gnu.org/gnu/wget/wget-X.Y.Z.tar.gz.sig
-
-and the GPG key i have used for the signature at this URL:
-
-http://www.tortonesi.com/GNU-GPG-Key.txt
-
-the key fingerprint is:
-
-pub 1024D/7B2FD4B0 2005-06-02 Mauro Tortonesi (GNU Wget Maintainer)
-<mauro@ferrara.linux.it>
- Key fingerprint = 1E90 AEA8 D511 58F0 94E5 B106 7220 24E9 7B2F D4B0
-
-the MD5 checksum of the tarball is:
-
-MD5 of tarball wget-X.Y.Z.tar.gz
-
-{DESCRIPTION OF THE CHANGES}
-
-
-9) send announcement on info-gnu@gnu.org
-
-I'm very pleased to announce the availability of GNU Wget X.Y.Z.
-
-GNU Wget is a non-interactive command-line tool for retrieving files using
-HTTP, HTTPS and FTP, which may easily be called from scripts, cron jobs,
-terminals without X-Windows support, etc.
-
-For more information, please see:
-
- http://www.gnu.org/software/wget
- http://wget.sunsite.dk
-
-Here are the compressed sources and the GPG detached signature:
-
-ftp://ftp.gnu.org/gnu/wget/wget-X.Y.Z.tar.gz
-ftp://ftp.gnu.org/gnu/wget/wget-X.Y.Z.tar.gz.sig
-
-The MD5 checksums of the tarball is:
-
-MD5 of tarball wget-X.Y.Z.tar.gz
-
-
-The GPG key I have used for the tarball signature is available at this URL:
-
-http://www.tortonesi.com/GNU-GPG-Key.txt
-
-the key fingerprint is:
-
-pub 1024D/7B2FD4B0 2005-06-02 Mauro Tortonesi (GNU Wget Maintainer)
-<mauro@ferrara.linux.it>
- Key fingerprint = 1E90 AEA8 D511 58F0 94E5 B106 7220 24E9 7B2F D4B0
-
-{DESCRIPTION OF THE CHANGES}
-
-
-10) post announcement on freshmeat.net
-
-
-11) set new version number "X.Y.Z+devel" on the repository
-
-
+@c The GNU Free Documentation License.
+@center Version 1.3, 3 November 2008
-@node GNU Free Documentation License
-@appendixsec GNU Free Documentation License
-
-@cindex FDL, GNU Free Documentation License
-@center Version 1.2, November 2002
+@c This file is intended to be included within another document,
+@c hence no sectioning command or @node.
@display
-Copyright @copyright{} 2000,2001,2002 Free Software Foundation, Inc.
-51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA
+Copyright @copyright{} 2000, 2001, 2002, 2007, 2008 Free Software Foundation, Inc.
+@uref{http://fsf.org/}
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
the text near the most prominent appearance of the work's title,
preceding the beginning of the body of the text.
+The ``publisher'' means any person or entity that distributes copies
+of the Document to the public.
+
A section ``Entitled XYZ'' means a named subunit of the Document whose
title either is precisely XYZ or contains XYZ in parentheses following
text that translates XYZ in another language. (Here XYZ stands for a
@item
TERMINATION
-You may not copy, modify, sublicense, or distribute the Document except
-as expressly provided for under this License. Any other attempt to
-copy, modify, sublicense or distribute the Document is void, and will
-automatically terminate your rights under this License. However,
-parties who have received copies, or rights, from you under this
-License will not have their licenses terminated so long as such
-parties remain in full compliance.
+You may not copy, modify, sublicense, or distribute the Document
+except as expressly provided under this License. Any attempt
+otherwise to copy, modify, sublicense, or distribute it is void, and
+will automatically terminate your rights under this License.
+
+However, if you cease all violation of this License, then your license
+from a particular copyright holder is reinstated (a) provisionally,
+unless and until the copyright holder explicitly and finally
+terminates your license, and (b) permanently, if the copyright holder
+fails to notify you of the violation by some reasonable means prior to
+60 days after the cessation.
+
+Moreover, your license from a particular copyright holder is
+reinstated permanently if the copyright holder notifies you of the
+violation by some reasonable means, this is the first time you have
+received notice of violation of this License (for any work) from that
+copyright holder, and you cure the violation prior to 30 days after
+your receipt of the notice.
+
+Termination of your rights under this section does not terminate the
+licenses of parties who have received copies or rights from you under
+this License. If your rights have been terminated and not permanently
+reinstated, receipt of a copy of some or all of the same material does
+not give you any rights to use it.
@item
FUTURE REVISIONS OF THIS LICENSE
of any later version that has been published (not as a draft) by the
Free Software Foundation. If the Document does not specify a version
number of this License, you may choose any version ever published (not
-as a draft) by the Free Software Foundation.
+as a draft) by the Free Software Foundation. If the Document
+specifies that a proxy can decide which future versions of this
+License can be used, that proxy's public statement of acceptance of a
+version permanently authorizes you to choose that version for the
+Document.
+
+@item
+RELICENSING
+
+``Massive Multiauthor Collaboration Site'' (or ``MMC Site'') means any
+World Wide Web server that publishes copyrightable works and also
+provides prominent facilities for anybody to edit those works. A
+public wiki that anybody can edit is an example of such a server. A
+``Massive Multiauthor Collaboration'' (or ``MMC'') contained in the
+site means any set of copyrightable works thus published on the MMC
+site.
+
+``CC-BY-SA'' means the Creative Commons Attribution-Share Alike 3.0
+license published by Creative Commons Corporation, a not-for-profit
+corporation with a principal place of business in San Francisco,
+California, as well as future copyleft versions of that license
+published by that same organization.
+
+``Incorporate'' means to publish or republish a Document, in whole or
+in part, as part of another Document.
+
+An MMC is ``eligible for relicensing'' if it is licensed under this
+License, and if all works that were first published under this License
+somewhere other than this MMC, and subsequently incorporated in whole
+or in part into the MMC, (1) had no cover texts or invariant sections,
+and (2) were thus incorporated prior to November 1, 2008.
+
+The operator of an MMC Site may republish an MMC contained in the site
+under CC-BY-SA on the same site at any time before August 1, 2009,
+provided the MMC is eligible for relicensing.
+
@end enumerate
@page
@group
Copyright (C) @var{year} @var{your name}.
Permission is granted to copy, distribute and/or modify this document
- under the terms of the GNU Free Documentation License, Version 1.2
+ under the terms of the GNU Free Documentation License, Version 1.3
or any later version published by the Free Software Foundation;
with no Invariant Sections, no Front-Cover Texts, and no Back-Cover
Texts. A copy of the license is included in the section entitled ``GNU
@end smallexample
If you have Invariant Sections, Front-Cover Texts and Back-Cover Texts,
-replace the ``with...Texts.'' line with this:
+replace the ``with@dots{}Texts.'' line with this:
@smallexample
@group
@contents
@ifnottex
-@node Top
+@node Top, Overview, (dir), (dir)
@top Wget @value{VERSION}
@insertcopying
* Concept Index:: Topics covered by this manual.
@end menu
-@node Overview
+@node Overview, Invoking, Top, Top
@chapter Overview
@cindex overview
@cindex features
file @file{COPYING} that came with GNU Wget, for details).
@end itemize
-@node Invoking
+@node Invoking, Recursive Download, Overview, Top
@chapter Invoking
@cindex invoking
@cindex command line
* Recursive Accept/Reject Options::
@end menu
-@node URL Format
+@node URL Format, Option Syntax, Invoking, Invoking
@section URL Format
@cindex URL
@cindex URL syntax
@c man begin OPTIONS
-@node Option Syntax
+@node Option Syntax, Basic Startup Options, URL Format, Invoking
@section Option Syntax
@cindex option syntax
@cindex syntax of options
using @samp{--no-follow-ftp} is the only way to restore the factory
default from the command line.
-@node Basic Startup Options
+@node Basic Startup Options, Logging and Input File Options, Option Syntax, Invoking
@section Basic Startup Options
@table @samp
@end table
-@node Logging and Input File Options
+@node Logging and Input File Options, Download Options, Basic Startup Options, Invoking
@section Logging and Input File Options
@table @samp
the @samp{-i} option.
@end table
-@node Download Options
+@node Download Options, Directory Options, Logging and Input File Options, Invoking
@section Download Options
@table @samp
when @samp{--password} is being used, because they are mutually exclusive.
@end table
-@node Directory Options
+@node Directory Options, HTTP Options, Download Options, Invoking
@section Directory Options
@table @samp
current directory).
@end table
-@node HTTP Options
+@node HTTP Options, HTTPS (SSL/TLS) Options, Directory Options, Invoking
@section HTTP Options
@table @samp
Considerations}.
@end iftex
+@cindex Keep-Alive, turning off
+@cindex Persistent Connections, disabling
+@item --no-http-keep-alive
+Turn off the ``keep-alive'' feature for HTTP downloads. Normally, Wget
+asks the server to keep the connection open so that, when you download
+more than one document from the same server, they get transferred over
+the same TCP connection. This saves time and also reduces the load on
+the server.
+
+This option is useful when, for some reason, persistent (keep-alive)
+connections don't work for you, for example due to a server bug or due
+to the inability of server-side scripts to cope with the connections.
+
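A hedged usage sketch of the option described above (the URLs are
illustrative, not taken from the manual): with keep-alive disabled,
each document is fetched over a fresh TCP connection.

```shell
# Fetch two pages from the same server without reusing the connection
wget --no-http-keep-alive http://example.com/a.html http://example.com/b.html
```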
@cindex proxy
@cindex cache
@item --no-cache
@end table
-@node HTTPS (SSL/TLS) Options
+@node HTTPS (SSL/TLS) Options, FTP Options, HTTP Options, Invoking
@section HTTPS (SSL/TLS) Options
@cindex SSL
systems that support @file{/dev/random}.
@end table
-@node FTP Options
+@node FTP Options, Recursive Retrieval Options, HTTPS (SSL/TLS) Options, Invoking
@section FTP Options
@table @samp
specified on the command-line, rather than because it was recursed to,
this option has no effect. Symbolic links are always traversed in this
case.
-
-@cindex Keep-Alive, turning off
-@cindex Persistent Connections, disabling
-@item --no-http-keep-alive
-Turn off the ``keep-alive'' feature for HTTP downloads. Normally, Wget
-asks the server to keep the connection open so that, when you download
-more than one document from the same server, they get transferred over
-the same TCP connection. This saves time and at the same time reduces
-the load on the server.
-
-This option is useful when, for some reason, persistent (keep-alive)
-connections don't work for you, for example due to a server bug or due
-to the inability of server-side scripts to cope with the connections.
@end table
-@node Recursive Retrieval Options
+@node Recursive Retrieval Options, Recursive Accept/Reject Options, FTP Options, Invoking
@section Recursive Retrieval Options
@table @samp
option to turn it on.
@end table
-@node Recursive Accept/Reject Options
+@node Recursive Accept/Reject Options, , Recursive Retrieval Options, Invoking
@section Recursive Accept/Reject Options
@table @samp
@c man end
-@node Recursive Download
+@node Recursive Download, Following Links, Invoking, Top
@chapter Recursive Download
@cindex recursion
@cindex retrieving
Recursive retrieval should be used with care. Don't say you were not
warned.
-@node Following Links
+@node Following Links, Time-Stamping, Recursive Download, Top
@chapter Following Links
@cindex links
@cindex following links
* FTP Links:: Following FTP links.
@end menu
-@node Spanning Hosts
+@node Spanning Hosts, Types of Files, Following Links, Following Links
@section Spanning Hosts
@cindex spanning hosts
@cindex hosts, spanning
@end table
-@node Types of Files
+@node Types of Files, Directory-Based Limits, Spanning Hosts, Following Links
@section Types of Files
@cindex types of files
This behavior, too, is considered less-than-desirable, and may change
in a future version of Wget.
-@node Directory-Based Limits
+@node Directory-Based Limits, Relative Links, Types of Files, Following Links
@section Directory-Based Limits
@cindex directories
@cindex directory limits
meaningless, as its parent is @samp{/}).
@end table
-@node Relative Links
+@node Relative Links, FTP Links, Directory-Based Limits, Following Links
@section Relative Links
@cindex relative links
This option is probably not very useful and might be removed in a future
release.
-@node FTP Links
+@node FTP Links, , Relative Links, Following Links
@section Following FTP Links
@cindex following ftp links
Also note that followed links to @sc{ftp} directories will not be
retrieved recursively further.
-@node Time-Stamping
+@node Time-Stamping, Startup File, Following Links, Top
@chapter Time-Stamping
@cindex time-stamping
@cindex timestamping
* FTP Time-Stamping Internals::
@end menu
-@node Time-Stamping Usage
+@node Time-Stamping Usage, HTTP Time-Stamping Internals, Time-Stamping, Time-Stamping
@section Time-Stamping Usage
@cindex time-stamping usage
@cindex usage, time-stamping
directory listing with dates in a format that Wget can parse
(@pxref{FTP Time-Stamping Internals}).
-@node HTTP Time-Stamping Internals
+@node HTTP Time-Stamping Internals, FTP Time-Stamping Internals, Time-Stamping Usage, Time-Stamping
@section HTTP Time-Stamping Internals
@cindex http time-stamping
Arguably, @sc{http} time-stamping should be implemented using the
@code{If-Modified-Since} request.
-@node FTP Time-Stamping Internals
+@node FTP Time-Stamping Internals, , HTTP Time-Stamping Internals, Time-Stamping
@section FTP Time-Stamping Internals
@cindex ftp time-stamping
@code{wu-ftpd}), which returns the exact time of the specified file.
Wget may support this command in the future.
-@node Startup File
+@node Startup File, Examples, Time-Stamping, Top
@chapter Startup File
@cindex startup file
@cindex wgetrc
* Sample Wgetrc:: A wgetrc example.
@end menu
-@node Wgetrc Location
+@node Wgetrc Location, Wgetrc Syntax, Startup File, Startup File
@section Wgetrc Location
@cindex wgetrc location
@cindex location of wgetrc
system-wide wgetrc (in @file{/usr/local/etc/wgetrc} by default).
Fascist admins, away!
-@node Wgetrc Syntax
+@node Wgetrc Syntax, Wgetrc Commands, Wgetrc Location, Startup File
@section Wgetrc Syntax
@cindex wgetrc syntax
@cindex syntax of wgetrc
reject =
@end example
-@node Wgetrc Commands
+@node Wgetrc Commands, Sample Wgetrc, Wgetrc Syntax, Startup File
@section Wgetrc Commands
@cindex wgetrc commands
@item debug = on/off
Debug mode, same as @samp{-d}.
+@item default_page = @var{string}
+Default page name---the same as @samp{--default-page=@var{string}}.
+
@item delete_after = on/off
Delete after download---the same as @samp{--delete-after}.
Save cookies to @var{file}. The same as @samp{--save-cookies
@var{file}}.
+@item save_headers = on/off
+Same as @samp{--save-headers}.
+
@item secure_protocol = @var{string}
Choose the secure protocol to be used. Legal values are @samp{auto}
(the default), @samp{SSLv2}, @samp{SSLv3}, and @samp{TLSv1}. The same
@item span_hosts = on/off
Same as @samp{-H}.
+@item spider = on/off
+Same as @samp{--spider}.
+
@item strict_comments = on/off
Same as @samp{--strict-comments}.
This command can be overridden using the @samp{ftp_user} and
@samp{http_user} command for @sc{ftp} and @sc{http} respectively.
+@item user_agent = @var{string}
+User agent identification sent to the HTTP server---the same as
+@samp{--user-agent=@var{string}}.
+
@item verbose = on/off
Turn verbose on/off---the same as @samp{-v}/@samp{-nv}.
turned on by default in the global @file{wgetrc}.
@end table
-@node Sample Wgetrc
+@node Sample Wgetrc, , Wgetrc Commands, Startup File
@section Sample Wgetrc
@cindex sample wgetrc
@include sample.wgetrc.munged_for_texi_inclusion
@end example
-@node Examples
+@node Examples, Various, Startup File, Top
@chapter Examples
@cindex examples
* Very Advanced Usage:: The hairy stuff.
@end menu
-@node Simple Usage
+@node Simple Usage, Advanced Usage, Examples, Examples
@section Simple Usage
@itemize @bullet
@end example
@end itemize
-@node Advanced Usage
+@node Advanced Usage, Very Advanced Usage, Simple Usage, Examples
@section Advanced Usage
@itemize @bullet
@end example
@end itemize
-@node Very Advanced Usage
+@node Very Advanced Usage, , Advanced Usage, Examples
@section Very Advanced Usage
@cindex mirroring
@end itemize
@c man end
-@node Various
+@node Various, Appendices, Examples, Top
@chapter Various
@cindex various
* Proxies:: Support for proxy servers.
* Distribution:: Getting the latest version.
* Web Site:: GNU Wget's presence on the World Wide Web.
-* Mailing List:: Wget mailing list for announcements and discussion.
+* Mailing Lists:: Wget mailing list for announcements and discussion.
* Internet Relay Chat:: Wget's presence on IRC.
* Reporting Bugs:: How and where to report bugs.
* Portability:: The systems Wget works on.
* Signals:: Signal-handling performed by Wget.
@end menu
-@node Proxies
+@node Proxies, Distribution, Various, Various
@section Proxies
@cindex proxies
settings @code{proxy_user} and @code{proxy_password} to set the proxy
username and password.
-@node Distribution
+@node Distribution, Web Site, Proxies, Various
@section Distribution
@cindex latest version
Wget @value{VERSION} can be found at
@url{ftp://ftp.gnu.org/pub/gnu/wget/wget-@value{VERSION}.tar.gz}
-@node Web Site
+@node Web Site, Mailing Lists, Distribution, Various
@section Web Site
@cindex web site
information resides at ``The Wget Wgiki'',
@url{http://wget.addictivecode.org/}.
-@node Mailing List
-@section Mailing List
+@node Mailing Lists, Internet Relay Chat, Web Site, Various
+@section Mailing Lists
@cindex mailing list
@cindex list
-There are several Wget-related mailing lists. The general discussion
-list is at @email{wget@@sunsite.dk}. It is the preferred place for
-support requests and suggestions, as well as for discussion of
-development. You are invited to subscribe.
-
-To subscribe, simply send mail to @email{wget-subscribe@@sunsite.dk}
-and follow the instructions. Unsubscribe by mailing to
-@email{wget-unsubscribe@@sunsite.dk}. The mailing list is archived at
+@unnumberedsubsec Primary List
+
+The primary mailing list for discussion, bug reports, or questions
+about GNU Wget is at @email{bug-wget@@gnu.org}. To subscribe, send an
+email to @email{bug-wget-join@@gnu.org}, or visit
+@url{http://lists.gnu.org/mailman/listinfo/bug-wget}.
+
+You do not need to subscribe to send a message to the list; however,
+please note that unsubscribed messages are moderated, and may take a
+while before they hit the list---@strong{usually around a day}. If
+you want your message to show up immediately, please subscribe to the
+list before posting. Archives for the list may be found at
+@url{http://lists.gnu.org/pipermail/bug-wget/}.
+
+An NNTP/Usenettish gateway is also available via
+@uref{http://gmane.org/about.php,Gmane}. You can see the Gmane
+archives at
+@url{http://news.gmane.org/gmane.comp.web.wget.general}. Note that the
+Gmane archives conveniently include messages from both the current
+list, and the previous one. Messages also show up in the Gmane
+archives sooner than they do at @url{lists.gnu.org}.
+
+@unnumberedsubsec Bug Notices List
+
+Additionally, there is the @email{wget-notify@@addictivecode.org} mailing
+list. This is a non-discussion list that receives bug report
+notifications from the bug-tracker. To subscribe to this list,
+send an email to @email{wget-notify-join@@addictivecode.org},
+or visit @url{http://addictivecode.org/mailman/listinfo/wget-notify}.
+
+@unnumberedsubsec Obsolete Lists
+
+Previously, the mailing list @email{wget@@sunsite.dk} was used as the
+main discussion list, and another list,
+@email{wget-patches@@sunsite.dk}, was used for submitting and
+discussing patches to GNU Wget.
+
+Messages from @email{wget@@sunsite.dk} are archived at
+@itemize @tie{}
+@item
@url{http://www.mail-archive.com/wget%40sunsite.dk/} and at
-@url{http://news.gmane.org/gmane.comp.web.wget.general}.
-
-Another mailing list is at @email{wget-patches@@sunsite.dk}, and is
-used to submit patches for review by Wget developers. A ``patch'' is
-a textual representation of change to source code, readable by both
-humans and programs. The
-@url{http://wget.addictivecode.org/PatchGuidelines} page
-covers the creation and submitting of patches in detail. Please don't
-send general suggestions or bug reports to @samp{wget-patches}; use it
-only for patch submissions.
-
-Subscription is the same as above for @email{wget@@sunsite.dk}, except
-that you send to @email{wget-patches-subscribe@@sunsite.dk}, instead.
-The mailing list is archived at
-@url{http://news.gmane.org/gmane.comp.web.wget.patches}.
+@item
+@url{http://news.gmane.org/gmane.comp.web.wget.general} (which also
+continues to archive the current list, @email{bug-wget@@gnu.org}).
+@end itemize
-Finally, there is the @email{wget-notify@@addictivecode.org} mailing
-list. This is a non-discussion list that receives bug report-change
-notifications from the bug-tracker. Unlike for the other mailing lists,
-subscription is through the @code{mailman} interface at
-@url{http://addictivecode.org/mailman/listinfo/wget-notify}.
+Messages from @email{wget-patches@@sunsite.dk} are archived at
+@itemize @tie{}
+@item
+@url{http://news.gmane.org/gmane.comp.web.wget.patches}.
+@end itemize
-@node Internet Relay Chat
+@node Internet Relay Chat, Reporting Bugs, Mailing Lists, Various
@section Internet Relay Chat
@cindex Internet Relay Chat
@cindex IRC
In addition to the mailing lists, we also have a support channel set up
via IRC at @code{irc.freenode.org}, @code{#wget}. Come check it out!
-@node Reporting Bugs
+@node Reporting Bugs, Portability, Internet Relay Chat, Various
@section Reporting Bugs
@cindex bugs
@cindex reporting bugs
it's a bug. If things work strange, but you are not sure about the way
they are supposed to work, it might well be a bug, but you might want to
double-check the documentation and the mailing lists (@pxref{Mailing
-List}).
+Lists}).
@item
Try to repeat the bug in as simple circumstances as possible. E.g. if
@end enumerate
@c man end
-@node Portability
+@node Portability, Signals, Reporting Bugs, Various
@section Portability
@cindex portability
@cindex operating systems
Vanem; a port to VMS is maintained by Steven Schweda, and is available
at @url{http://antinode.org/}.
-@node Signals
+@node Signals, , Portability, Various
@section Signals
@cindex signal handling
@cindex hangup
Other than that, Wget will not try to interfere with signals in any way.
@kbd{C-c}, @code{kill -TERM} and @code{kill -KILL} should kill it alike.
-@node Appendices
+@node Appendices, Copying this manual, Various, Top
@chapter Appendices
This chapter contains some references I consider useful.
* Contributors:: People who helped.
@end menu
-@node Robot Exclusion
+@node Robot Exclusion, Security Considerations, Appendices, Appendices
@section Robot Exclusion
@cindex robot exclusion
@cindex robots.txt
download and parse.
Although Wget is not a web robot in the strictest sense of the word, it
-can downloads large parts of the site without the user's intervention to
+can download large parts of the site without the user's intervention to
download an individual page. Because of that, Wget honors RES when
downloading recursively. For instance, when you issue:
@file{.wgetrc}. You can achieve the same effect from the command line
using the @code{-e} switch, e.g. @samp{wget -e robots=off @var{url}...}.
-@node Security Considerations
+@node Security Considerations, Contributors, Robot Exclusion, Appendices
@section Security Considerations
@cindex security
me).
@end enumerate
-@node Contributors
+@node Contributors, , Security Considerations, Appendices
@section Contributors
@cindex contributors
Apologies to all who I accidentally left out, and many thanks to all the
subscribers of the Wget mailing list.
-@node Copying this manual
+@node Copying this manual, Concept Index, Appendices, Top
@appendix Copying this manual
@menu
* GNU Free Documentation License:: License for copying this manual.
@end menu
+@node GNU Free Documentation License, , Copying this manual, Copying this manual
+@appendixsec GNU Free Documentation License
+@cindex FDL, GNU Free Documentation License
+
@include fdl.texi
-@node Concept Index
+@node Concept Index, , Copying this manual, Top
@unnumbered Concept Index
@printindex cp
+2008-11-13 Micah Cowan <micah@cowan.name>
+
+ * http.c (gethttp): Don't do anything when content-length >= our
+ requested range.
+
+2008-11-12 Micah Cowan <micah@cowan.name>
+
+ * ftp-ls.c (ftp_index): HTML-escape dir name in title, h1, a:href.
+
+2008-11-12 Alexander Belopolsky <alexander.belopolsky@gmail.com>
+
+ * url.c, url.h (url_escape_unsafe_and_reserved): Added.
+
+ * ftp-ls.c (ftp_index): URL-escape, rather than HTML-escape, the
+ filename appearing in the link.
+
+2008-11-12 Steven Schubiger <stsc@members.fsf.org>
+
+ * main.c (print_version): Hand the relevant
+ xstrdup/xfree calls back to format_and_print_line().
+
+2008-11-11 Steven Schubiger <stsc@members.fsf.org>
+
+ * main.c (format_and_print_line): Move both the memory
+ allocating and freeing bits upwards to print_version().
+
+2008-11-10 Saint Xavier <wget@sxav.eu>
+
 * http.c: Make --auth-no-challenge work with user:pass@ in URLs.
+
+2008-11-05 Micah Cowan <micah@cowan.name>
+
+ * ftp.c (print_length): Should print humanized "size remaining"
+ only when it's at least 1k.
+
+2008-10-31 Micah Cowan <micah@cowan.name>
+
+ * main.c (print_version): Add information about the mailing list.
+
+2008-10-31 Alexander Drozdov <dzal_mail@mtu-net.ru>
+
+ * retr.c (fd_read_hunk): Make assert deal with maxsize == 0.
+
+ * ftp-ls.c (clean_line): Prevent underflow on empty lines.
+
+2008-10-26 Gisle Vanem <gvanem@broadpark.no>
+
+ * main.c (format_and_print_line): Put variables on top of
+ blocks (not all compilers are C99). Add an extra '\n' if
+ SYSTEM_WGETRC isn't defined and printed.
+
+2008-09-09 Gisle Vanem <gvanem@broadpark.no>
+
+ * url.c (url_error): Use aprintf, not asprintf.
+
+2008-09-09 Micah Cowan <micah@cowan.name>
+
+ * init.c (home_dir): Save the calculated value for home,
+ to avoid duplicated work on repeated calls.
+ (wgetrc_file_name) [WINDOWS]: Define and initialize home var.
+
+ * build_info.c, main.c: Remove unnecessary extern vars
+ system_wgetrc and locale_dir.
+
+ * main.c: Define program_name for lib/error.c.
+
+2008-09-02 Gisle Vanem <gvanem@broadpark.no>
+
+ * mswindows.h: Must ensure <stdio.h> is included before
+ we redefine ?vsnprintf().
+
2008-08-08 Steven Schubiger <stsc@members.fsf.org>
* main.c, utils.h: Removed some dead conditional DEBUG_MALLOC code.
#include "wget.h"
#include <stdio.h>
-char *system_wgetrc = SYSTEM_WGETRC;
-char *locale_dir = LOCALEDIR;
-
const char* (compiled_features[]) =
{
if (!len) return 0;
if (line[len - 1] == '\n')
line[--len] = '\0';
+ if (!len) return 0;
if (line[len - 1] == '\r')
line[--len] = '\0';
for ( ; *line ; line++ ) if (*line == '\t') *line = ' ';
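The hunk above guards `clean_line` against underflow: a line consisting solely of `"\n"` has length 0 after the newline is stripped, so the subsequent `line[len - 1]` test for `'\r'` would read before the buffer. A minimal standalone sketch of the same guard (a hypothetical `strip_eol` helper, not Wget's actual `clean_line`):

```c
#include <string.h>

/* Strip a trailing "\n" and/or "\r" in place; return the new length.
   The second length check mirrors the guard added in the patch above,
   preventing underflow when the input is "\n" alone. */
static size_t
strip_eol (char *line)
{
  size_t len = strlen (line);
  if (!len) return 0;
  if (line[len - 1] == '\n')
    line[--len] = '\0';
  if (!len) return 0;           /* the added guard */
  if (line[len - 1] == '\r')
    line[--len] = '\0';
  return len;
}
```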
{
FILE *fp;
char *upwd;
+ char *htcldir; /* HTML-clean dir name */
char *htclfile; /* HTML-clean file name */
+ char *urlclfile; /* URL-clean file name */
if (!output_stream)
{
}
else
upwd = xstrdup ("");
+
+ htcldir = html_quote_string (u->dir);
+
fprintf (fp, "<!DOCTYPE HTML PUBLIC \"-//IETF//DTD HTML 2.0//EN\">\n");
fprintf (fp, "<html>\n<head>\n<title>");
- fprintf (fp, _("Index of /%s on %s:%d"), u->dir, u->host, u->port);
+ fprintf (fp, _("Index of /%s on %s:%d"), htcldir, u->host, u->port);
fprintf (fp, "</title>\n</head>\n<body>\n<h1>");
- fprintf (fp, _("Index of /%s on %s:%d"), u->dir, u->host, u->port);
+ fprintf (fp, _("Index of /%s on %s:%d"), htcldir, u->host, u->port);
fprintf (fp, "</h1>\n<hr>\n<pre>\n");
+
while (f)
{
fprintf (fp, " ");
break;
}
htclfile = html_quote_string (f->name);
+ urlclfile = url_escape_unsafe_and_reserved (f->name);
fprintf (fp, "<a href=\"ftp://%s%s:%d", upwd, u->host, u->port);
if (*u->dir != '/')
putc ('/', fp);
- fprintf (fp, "%s", u->dir);
+ /* XXX: Should probably URL-escape dir components here, rather
+ * than just HTML-escape, for consistency with the next bit where
+ * we use urlclfile for the file component. Anyway, this is safer
+ * than what we had... */
+ fprintf (fp, "%s", htcldir);
if (*u->dir)
putc ('/', fp);
- fprintf (fp, "%s", htclfile);
+ fprintf (fp, "%s", urlclfile);
if (f->type == FT_DIRECTORY)
putc ('/', fp);
fprintf (fp, "\">%s", htclfile);
fprintf (fp, "-> %s", f->linkto ? f->linkto : "(nil)");
putc ('\n', fp);
xfree (htclfile);
+ xfree (urlclfile);
f = f->next;
}
fprintf (fp, "</pre>\n</body>\n</html>\n");
+ xfree (htcldir);
xfree (upwd);
if (!output_stream)
fclose (fp);
logprintf (LOG_VERBOSE, " (%s)", human_readable (size));
if (start > 0)
{
- if (start >= 1024)
+ if (size - start >= 1024)
logprintf (LOG_VERBOSE, _(", %s (%s) remaining"),
number_to_static_string (size - start),
human_readable (size - start));
user = user ? user : (opt.http_user ? opt.http_user : opt.user);
passwd = passwd ? passwd : (opt.http_passwd ? opt.http_passwd : opt.passwd);
- if (user && passwd
- && !u->user) /* We only do "site-wide" authentication with "global"
- user/password values; URL user/password info overrides. */
+ /* We only do "site-wide" authentication with "global" user/password
+ * values unless --auth-no-challenge has been requested; URL user/password
+ * info overrides. */
+ if (user && passwd && (!u->user || opt.auth_without_challenge))
{
/* If this is a host for which we've already received a Basic
* challenge, we'll go ahead and send Basic authentication creds. */
}
}
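The revised condition above can be summarized as a small predicate: global user/password values apply only when the URL itself carries no credentials, unless `--auth-no-challenge` forces them. This helper is purely illustrative (not a function in Wget); the parameter names paraphrase `u->user` and `opt.auth_without_challenge`:

```c
#include <stdbool.h>

/* Sketch of the decision from the hunk above: should site-wide
   ("global") credentials be sent for this request? */
static bool
use_global_creds (bool have_user, bool have_passwd,
                  bool url_has_user, bool auth_no_challenge)
{
  /* URL-embedded credentials normally take precedence, but
     --auth-no-challenge re-enables the global pair regardless. */
  return have_user && have_passwd
         && (!url_has_user || auth_no_challenge);
}
```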
- if (statcode == HTTP_STATUS_RANGE_NOT_SATISFIABLE)
+ if (statcode == HTTP_STATUS_RANGE_NOT_SATISFIABLE
+ || (hs->restval > 0 && statcode == HTTP_STATUS_OK
+ && contrange == 0 && hs->restval >= contlen)
+ )
{
/* If `-c' is in use and the file has been fully downloaded (or
the remote file has shrunk), Wget effectively requests bytes
- after the end of file and the server response with 416. */
+ after the end of file, and the server responds with 416
+ (or with 200 and a Content-Length <= the requested offset). */
logputs (LOG_VERBOSE, _("\
\n The file is already fully retrieved; nothing to do.\n\n"));
/* In case the caller inspects. */
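The broadened test in the hunk above handles servers that answer a ranged request for bytes past EOF with a plain 200 and the full Content-Length instead of 416. A standalone sketch of the combined condition (a hypothetical helper; the names `restval`, `contrange`, and `contlen` mirror the patch, but this is not a function in Wget):

```c
#include <stdbool.h>

/* True when `-c' finds nothing left to download: either the server
   says 416 outright, or it ignores the Range header and replies 200
   with a Content-Length no larger than what we already have. */
static bool
already_fully_retrieved (int statcode, long restval,
                         long contrange, long contlen)
{
  if (statcode == 416)          /* HTTP_STATUS_RANGE_NOT_SATISFIABLE */
    return true;
  return restval > 0 && statcode == 200
         && contrange == 0 && restval >= contlen;
}
```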
char *
home_dir (void)
{
- char *home = getenv ("HOME");
+ static char buf[PATH_MAX];
+ static char *home;
if (!home)
{
+ home = getenv ("HOME");
+ if (!home)
+ {
#if defined(MSDOS)
- /* Under MSDOS, if $HOME isn't defined, use the directory where
- `wget.exe' resides. */
- const char *_w32_get_argv0 (void); /* in libwatt.a/pcconfig.c */
- char *p, buf[PATH_MAX];
-
- strcpy (buf, _w32_get_argv0 ());
- p = strrchr (buf, '/'); /* djgpp */
- if (!p)
- p = strrchr (buf, '\\'); /* others */
- assert (p);
- *p = '\0';
- home = buf;
+ /* Under MSDOS, if $HOME isn't defined, use the directory where
+ `wget.exe' resides. */
+ const char *_w32_get_argv0 (void); /* in libwatt.a/pcconfig.c */
+ char *p;
+
+ strcpy (buf, _w32_get_argv0 ());
+ p = strrchr (buf, '/'); /* djgpp */
+ if (!p)
+ p = strrchr (buf, '\\'); /* others */
+ assert (p);
+ *p = '\0';
+ home = buf;
#elif !defined(WINDOWS)
- /* If HOME is not defined, try getting it from the password
- file. */
- struct passwd *pwd = getpwuid (getuid ());
- if (!pwd || !pwd->pw_dir)
- return NULL;
- home = pwd->pw_dir;
+ /* If HOME is not defined, try getting it from the password
+ file. */
+ struct passwd *pwd = getpwuid (getuid ());
+ if (!pwd || !pwd->pw_dir)
+ return NULL;
+ strcpy (buf, pwd->pw_dir);
+ home = buf;
#else /* !WINDOWS */
- /* Under Windows, if $HOME isn't defined, use the directory where
- `wget.exe' resides. */
- home = ws_mypath ();
+ /* Under Windows, if $HOME isn't defined, use the directory where
+ `wget.exe' resides. */
+ home = ws_mypath ();
#endif /* WINDOWS */
+ }
}
return home ? xstrdup (home) : NULL;
}
return NULL;
}
+
/* Check for the existence of '$HOME/.wgetrc' and return its path
if it exists and is set. */
char *
wgetrc_user_file_name (void)
{
- char *home = home_dir();
+ char *home = home_dir ();
char *file = NULL;
if (home)
file = aprintf ("%s/.wgetrc", home);
}
return file;
}
+
/* Return the path to the user's .wgetrc. This is either the value of
`WGETRC' environment variable, or `$HOME/.wgetrc'.
char *
wgetrc_file_name (void)
{
+ char *home = NULL;
char *file = wgetrc_env_file_name ();
if (file && *file)
return file;
-
+
file = wgetrc_user_file_name ();
#ifdef WINDOWS
`wget.ini' in the directory where `wget.exe' resides; we do this for
backward compatibility with previous versions of Wget.
SYSTEM_WGETRC should not be defined under WINDOWS. */
+ home = home_dir ();
if (!file || !file_exists_p (file))
{
xfree_null (file);
if (home)
file = aprintf ("%s/wget.ini", home);
}
+ xfree_null (home);
#endif /* WINDOWS */
if (!file)
extern char *link_string;
/* defined in build_info.c */
extern char *compiled_features[];
-extern char *system_wgetrc;
-extern char *locale_dir;
/* Used for --version output in print_version */
static const int max_chars_per_line = 72;
and an appropriate number of spaces are added on subsequent
lines.*/
static void
-format_and_print_line (char* prefix, char* line,
- int line_length)
+format_and_print_line (const char *prefix, const char *line,
+ int line_length)
{
+ int leading_spaces;
+ int remaining_chars;
+ char *line_dup, *token;
+
assert (prefix != NULL);
assert (line != NULL);
+ line_dup = xstrdup (line);
+
if (line_length <= 0)
line_length = max_chars_per_line;
- const int leading_spaces = strlen (prefix);
+ leading_spaces = strlen (prefix);
printf ("%s", prefix);
- int remaining_chars = line_length - leading_spaces;
+ remaining_chars = line_length - leading_spaces;
/* We break on spaces. */
- char* token = strtok (line, " ");
+ token = strtok (line_dup, " ");
while (token != NULL)
{
/* If however a token is much larger than the maximum
token on the next line. */
if (remaining_chars <= strlen (token))
{
+ int j;
printf ("\n");
- int j = 0;
+ j = 0;
for (j = 0; j < leading_spaces; j++)
{
printf (" ");
}
printf ("\n");
- xfree (prefix);
- xfree (line);
+
+ xfree (line_dup);
}
static void
const char *link_title = "Link : ";
const char *prefix_spaces = " ";
const int prefix_space_length = strlen (prefix_spaces);
+ char *line;
+ char *env_wgetrc, *user_wgetrc;
+ int i;
printf ("GNU Wget %s\n", version_string);
printf (options_title);
/* compiled_features is a char*[]. We limit the characters per
line to max_chars_per_line and prefix each line with a constant
number of spaces for proper alignment. */
- int i =0;
for (i = 0; compiled_features[i] != NULL; )
{
int line_length = max_chars_per_line - prefix_space_length;
/* Handle the case when $WGETRC is unset and $HOME/.wgetrc is
absent. */
printf (wgetrc_title);
- char *env_wgetrc = wgetrc_env_file_name ();
+ env_wgetrc = wgetrc_env_file_name ();
if (env_wgetrc && *env_wgetrc)
{
printf ("%s (env)\n%s", env_wgetrc, prefix_spaces);
xfree (env_wgetrc);
}
- char *user_wgetrc = wgetrc_user_file_name ();
+ user_wgetrc = wgetrc_user_file_name ();
if (user_wgetrc)
{
printf ("%s (user)\n%s", user_wgetrc, prefix_spaces);
xfree (user_wgetrc);
}
- printf ("%s (system)\n", system_wgetrc);
+#ifdef SYSTEM_WGETRC
+ printf ("%s (system)\n", SYSTEM_WGETRC);
+#else
+ putchar ('\n');
+#endif
- format_and_print_line (strdup (locale_title),
- strdup (locale_dir),
+ format_and_print_line (locale_title,
+ LOCALEDIR,
max_chars_per_line);
- format_and_print_line (strdup (compile_title),
- strdup (compilation_string),
+ format_and_print_line (compile_title,
+ compilation_string,
max_chars_per_line);
- format_and_print_line (strdup (link_title),
- strdup (link_string),
+ format_and_print_line (link_title,
+ link_string,
max_chars_per_line);
+
printf ("\n");
/* TRANSLATORS: When available, an actual copyright character
(cirle-c) should be used in preference to "(C)". */
stdout);
fputs (_("Currently maintained by Micah Cowan <micah@cowan.name>.\n"),
stdout);
+ fputs (_("Please send bug reports and questions to <bug-wget@gnu.org>.\n"),
+ stdout);
exit (0);
}
+char *program_name; /* Needed by lib/error.c. */
+
int
main (int argc, char **argv)
{
int nurl, status;
bool append_to_log = false;
+ program_name = argv[0];
+
i18n_initialize ();
/* Construct the name of the executable, without the directory part. */
# define strncasecmp strnicmp
#endif
+#include <stdio.h>
+
/* The same for snprintf() and vsnprintf(). */
#define snprintf _snprintf
#define vsnprintf _vsnprintf
char *hunk = xmalloc (bufsize);
int tail = 0; /* tail position in HUNK */
- assert (maxsize >= bufsize);
+ assert (!maxsize || maxsize >= bufsize);
while (1)
{
return url_escape_1 (s, urlchr_unsafe, false);
}
+/* URL-escape the unsafe and reserved characters (see urlchr_table) in
+ a given string, returning a freshly allocated string. */
+
+char *
+url_escape_unsafe_and_reserved (const char *s)
+{
+ return url_escape_1 (s, urlchr_unsafe|urlchr_reserved, false);
+}
+
/* URL-escape the unsafe characters (see urlchr_table) in a given
string. If no characters are unsafe, S is returned. */
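The new `url_escape_unsafe_and_reserved` exists so that a raw FTP filename can be embedded in the path component of a generated link: characters like `&`, `?`, or a space must be percent-encoded there, not merely HTML-escaped. A simplified, self-contained sketch of the idea (Wget's real implementation is table-driven via `urlchr_table`; the reserved set and helper name here are illustrative only):

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Percent-encode a handful of RFC 3986 reserved (and a few unsafe)
   characters so an arbitrary filename is safe inside a URL path.
   Returns a freshly allocated string the caller must free. */
static char *
escape_reserved (const char *s)
{
  const char *reserved = ":/?#[]@!$&'()*+,;= \"<>%";
  char *out = malloc (strlen (s) * 3 + 1);
  char *p = out;
  for (; *s; s++)
    {
      if (strchr (reserved, *s))
        p += sprintf (p, "%%%02X", (unsigned char) *s);
      else
        *p++ = *s;
    }
  *p = '\0';
  return out;
}
```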
if ((p = strchr (scheme, ':')))
*p = '\0';
if (!strcasecmp (scheme, "https"))
- asprintf (&error, _("HTTPS support not compiled in"));
+ error = aprintf (_("HTTPS support not compiled in"));
else
- asprintf (&error, _(parse_errors[error_code]), quote (scheme));
+ error = aprintf (_(parse_errors[error_code]), quote (scheme));
xfree (scheme);
return error;
/* Function declarations */
char *url_escape (const char *);
+char *url_escape_unsafe_and_reserved (const char *);
struct url *url_parse (const char *, int *, struct iri *iri);
char *url_error (const char *, int);
+2008-11-12 Steven Schubiger <stsc@members.fsf.org>
+
+ * Test-auth-basic.px, Test-auth-no-challenge.px,
+ Test-auth-no-challenge-url.px, Test-c-full.px,
+ Test-c-partial.px, Test-c.px, Test-c-shorter.px,
+ Test-E-k-K.px, Test-E-k.px, Test-ftp.px,
+ Test-HTTP-Content-Disposition-1.px,
+ Test-HTTP-Content-Disposition-2.px,
+ Test-HTTP-Content-Disposition.px, Test-N-current.px,
+ Test-N-HTTP-Content-Disposition.px,
+ Test-N--no-content-disposition.px,
+ Test-N--no-content-disposition-trivial.px,
+ Test-N-no-info.px, Test--no-content-disposition.px,
+ Test--no-content-disposition-trivial.px, Test-N-old.px,
+ Test-nonexisting-quiet.px, Test-noop.px, Test-np.px,
+ Test-N.px, Test-N-smaller.px,
+ Test-O-HTTP-Content-Disposition.px, Test-O-nc.px,
+ Test-O--no-content-disposition.px,
+ Test-O--no-content-disposition-trivial.px,
+ Test-O-nonexisting.px, Test-O.px,
+ Test-proxy-auth-basic.px, Test-Restrict-Lowercase.px,
+ Test-Restrict-Uppercase.px,
+ Test--spider-fail.pxm, Test--spider.px,
+ Test--spider-r-HTTP-Content-Disposition.px,
+ Test--spider-r--no-content-disposition.px,
+ Test--spider-r--no-content-disposition-trivial.px,
+ Test--spider-r.px: Enforce lexically scoped warnings.
+
+ * Test-proxied-https-auth.px, run-px: Place use strict
+ before use warnings.
+
+2008-11-12 Steven Schubiger <stsc@members.fsf.org>
+
+ * FTPServer.pm, FTPTest.pm, HTTPServer.pm, HTTPTest.pm:
+ Remove the magic interpreter line, because it cannot be
+ used fully. Substitute -w with use warnings.
+
+2008-11-11 Micah Cowan <micah@cowan.name>
+
+ * HTTPServer.pm (handle_auth): Allow testing of
+ --auth-no-challenge.
+
+ * Test-auth-no-challenge.px, Test-auth-no-challenge-url.px:
+ Added.
+
+ * run-px: Add Test-auth-no-challenge.px,
+ Test-auth-no-challenge-url.px.
+
+2008-11-07 Steven Schubiger <stsc@members.fsf.org>
+
+ * run-px: Use some colors for the summary part of the test
+ output to strengthen the distinction between a successful
+ or failing run.
+
+2008-11-06 Steven Schubiger <stsc@members.fsf.org>
+
+ * run-px: When executing test scripts, invoke them with the
+ current perl executable name as determined by env.
+
+2008-11-06 Micah Cowan <micah@cowan.name>
+
+ * run-px: Use strict (thanks Steven Schubiger!).
+
2008-09-09 Micah Cowan <micah@cowan.name>
* Test-idn-cmd.px: Added.
-#!/usr/bin/perl -w
-
# Part of this code was borrowed from Richard Jones's Net::FTPServer
# http://www.annexia.org/freeware/netftpserver
package FTPServer;
use strict;
+use warnings;
use Cwd;
use Socket;
-#!/usr/bin/perl -w
-
package FTPTest;
use strict;
+use warnings;
use FTPServer;
use WgetTest;
-#!/usr/bin/perl -w
-
package HTTPServer;
use strict;
+use warnings;
use HTTP::Daemon;
use HTTP::Status;
my $authhdr = $req->header('Authorization');
# Have we sent the challenge yet?
- unless (defined $url_rec->{auth_challenged}
- && $url_rec->{auth_challenged}) {
+ unless ($url_rec->{auth_challenged} || $url_rec->{auth_no_challenge}) {
# Since we haven't challenged yet, we'd better not
# have received authentication (for our testing purposes).
if ($authhdr) {
# failed it.
$code = 400;
$msg = "You didn't send auth after I sent challenge";
+ if ($url_rec->{auth_no_challenge}) {
+ $msg = "--auth-no-challenge but no auth sent.";
+ }
} else {
my ($sent_method) = ($authhdr =~ /^(\S+)/g);
unless ($sent_method eq $url_rec->{'auth_method'}) {
-#!/usr/bin/perl -w
-
package HTTPTest;
use strict;
+use warnings;
use HTTPServer;
use WgetTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
--- /dev/null
+#!/usr/bin/perl
+
+use strict;
+use warnings;
+
+use HTTPTest;
+
+
+###############################################################################
+
+my $wholefile = "You're all authenticated.\n";
+
+# code, msg, headers, content
+my %urls = (
+ '/needs-auth.txt' => {
+ auth_no_challenge => 1,
+ auth_method => 'Basic',
+ user => 'fiddle-dee-dee',
+ passwd => 'Dodgson',
+ code => "200",
+ msg => "You want fries with that?",
+ headers => {
+ "Content-type" => "text/plain",
+ },
+ content => $wholefile,
+ },
+);
+
+my $cmdline = $WgetTest::WGETPATH . " --auth-no-challenge "
+ . "http://fiddle-dee-dee:Dodgson\@localhost:{{port}}/needs-auth.txt";
+
+my $expected_error_code = 0;
+
+my %expected_downloaded_files = (
+ 'needs-auth.txt' => {
+ content => $wholefile,
+ },
+);
+
+###############################################################################
+
+my $the_test = HTTPTest->new (name => "Test-auth-no-challenge-url",
+ input => \%urls,
+ cmdline => $cmdline,
+ errcode => $expected_error_code,
+ output => \%expected_downloaded_files);
+exit $the_test->run();
+
+# vim: et ts=4 sw=4
+
--- /dev/null
+#!/usr/bin/perl
+
+use strict;
+use warnings;
+
+use HTTPTest;
+
+
+###############################################################################
+
+my $wholefile = "You're all authenticated.\n";
+
+# code, msg, headers, content
+my %urls = (
+ '/needs-auth.txt' => {
+ auth_no_challenge => 1,
+ auth_method => 'Basic',
+ user => 'fiddle-dee-dee',
+ passwd => 'Dodgson',
+ code => "200",
+ msg => "You want fries with that?",
+ headers => {
+ "Content-type" => "text/plain",
+ },
+ content => $wholefile,
+ },
+);
+
+my $cmdline = $WgetTest::WGETPATH . " --auth-no-challenge"
+ . " --user=fiddle-dee-dee --password=Dodgson"
+ . " http://localhost:{{port}}/needs-auth.txt";
+
+my $expected_error_code = 0;
+
+my %expected_downloaded_files = (
+ 'needs-auth.txt' => {
+ content => $wholefile,
+ },
+);
+
+###############################################################################
+
+my $the_test = HTTPTest->new (name => "Test-auth-no-challenge",
+ input => \%urls,
+ cmdline => $cmdline,
+ errcode => $expected_error_code,
+ output => \%expected_downloaded_files);
+exit $the_test->run();
+
+# vim: et ts=4 sw=4
+
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use FTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
#!/usr/bin/perl
-use warnings;
+
use strict;
+use warnings;
use WgetTest; # For $WGETPATH.
-#!/usr/bin/perl -w
+#!/usr/bin/perl
use strict;
+use warnings;
use HTTPTest;
#!/usr/bin/env perl
+
+use 5.006;
+use strict;
use warnings;
+use Term::ANSIColor ':constants';
+$Term::ANSIColor::AUTORESET = 1;
+
die "Please specify the top source directory.\n" if (!@ARGV);
my $top_srcdir = shift @ARGV;
my @tests = (
'Test-auth-basic.px',
+ 'Test-auth-no-challenge.px',
+ 'Test-auth-no-challenge-url.px',
'Test-proxy-auth-basic.px',
'Test-proxied-https-auth.px',
'Test-N-HTTP-Content-Disposition.px',
'Test--spider-r.px',
);
-my @results;
+my @tested;
-for my $test (@tests) {
+foreach my $test (@tests) {
print "Running $test\n\n";
- system("$top_srcdir/tests/$test");
- push @results, $?;
+ system("$^X $top_srcdir/tests/$test");
+ push @tested, { name => $test, result => $? };
}
-for (my $i=0; $i != @tests; ++$i) {
- if ($results[$i] == 0) {
- print "pass: ";
- } else {
- print "FAIL: ";
- }
- print "$tests[$i]\n";
+print "\n";
+foreach my $test (@tested) {
+ ($test->{result} == 0)
+ ? print GREEN 'pass: '
+ : print RED 'FAIL: ';
+ print $test->{name}, "\n";
}
+my $count = sub
+{
+ return {
+ pass => sub { scalar grep $_->{result} == 0, @tested },
+ fail => sub { scalar grep $_->{result} != 0, @tested },
+ }->{$_[0]}->();
+};
+
+my $summary = sub
+{
+ my @lines = (
+ "${\scalar @tested} tests were run",
+ "${\$count->('pass')} PASS, ${\$count->('fail')} FAIL",
+ );
+ my $len_longest = sub
+ {
+ local $_ = 0;
+ foreach my $line (@lines) {
+ if (length $line > $_) {
+ $_ = length $line;
+ }
+ }
+ return $_;
+ }->();
+ return join "\n",
+ '=' x $len_longest,
+ @lines,
+ '=' x $len_longest;
+}->();
+
+print "\n";
+print $count->('fail')
+ ? RED $summary
+ : GREEN $summary;
print "\n";
-print scalar(@results) . " tests were run\n";
-print scalar(grep $_ == 0, @results) . " PASS\n";
-print scalar(grep $_ != 0, @results) . " FAIL\n";
-exit scalar (grep $_ != 0, @results);
+exit $count->('fail');
--- /dev/null
+#!/usr/bin/perl -n
+# NOTE the use of -n above; this script is called in a loop.
+use warnings;
+use strict;
+
+our $scanning;
+our %used_chars;
+BEGIN {
+ $scanning = 0;
+ %used_chars = ();
+
+ open STDIN, "../src/main.c" or die "main.c: $!\n";
+}
+
+if (/^static struct cmdline_option option_data/) {
+ $scanning = 1;
+}
+elsif (/[}];/) {
+ $scanning = 0;
+}
+elsif (
+ $scanning &&
+ /^[\t ]*\{ "[^"]*", '(.)', OPT_[A-Z0-9_]*, /
+) {
+ $used_chars{$1} = 1;
+}
+
+END {
+ my $cols = 0;
+ my $max_cols = 13;
+ my $opt_chars =
+ "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789";
+ print "Free chars:\n\t";
+ for (my $i = 0; $i < length $opt_chars; ++$i, ++$cols) {
+ if ($cols == $max_cols) {
+ $cols = 0;
+ print "\n\t";
+ }
+ my $opt = substr($opt_chars,$i,1);
+ print ' ';
+ if (!$used_chars{ $opt }) {
+ print "-$opt";
+ } else {
+ print ' ';
+ }
+ }
+ print "\n";
+}
+2008-09-09 Gisle Vanem <gvanem@broadpark.no>
+
+ * config-compiler.h: MinGW does have <stdint.h>; added HAVE_STDINT_H.
+ Added _CRT_SECURE_NO_WARNINGS to suppress warnings in MSVC8+ about
+ using "old" ANSI functions.
+
+ * config.h: config-post.h is gone. SIZEOF_LONG_LONG is 8.
+
2008-01-25 Micah Cowan <micah@cowan.name>
* Makefile.am, Makefile.doc, Makefile.src, Makefile.top,
/* MinGW and GCC support some POSIX and C99 features. */
#define HAVE_INTTYPES_H 1
+#define HAVE_STDINT_H 1
#define HAVE__BOOL 1
#undef SIZEOF_LONG_LONG /* avoid redefinition warning */
#if _MSC_VER >= 1400
#pragma warning ( disable : 4996 )
#define _CRT_SECURE_NO_DEPRECATE
+#define _CRT_SECURE_NO_WARNINGS
#endif
\f
#define SIZEOF_LONG 4
/* The size of a `long long', as computed by sizeof. */
-#define SIZEOF_LONG_LONG 0
+#define SIZEOF_LONG_LONG 8
/* The size of a `off_t', as computed by sizeof. */
#define SIZEOF_OFF_T 4
/* Include compiler-specific defines. */
#include "config-compiler.h"
-#include "config-post.h"
-