From: Micah Cowan Date: Mon, 1 Dec 2008 15:05:29 +0000 (-0800) Subject: Automated merge. X-Git-Tag: v1.13~338^2~2 X-Git-Url: http://sjero.net/git/?p=wget;a=commitdiff_plain;h=1b4ed7dcb7bdad277a9ee2c5a42b6e70854db802;hp=66dd4bda74bb78915b92cac4e7bfd32a3fe9d957 Automated merge. --- diff --git a/ChangeLog b/ChangeLog index c19c374f..a891c52e 100644 --- a/ChangeLog +++ b/ChangeLog @@ -1,3 +1,18 @@ +2008-11-10 Micah Cowan + + * MAILING-LIST: Mention Gmane, introduce subsections. + +2008-11-05 Micah Cowan + + * MAILING-LIST: Mention moderation for unsubscribed posts, and + archive location. + +2008-10-31 Micah Cowan + + * MAILING-LIST: Update information. + + * NEWS: Add mention of mailing list move. + 2008-08-01 Joao Ferreira * NEWS: Added option --default-page to support alternative diff --git a/MAILING-LIST b/MAILING-LIST index bab1629c..9f6212a8 100644 --- a/MAILING-LIST +++ b/MAILING-LIST @@ -1,33 +1,50 @@ -Mailing List -================ - -There are several Wget-related mailing lists. The general discussion -list is at . It is the preferred place for support -requests and suggestions, as well as for discussion of development. -You are invited to subscribe. - - To subscribe, simply send mail to and -follow the instructions. Unsubscribe by mailing to -. The mailing list is archived at -`http://www.mail-archive.com/wget%40sunsite.dk/' and at -`http://news.gmane.org/gmane.comp.web.wget.general'. - - Another mailing list is at , and is used to -submit patches for review by Wget developers. A "patch" is a textual -representation of change to source code, readable by both humans and -programs. The file `PATCHES' that comes with Wget covers the creation -and submitting of patches in detail. Please don't send general -suggestions or bug reports to `wget-patches'; use it only for patch -submissions. - - Subscription is the same as above for , except that -you send to , instead. The mailing -list is archived at `http://news.gmane.org/gmane.comp.web.wget.patches'. - - Finally, there is the mailing list. -This is a non-discussion list that receives commit notifications from -the source repository, and also bug report-change notifications. This -is the highest-traffic list for Wget, and is recommended only for -people who are seriously interested in ongoing Wget development. -Subscription is through the `mailman' interface at +Mailing Lists +============= + +Primary List +------------ + +The primary mailinglist for discussion, bug-reports, or questions about +GNU Wget is at . To subscribe, send an email to +, or visit +`http://lists.gnu.org/mailman/listinfo/bug-wget'. + + You do not need to subscribe to send a message to the list; however, +please note that unsubscribed messages are moderated, and may take a +while before they hit the list--*usually around a day*. If you want +your message to show up immediately, please subscribe to the list +before posting. Archives for the list may be found at +`http://lists.gnu.org/pipermail/bug-wget/'. + + An NNTP/Usenettish gateway is also available via Gmane +(http://gmane.org/about.php). You can see the Gmane archives at +`http://news.gmane.org/gmane.comp.web.wget.general'. Note that the +Gmane archives conveniently include messages from both the current +list, and the previous one. Messages also show up in the Gmane archives +sooner than they do at `lists.gnu.org'. + +Bug Notices List +---------------- + +Additionally, there is the mailing +list. This is a non-discussion list that receives bug report +notifications from the bug-tracker. 
To subscribe to this list, send an +email to , or visit `http://addictivecode.org/mailman/listinfo/wget-notify'. + +Obsolete Lists +-------------- + +Previously, the mailing list was used as the main +discussion list, and another list, was used +for submitting and discussing patches to GNU Wget. + + Messages from are archived at + `http://www.mail-archive.com/wget%40sunsite.dk/' and at + + `http://news.gmane.org/gmane.comp.web.wget.general' (which also + continues to archive the current list, ). + + Messages from are archived at + `http://news.gmane.org/gmane.comp.web.wget.patches'. + diff --git a/NEWS b/NEWS index 7d15a98a..3ae5e5ad 100644 --- a/NEWS +++ b/NEWS @@ -8,6 +8,8 @@ Please send GNU Wget bug reports to . * Changes in Wget 1.12 (MAINLINE) +** Mailing list MOVED to bug-wget@gnu.org + ** --default-page option added to support alternative default names for index.html. @@ -27,6 +29,9 @@ support password prompts at the console. ** The --input-file option now also handles retrieving links from an external file. + +** Several previously existing, but undocumented .wgetrc options +are now documented: save_headers, spider, and user_agent. * Changes in Wget 1.11.4 diff --git a/doc/ChangeLog b/doc/ChangeLog index 94a06283..dc1d4084 100644 --- a/doc/ChangeLog +++ b/doc/ChangeLog @@ -1,3 +1,47 @@ +2008-11-15 Steven Schubiger + + * sample.wgetrc: Comment the waitretry "default" value, + because there is a global one now. + + * wget.texi (Download Options): Mention the global + default value. + +2008-11-10 Micah Cowan + + * Makefile.am (EXTRA_DIST): Removed no-longer-present + README.maint (shouldn't have been there in the first place). + + * wget.texi (Mailing Lists): Added information aboug Gmane portal, + added subsection headings. + + Update node pointers. + +2008-11-05 Micah Cowan + + * wget.texi: Move --no-http-keep-alive from FTP Options to HTTP + Options. + (Mailing List): Mention moderation for unsubscribed posts, and + archive location. + +2008-11-04 Micah Cowan + + * wget.texi, fdl.texi: Updated to FDL version 1.3. + +2008-10-31 Micah Cowan + + * wget.texi (Mailing List): Update info to reflect change to + bug-wget@gnu.org. + +2008-09-30 Steven Schubiger + + * wget.texi (Wgetrc Commands): Add default_page, save_headers, + spider and user_agent to the list of recognized commands. 
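The NEWS item and the 2008-09-30 ChangeLog entry above refer to .wgetrc commands that are now documented (default_page, save_headers, spider, user_agent) and to waitretry gaining a built-in default. A rough illustration of how these look in a user's ~/.wgetrc, following the name = value syntax used by sample.wgetrc; the values here are arbitrary placeholders, not recommendations:

    # Newly documented commands:
    default_page = index.php
    save_headers = off
    spider = off
    user_agent = ExampleAgent/1.0
    # waitretry now has a built-in default of 10, so sample.wgetrc
    # ships with it commented out:
    #waitretry = 10
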
+ +2008-09-10 Michael Kessler + + * wget.texi (Robot Exclusion): Fixed typo "downloads" -> + "download" + 2008-08-03 Xavier Saint * wget.texi : Add option descriptions for the three new diff --git a/doc/Makefile.am b/doc/Makefile.am index c0e10dbf..74abe7f6 100644 --- a/doc/Makefile.am +++ b/doc/Makefile.am @@ -48,7 +48,8 @@ $(SAMPLERCTEXI): $(srcdir)/sample.wgetrc info_TEXINFOS = wget.texi wget_TEXINFOS = fdl.texi sample.wgetrc.munged_for_texi_inclusion -EXTRA_DIST = README.maint sample.wgetrc $(SAMPLERCTEXI) \ +EXTRA_DIST = sample.wgetrc \ + $(SAMPLERCTEXI) \ texi2pod.pl wget.pod: $(srcdir)/wget.texi $(srcdir)/version.texi diff --git a/doc/README.maint b/doc/README.maint deleted file mode 100644 index 71eccbf1..00000000 --- a/doc/README.maint +++ /dev/null @@ -1,130 +0,0 @@ - -TO RELEASE WGET X.Y.Z: - -1) update PO files from the TP - -cd po -../util/update_po_files.sh - - -2) generate tarball - -from the trunk: - -cd ~/tmp -~/code/svn/wget/trunk/util/dist-wget --force-version X.Y.Z - -from a branch: - -cd ~/tmp -~/code/svn/wget/branches/X.Y/util/dist-wget --force-version X.Y.Z -b branches/X.Y - - -3) test the tarball - - -4) set new version number "X.Y.Z" on the repository - - -5) tag the sources in subversion - -from the trunk: - -svn copy -m "Tagging release X.Y.Z" http://svn.dotsrc.org/repo/wget/trunk http://svn.dotsrc.org/repo/wget/tags/WGET_X_Y_Z/ - -from a branch: - -svn copy -m "Tagging release X.Y.Z" http://svn.dotsrc.org/repo/wget/branches/X.Y/ http://svn.dotsrc.org/repo/wget/tags/WGET_X_Y_Z/ - - -6) upload the tarball on gnu.org - -RELEASE=X.Y.Z -TARBALL=wget-${RELEASE}.tar.gz -gpg --default-key 7B2FD4B0 --detach-sign -b --output ${TARBALL}.sig $TARBALL -echo -e "version: 1.1\ndirectory: wget\nfilename: $TARBALL\ncomment: Wget release ${RELEASE}" > ${TARBALL}.directive -gpg --default-key 7B2FD4B0 --clearsign ${TARBALL}.directive - -lftp ftp://ftp-upload.gnu.org/incoming/ftp -(use ftp://ftp-upload.gnu.org/incoming/alpha for pre-releases) - -put wget-X.Y.Z.tar.gz -put wget-X.Y.Z.tar.gz.sig -put wget-X.Y.Z.tar.gz.directive.asc - - - -7) update wget.sunsite.dk and gnu.org/software/wget - - -8) send announcement on wget@sunsite.dk: - -hi to everybody, - -i have just uploaded the wget X.Y.Z tarball on ftp.gnu.org: - -ftp://ftp.gnu.org/gnu/wget/wget-X.Y.Z.tar.gz - -you can find the GPG signature of the tarball at these URLs: - -ftp://ftp.gnu.org/gnu/wget/wget-X.Y.Z.tar.gz.sig - -and the GPG key i have used for the signature at this URL: - -http://www.tortonesi.com/GNU-GPG-Key.txt - -the key fingerprint is: - -pub 1024D/7B2FD4B0 2005-06-02 Mauro Tortonesi (GNU Wget Maintainer) - - Key fingerprint = 1E90 AEA8 D511 58F0 94E5 B106 7220 24E9 7B2F D4B0 - -the MD5 checksum of the tarball is: - -MD5 of tarball wget-X.Y.Z.tar.gz - -{DESCRIPTION OF THE CHANGES} - - -9) send announcement on info-gnu@gnu.org - -I'm very pleased to announce the availability of GNU Wget X.Y.Z. - -GNU Wget is a non-interactive command-line tool for retrieving files using -HTTP, HTTPS and FTP, which may easily be called from scripts, cron jobs, -terminals without X-Windows support, etc. 
- -For more information, please see: - - http://www.gnu.org/software/wget - http://wget.sunsite.dk - -Here are the compressed sources and the GPG detached signature: - -ftp://ftp.gnu.org/gnu/wget/wget-X.Y.Z.tar.gz -ftp://ftp.gnu.org/gnu/wget/wget-X.Y.Z.tar.gz.sig - -The MD5 checksums of the tarball is: - -MD5 of tarball wget-X.Y.Z.tar.gz - - -The GPG key I have used for the tarball signature is available at this URL: - -http://www.tortonesi.com/GNU-GPG-Key.txt - -the key fingerprint is: - -pub 1024D/7B2FD4B0 2005-06-02 Mauro Tortonesi (GNU Wget Maintainer) - - Key fingerprint = 1E90 AEA8 D511 58F0 94E5 B106 7220 24E9 7B2F D4B0 - -{DESCRIPTION OF THE CHANGES} - - -10) post announcement on freshmeat.net - - -11) set new version number "X.Y.Z+devel" on the repository - - diff --git a/doc/fdl.texi b/doc/fdl.texi index 9c6d9afe..8805f1a4 100644 --- a/doc/fdl.texi +++ b/doc/fdl.texi @@ -1,13 +1,12 @@ +@c The GNU Free Documentation License. +@center Version 1.3, 3 November 2008 -@node GNU Free Documentation License -@appendixsec GNU Free Documentation License - -@cindex FDL, GNU Free Documentation License -@center Version 1.2, November 2002 +@c This file is intended to be included within another document, +@c hence no sectioning command or @node. @display -Copyright @copyright{} 2000,2001,2002 Free Software Foundation, Inc. -51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA +Copyright @copyright{} 2000, 2001, 2002, 2007, 2008 Free Software Foundation, Inc. +@uref{http://fsf.org/} Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed. @@ -112,6 +111,9 @@ formats which do not have any title page as such, ``Title Page'' means the text near the most prominent appearance of the work's title, preceding the beginning of the body of the text. +The ``publisher'' means any person or entity that distributes copies +of the Document to the public. + A section ``Entitled XYZ'' means a named subunit of the Document whose title either is precisely XYZ or contains XYZ in parentheses following text that translates XYZ in another language. (Here XYZ stands for a @@ -380,13 +382,30 @@ title. @item TERMINATION -You may not copy, modify, sublicense, or distribute the Document except -as expressly provided for under this License. Any other attempt to -copy, modify, sublicense or distribute the Document is void, and will -automatically terminate your rights under this License. However, -parties who have received copies, or rights, from you under this -License will not have their licenses terminated so long as such -parties remain in full compliance. +You may not copy, modify, sublicense, or distribute the Document +except as expressly provided under this License. Any attempt +otherwise to copy, modify, sublicense, or distribute it is void, and +will automatically terminate your rights under this License. + +However, if you cease all violation of this License, then your license +from a particular copyright holder is reinstated (a) provisionally, +unless and until the copyright holder explicitly and finally +terminates your license, and (b) permanently, if the copyright holder +fails to notify you of the violation by some reasonable means prior to +60 days after the cessation. 
+ +Moreover, your license from a particular copyright holder is +reinstated permanently if the copyright holder notifies you of the +violation by some reasonable means, this is the first time you have +received notice of violation of this License (for any work) from that +copyright holder, and you cure the violation prior to 30 days after +your receipt of the notice. + +Termination of your rights under this section does not terminate the +licenses of parties who have received copies or rights from you under +this License. If your rights have been terminated and not permanently +reinstated, receipt of a copy of some or all of the same material does +not give you any rights to use it. @item FUTURE REVISIONS OF THIS LICENSE @@ -404,7 +423,42 @@ following the terms and conditions either of that specified version or of any later version that has been published (not as a draft) by the Free Software Foundation. If the Document does not specify a version number of this License, you may choose any version ever published (not -as a draft) by the Free Software Foundation. +as a draft) by the Free Software Foundation. If the Document +specifies that a proxy can decide which future versions of this +License can be used, that proxy's public statement of acceptance of a +version permanently authorizes you to choose that version for the +Document. + +@item +RELICENSING + +``Massive Multiauthor Collaboration Site'' (or ``MMC Site'') means any +World Wide Web server that publishes copyrightable works and also +provides prominent facilities for anybody to edit those works. A +public wiki that anybody can edit is an example of such a server. A +``Massive Multiauthor Collaboration'' (or ``MMC'') contained in the +site means any set of copyrightable works thus published on the MMC +site. + +``CC-BY-SA'' means the Creative Commons Attribution-Share Alike 3.0 +license published by Creative Commons Corporation, a not-for-profit +corporation with a principal place of business in San Francisco, +California, as well as future copyleft versions of that license +published by that same organization. + +``Incorporate'' means to publish or republish a Document, in whole or +in part, as part of another Document. + +An MMC is ``eligible for relicensing'' if it is licensed under this +License, and if all works that were first published under this License +somewhere other than this MMC, and subsequently incorporated in whole +or in part into the MMC, (1) had no cover texts or invariant sections, +and (2) were thus incorporated prior to November 1, 2008. + +The operator of an MMC Site may republish an MMC contained in the site +under CC-BY-SA on the same site at any time before August 1, 2009, +provided the MMC is eligible for relicensing. + @end enumerate @page @@ -418,7 +472,7 @@ license notices just after the title page: @group Copyright (C) @var{year} @var{your name}. Permission is granted to copy, distribute and/or modify this document - under the terms of the GNU Free Documentation License, Version 1.2 + under the terms of the GNU Free Documentation License, Version 1.3 or any later version published by the Free Software Foundation; with no Invariant Sections, no Front-Cover Texts, and no Back-Cover Texts. 
A copy of the license is included in the section entitled ``GNU @@ -427,7 +481,7 @@ license notices just after the title page: @end smallexample If you have Invariant Sections, Front-Cover Texts and Back-Cover Texts, -replace the ``with...Texts.'' line with this: +replace the ``with@dots{}Texts.'' line with this: @smallexample @group diff --git a/doc/sample.wgetrc b/doc/sample.wgetrc index 7ef9ef4a..12914aea 100644 --- a/doc/sample.wgetrc +++ b/doc/sample.wgetrc @@ -49,7 +49,7 @@ # downloads, set waitretry to maximum number of seconds to wait (Wget # will use "linear backoff", waiting 1 second after the first failure # on a file, 2 seconds after the second failure, etc. up to this max). -waitretry = 10 +#waitretry = 10 ## diff --git a/doc/wget.texi b/doc/wget.texi index 54e2eb9d..a2804fb4 100644 --- a/doc/wget.texi +++ b/doc/wget.texi @@ -82,7 +82,7 @@ Info entry for @file{wget}. @contents @ifnottex -@node Top +@node Top, Overview, (dir), (dir) @top Wget @value{VERSION} @insertcopying @@ -102,7 +102,7 @@ Info entry for @file{wget}. * Concept Index:: Topics covered by this manual. @end menu -@node Overview +@node Overview, Invoking, Top, Top @chapter Overview @cindex overview @cindex features @@ -211,7 +211,7 @@ Public License, as published by the Free Software Foundation (see the file @file{COPYING} that came with GNU Wget, for details). @end itemize -@node Invoking +@node Invoking, Recursive Download, Overview, Top @chapter Invoking @cindex invoking @cindex command line @@ -248,7 +248,7 @@ the command line. * Recursive Accept/Reject Options:: @end menu -@node URL Format +@node URL Format, Option Syntax, Invoking, Invoking @section URL Format @cindex URL @cindex URL syntax @@ -326,7 +326,7 @@ with your favorite browser, like @code{Lynx} or @code{Netscape}. @c man begin OPTIONS -@node Option Syntax +@node Option Syntax, Basic Startup Options, URL Format, Invoking @section Option Syntax @cindex option syntax @cindex syntax of options @@ -401,7 +401,7 @@ the default. For instance, using @code{follow_ftp = off} in using @samp{--no-follow-ftp} is the only way to restore the factory default from the command line. -@node Basic Startup Options +@node Basic Startup Options, Logging and Input File Options, Option Syntax, Invoking @section Basic Startup Options @table @samp @@ -429,7 +429,7 @@ instances of @samp{-e}. @end table -@node Logging and Input File Options +@node Logging and Input File Options, Download Options, Basic Startup Options, Invoking @section Logging and Input File Options @table @samp @@ -517,7 +517,7 @@ Prepends @var{URL} to relative links read from the file specified with the @samp{-i} option. @end table -@node Download Options +@node Download Options, Directory Options, Logging and Input File Options, Invoking @section Download Options @table @samp @@ -861,10 +861,9 @@ use @dfn{linear backoff}, waiting 1 second after the first failure on a given file, then waiting 2 seconds after the second failure on that file, up to the maximum number of @var{seconds} you specify. Therefore, a value of 10 will actually make Wget wait up to (1 + 2 + ... + 10) = 55 -seconds per file. +seconds per file. -Note that this option is turned on by default in the global -@file{wgetrc} file. +By default, Wget will assume a value of 10 seconds. @cindex wait, random @cindex random wait @@ -1038,7 +1037,7 @@ Prompt for a password for each connection established. Cannot be specified when @samp{--password} is being used, because they are mutually exclusive. 
@end table -@node Directory Options +@node Directory Options, HTTP Options, Download Options, Invoking @section Directory Options @table @samp @@ -1110,7 +1109,7 @@ i.e. the top of the retrieval tree. The default is @samp{.} (the current directory). @end table -@node HTTP Options +@node HTTP Options, HTTPS (SSL/TLS) Options, Directory Options, Invoking @section HTTP Options @table @samp @@ -1170,6 +1169,19 @@ For more information about security issues with Wget, @xref{Security Considerations}. @end iftex +@cindex Keep-Alive, turning off +@cindex Persistent Connections, disabling +@item --no-http-keep-alive +Turn off the ``keep-alive'' feature for HTTP downloads. Normally, Wget +asks the server to keep the connection open so that, when you download +more than one document from the same server, they get transferred over +the same TCP connection. This saves time and at the same time reduces +the load on the server. + +This option is useful when, for some reason, persistent (keep-alive) +connections don't work for you, for example due to a server bug or due +to the inability of server-side scripts to cope with the connections. + @cindex proxy @cindex cache @item --no-cache @@ -1444,7 +1456,7 @@ form-based authentication. @end table -@node HTTPS (SSL/TLS) Options +@node HTTPS (SSL/TLS) Options, FTP Options, HTTP Options, Invoking @section HTTPS (SSL/TLS) Options @cindex SSL @@ -1569,7 +1581,7 @@ not used), EGD is never contacted. EGD is not needed on modern Unix systems that support @file{/dev/random}. @end table -@node FTP Options +@node FTP Options, Recursive Retrieval Options, HTTPS (SSL/TLS) Options, Invoking @section FTP Options @table @samp @@ -1672,22 +1684,9 @@ Note that when retrieving a file (not a directory) because it was specified on the command-line, rather than because it was recursed to, this option has no effect. Symbolic links are always traversed in this case. - -@cindex Keep-Alive, turning off -@cindex Persistent Connections, disabling -@item --no-http-keep-alive -Turn off the ``keep-alive'' feature for HTTP downloads. Normally, Wget -asks the server to keep the connection open so that, when you download -more than one document from the same server, they get transferred over -the same TCP connection. This saves time and at the same time reduces -the load on the server. - -This option is useful when, for some reason, persistent (keep-alive) -connections don't work for you, for example due to a server bug or due -to the inability of server-side scripts to cope with the connections. @end table -@node Recursive Retrieval Options +@node Recursive Retrieval Options, Recursive Accept/Reject Options, FTP Options, Invoking @section Recursive Retrieval Options @table @samp @@ -1892,7 +1891,7 @@ If, for whatever reason, you want strict comment parsing, use this option to turn it on. @end table -@node Recursive Accept/Reject Options +@node Recursive Accept/Reject Options, , Recursive Retrieval Options, Invoking @section Recursive Accept/Reject Options @table @samp @@ -1987,7 +1986,7 @@ This is a useful option, since it guarantees that only the files @c man end -@node Recursive Download +@node Recursive Download, Following Links, Invoking, Top @chapter Recursive Download @cindex recursion @cindex retrieving @@ -2055,7 +2054,7 @@ about this. Recursive retrieval should be used with care. Don't say you were not warned. 
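The --no-http-keep-alive entry added to the HTTP Options table above explains why one might disable persistent connections. A hypothetical invocation (the URL is only a placeholder) that fetches a page and its requisites while forcing a separate TCP connection for each request:

    wget --no-http-keep-alive -p http://www.example.com/index.html
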
-@node Following Links +@node Following Links, Time-Stamping, Recursive Download, Top @chapter Following Links @cindex links @cindex following links @@ -2079,7 +2078,7 @@ links it will follow. * FTP Links:: Following FTP links. @end menu -@node Spanning Hosts +@node Spanning Hosts, Types of Files, Following Links, Following Links @section Spanning Hosts @cindex spanning hosts @cindex hosts, spanning @@ -2136,7 +2135,7 @@ wget -rH -Dfoo.edu --exclude-domains sunsite.foo.edu \ @end table -@node Types of Files +@node Types of Files, Directory-Based Limits, Spanning Hosts, Following Links @section Types of Files @cindex types of files @@ -2241,7 +2240,7 @@ local filenames, and so @emph{do} contribute to filename matching. This behavior, too, is considered less-than-desirable, and may change in a future version of Wget. -@node Directory-Based Limits +@node Directory-Based Limits, Relative Links, Types of Files, Following Links @section Directory-Based Limits @cindex directories @cindex directory limits @@ -2325,7 +2324,7 @@ directory, while in @samp{http://foo/bar} (no trailing slash), meaningless, as its parent is @samp{/}). @end table -@node Relative Links +@node Relative Links, FTP Links, Directory-Based Limits, Following Links @section Relative Links @cindex relative links @@ -2354,7 +2353,7 @@ to ``just work'' without having to convert links. This option is probably not very useful and might be removed in a future release. -@node FTP Links +@node FTP Links, , Relative Links, Following Links @section Following FTP Links @cindex following ftp links @@ -2374,7 +2373,7 @@ effect on such downloads. On the other hand, domain acceptance Also note that followed links to @sc{ftp} directories will not be retrieved recursively further. -@node Time-Stamping +@node Time-Stamping, Startup File, Following Links, Top @chapter Time-Stamping @cindex time-stamping @cindex timestamping @@ -2424,7 +2423,7 @@ say. * FTP Time-Stamping Internals:: @end menu -@node Time-Stamping Usage +@node Time-Stamping Usage, HTTP Time-Stamping Internals, Time-Stamping, Time-Stamping @section Time-Stamping Usage @cindex time-stamping usage @cindex usage, time-stamping @@ -2480,7 +2479,7 @@ gives a timestamp. For @sc{http}, this depends on getting a directory listing with dates in a format that Wget can parse (@pxref{FTP Time-Stamping Internals}). -@node HTTP Time-Stamping Internals +@node HTTP Time-Stamping Internals, FTP Time-Stamping Internals, Time-Stamping Usage, Time-Stamping @section HTTP Time-Stamping Internals @cindex http time-stamping @@ -2512,7 +2511,7 @@ with @samp{-N}, server file @samp{@var{X}} is compared to local file Arguably, @sc{http} time-stamping should be implemented using the @code{If-Modified-Since} request. -@node FTP Time-Stamping Internals +@node FTP Time-Stamping Internals, , HTTP Time-Stamping Internals, Time-Stamping @section FTP Time-Stamping Internals @cindex ftp time-stamping @@ -2541,7 +2540,7 @@ that is supported by some @sc{ftp} servers (including the popular @code{wu-ftpd}), which returns the exact time of the specified file. Wget may support this command in the future. -@node Startup File +@node Startup File, Examples, Time-Stamping, Top @chapter Startup File @cindex startup file @cindex wgetrc @@ -2569,7 +2568,7 @@ commands. * Sample Wgetrc:: A wgetrc example. 
@end menu -@node Wgetrc Location +@node Wgetrc Location, Wgetrc Syntax, Startup File, Startup File @section Wgetrc Location @cindex wgetrc location @cindex location of wgetrc @@ -2590,7 +2589,7 @@ means that in case of collision user's wgetrc @emph{overrides} the system-wide wgetrc (in @file{/usr/local/etc/wgetrc} by default). Fascist admins, away! -@node Wgetrc Syntax +@node Wgetrc Syntax, Wgetrc Commands, Wgetrc Location, Startup File @section Wgetrc Syntax @cindex wgetrc syntax @cindex syntax of wgetrc @@ -2617,7 +2616,7 @@ global @file{wgetrc}, you can do it with: reject = @end example -@node Wgetrc Commands +@node Wgetrc Commands, Sample Wgetrc, Wgetrc Syntax, Startup File @section Wgetrc Commands @cindex wgetrc commands @@ -2710,6 +2709,9 @@ Ignore @var{n} remote directory components. Equivalent to @item debug = on/off Debug mode, same as @samp{-d}. +@item default_page = @var{string} +Default page name---the same as @samp{--default-page=@var{string}}. + @item delete_after = on/off Delete after download---the same as @samp{--delete-after}. @@ -3002,6 +3004,9 @@ this off. Save cookies to @var{file}. The same as @samp{--save-cookies @var{file}}. +@item save_headers = on/off +Same as @samp{--save-headers}. + @item secure_protocol = @var{string} Choose the secure protocol to be used. Legal values are @samp{auto} (the default), @samp{SSLv2}, @samp{SSLv3}, and @samp{TLSv1}. The same @@ -3014,6 +3019,9 @@ responses---the same as @samp{-S}. @item span_hosts = on/off Same as @samp{-H}. +@item spider = on/off +Same as @samp{--spider}. + @item strict_comments = on/off Same as @samp{--strict-comments}. @@ -3037,6 +3045,10 @@ Specify username @var{string} for both @sc{ftp} and @sc{http} file retrieval. This command can be overridden using the @samp{ftp_user} and @samp{http_user} command for @sc{ftp} and @sc{http} respectively. +@item user_agent = @var{string} +User agent identification sent to the HTTP Server---the same as +@samp{--user-agent=@var{string}}. + @item verbose = on/off Turn verbose on/off---the same as @samp{-v}/@samp{-nv}. @@ -3050,7 +3062,7 @@ only---the same as @samp{--waitretry=@var{n}}. Note that this is turned on by default in the global @file{wgetrc}. @end table -@node Sample Wgetrc +@node Sample Wgetrc, , Wgetrc Commands, Startup File @section Sample Wgetrc @cindex sample wgetrc @@ -3067,7 +3079,7 @@ its line. @include sample.wgetrc.munged_for_texi_inclusion @end example -@node Examples +@node Examples, Various, Startup File, Top @chapter Examples @cindex examples @@ -3081,7 +3093,7 @@ complexity. * Very Advanced Usage:: The hairy stuff. @end menu -@node Simple Usage +@node Simple Usage, Advanced Usage, Examples, Examples @section Simple Usage @itemize @bullet @@ -3134,7 +3146,7 @@ links index.html @end example @end itemize -@node Advanced Usage +@node Advanced Usage, Very Advanced Usage, Simple Usage, Examples @section Advanced Usage @itemize @bullet @@ -3270,7 +3282,7 @@ wget -O - http://cool.list.com/ | wget --force-html -i - @end example @end itemize -@node Very Advanced Usage +@node Very Advanced Usage, , Advanced Usage, Examples @section Very Advanced Usage @cindex mirroring @@ -3319,7 +3331,7 @@ wget -m -k -K -E http://www.gnu.org/ -o /home/me/weeklog @end itemize @c man end -@node Various +@node Various, Appendices, Examples, Top @chapter Various @cindex various @@ -3329,14 +3341,14 @@ This chapter contains all the stuff that could not fit anywhere else. * Proxies:: Support for proxy servers. * Distribution:: Getting the latest version. 
* Web Site:: GNU Wget's presence on the World Wide Web. -* Mailing List:: Wget mailing list for announcements and discussion. +* Mailing Lists:: Wget mailing list for announcements and discussion. * Internet Relay Chat:: Wget's presence on IRC. * Reporting Bugs:: How and where to report bugs. * Portability:: The systems Wget works on. * Signals:: Signal-handling performed by Wget. @end menu -@node Proxies +@node Proxies, Distribution, Various, Various @section Proxies @cindex proxies @@ -3412,7 +3424,7 @@ Alternatively, you may use the @samp{proxy-user} and settings @code{proxy_user} and @code{proxy_password} to set the proxy username and password. -@node Distribution +@node Distribution, Web Site, Proxies, Various @section Distribution @cindex latest version @@ -3421,7 +3433,7 @@ master GNU archive site ftp.gnu.org, and its mirrors. For example, Wget @value{VERSION} can be found at @url{ftp://ftp.gnu.org/pub/gnu/wget/wget-@value{VERSION}.tar.gz} -@node Web Site +@node Web Site, Mailing Lists, Distribution, Various @section Web Site @cindex web site @@ -3430,43 +3442,64 @@ The official web site for GNU Wget is at information resides at ``The Wget Wgiki'', @url{http://wget.addictivecode.org/}. -@node Mailing List -@section Mailing List +@node Mailing Lists, Internet Relay Chat, Web Site, Various +@section Mailing Lists @cindex mailing list @cindex list -There are several Wget-related mailing lists. The general discussion -list is at @email{wget@@sunsite.dk}. It is the preferred place for -support requests and suggestions, as well as for discussion of -development. You are invited to subscribe. - -To subscribe, simply send mail to @email{wget-subscribe@@sunsite.dk} -and follow the instructions. Unsubscribe by mailing to -@email{wget-unsubscribe@@sunsite.dk}. The mailing list is archived at +@unnumberedsubsec Primary List + +The primary mailinglist for discussion, bug-reports, or questions +about GNU Wget is at @email{bug-wget@@gnu.org}. To subscribe, send an +email to @email{bug-wget-join@@gnu.org}, or visit +@url{http://lists.gnu.org/mailman/listinfo/bug-wget}. + +You do not need to subscribe to send a message to the list; however, +please note that unsubscribed messages are moderated, and may take a +while before they hit the list---@strong{usually around a day}. If +you want your message to show up immediately, please subscribe to the +list before posting. Archives for the list may be found at +@url{http://lists.gnu.org/pipermail/bug-wget/}. + +An NNTP/Usenettish gateway is also available via +@uref{http://gmane.org/about.php,Gmane}. You can see the Gmane +archives at +@url{http://news.gmane.org/gmane.comp.web.wget.general}. Note that the +Gmane archives conveniently include messages from both the current +list, and the previous one. Messages also show up in the Gmane +archives sooner than they do at @url{lists.gnu.org}. + +@unnumberedsubsec Bug Notices List + +Additionally, there is the @email{wget-notify@@addictivecode.org} mailing +list. This is a non-discussion list that receives bug report +notifications from the bug-tracker. To subscribe to this list, +send an email to @email{wget-notify-join@@addictivecode.org}, +or visit @url{http://addictivecode.org/mailman/listinfo/wget-notify}. + +@unnumberedsubsec Obsolete Lists + +Previously, the mailing list @email{wget@@sunsite.dk} was used as the +main discussion list, and another list, +@email{wget-patches@@sunsite.dk} was used for submitting and +discussing patches to GNU Wget. 
+ +Messages from @email{wget@@sunsite.dk} are archived at +@itemize @tie{} +@item @url{http://www.mail-archive.com/wget%40sunsite.dk/} and at -@url{http://news.gmane.org/gmane.comp.web.wget.general}. - -Another mailing list is at @email{wget-patches@@sunsite.dk}, and is -used to submit patches for review by Wget developers. A ``patch'' is -a textual representation of change to source code, readable by both -humans and programs. The -@url{http://wget.addictivecode.org/PatchGuidelines} page -covers the creation and submitting of patches in detail. Please don't -send general suggestions or bug reports to @samp{wget-patches}; use it -only for patch submissions. - -Subscription is the same as above for @email{wget@@sunsite.dk}, except -that you send to @email{wget-patches-subscribe@@sunsite.dk}, instead. -The mailing list is archived at -@url{http://news.gmane.org/gmane.comp.web.wget.patches}. +@item +@url{http://news.gmane.org/gmane.comp.web.wget.general} (which also +continues to archive the current list, @email{bug-wget@@gnu.org}). +@end itemize -Finally, there is the @email{wget-notify@@addictivecode.org} mailing -list. This is a non-discussion list that receives bug report-change -notifications from the bug-tracker. Unlike for the other mailing lists, -subscription is through the @code{mailman} interface at -@url{http://addictivecode.org/mailman/listinfo/wget-notify}. +Messages from @email{wget-patches@@sunsite.dk} are archived at +@itemize @tie{} +@item +@url{http://news.gmane.org/gmane.comp.web.wget.patches}. +@end itemize -@node Internet Relay Chat +@node Internet Relay Chat, Reporting Bugs, Mailing Lists, Various @section Internet Relay Chat @cindex Internet Relay Chat @cindex IRC @@ -3475,7 +3508,7 @@ subscription is through the @code{mailman} interface at In addition to the mailinglists, we also have a support channel set up via IRC at @code{irc.freenode.org}, @code{#wget}. Come check it out! -@node Reporting Bugs +@node Reporting Bugs, Portability, Internet Relay Chat, Various @section Reporting Bugs @cindex bugs @cindex reporting bugs @@ -3495,7 +3528,7 @@ Wget crashes, it's a bug. If Wget does not behave as documented, it's a bug. If things work strange, but you are not sure about the way they are supposed to work, it might well be a bug, but you might want to double-check the documentation and the mailing lists (@pxref{Mailing -List}). +Lists}). @item Try to repeat the bug in as simple circumstances as possible. E.g. if @@ -3534,7 +3567,7 @@ safe to try. @end enumerate @c man end -@node Portability +@node Portability, Signals, Reporting Bugs, Various @section Portability @cindex portability @cindex operating systems @@ -3567,7 +3600,7 @@ Support for building on MS-DOS via DJGPP has been contributed by Gisle Vanem; a port to VMS is maintained by Steven Schweda, and is available at @url{http://antinode.org/}. -@node Signals +@node Signals, , Portability, Various @section Signals @cindex signal handling @cindex hangup @@ -3588,7 +3621,7 @@ SIGHUP received, redirecting output to `wget-log'. Other than that, Wget will not try to interfere with signals in any way. @kbd{C-c}, @code{kill -TERM} and @code{kill -KILL} should kill it alike. -@node Appendices +@node Appendices, Copying this manual, Various, Top @chapter Appendices This chapter contains some references I consider useful. @@ -3599,7 +3632,7 @@ This chapter contains some references I consider useful. * Contributors:: People who helped. 
@end menu -@node Robot Exclusion +@node Robot Exclusion, Security Considerations, Appendices, Appendices @section Robot Exclusion @cindex robot exclusion @cindex robots.txt @@ -3638,7 +3671,7 @@ avoid. To be found by the robots, the specifications must be placed in download and parse. Although Wget is not a web robot in the strictest sense of the word, it -can downloads large parts of the site without the user's intervention to +can download large parts of the site without the user's intervention to download an individual page. Because of that, Wget honors RES when downloading recursively. For instance, when you issue: @@ -3682,7 +3715,7 @@ robot exclusion, set the @code{robots} variable to @samp{off} in your @file{.wgetrc}. You can achieve the same effect from the command line using the @code{-e} switch, e.g. @samp{wget -e robots=off @var{url}...}. -@node Security Considerations +@node Security Considerations, Contributors, Robot Exclusion, Appendices @section Security Considerations @cindex security @@ -3713,7 +3746,7 @@ being careful when you send debug logs (yes, even when you send them to me). @end enumerate -@node Contributors +@node Contributors, , Security Considerations, Appendices @section Contributors @cindex contributors @@ -4058,17 +4091,21 @@ Kristijan Zimmer. Apologies to all who I accidentally left out, and many thanks to all the subscribers of the Wget mailing list. -@node Copying this manual +@node Copying this manual, Concept Index, Appendices, Top @appendix Copying this manual @menu * GNU Free Documentation License:: Licnse for copying this manual. @end menu +@node GNU Free Documentation License, , Copying this manual, Copying this manual +@appendixsec GNU Free Documentation License +@cindex FDL, GNU Free Documentation License + @include fdl.texi -@node Concept Index +@node Concept Index, , Copying this manual, Top @unnumbered Concept Index @printindex cp diff --git a/src/ChangeLog b/src/ChangeLog index 5e3a8893..2d3331f1 100644 --- a/src/ChangeLog +++ b/src/ChangeLog @@ -1,3 +1,91 @@ +2008-11-13 Micah Cowan + + * http.c (gethttp): Don't do anything when content-length >= our + requested range. + +2008-11-16 Steven Schubiger + + * main.c: Declare and initialize the numurls counter. + + * ftp.c, http.c: Make the counter visible here and use it. + + * options.h: Remove old declaration from options struct. + +2008-11-15 Steven Schubiger + + * init.c (defaults): Set default waitretry value. + +2008-11-14 Steven Schubiger + + * main.c (format_and_print_line): Use a custom format + string for printing leading spaces. + +2008-11-12 Micah Cowan + + * ftp-ls.c (ftp_index): HTML-escape dir name in title, h1, a:href. + +2008-11-12 Alexander Belopolsky + + * url.c, url.h (url_escape_unsafe_and_reserved): Added. + + * ftp-ls.c (ftp_index): URL-escape, rather than HTML-escape, the + filename appearing in the link. + +2008-11-12 Steven Schubiger + + * main.c (print_version): Hand the relevant + xstrdup/xfree calls back to format_and_print_line(). + +2008-11-11 Steven Schubiger + + * main.c (format_and_print_line): Move both the memory + allocating and freeing bits upwards to print_version(). + +2008-11-10 Saint Xavier + + * http.c: Make --auth-no-challenge works with user:pass@ in URLs. + +2008-11-05 Micah Cowan + + * ftp.c (print_length): Should print humanized "size remaining" + only when it's at least 1k. + +2008-10-31 Micah Cowan + + * main.c (print_version): Add information about the mailing list. 
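The 2008-11-14 entry above ("Use a custom format string for printing leading spaces") refers to the format_and_print_line() rewrite that appears later in the main.c hunk, where a loop over putchar is replaced by a printf field width. A minimal, self-contained sketch of that standard C idiom; it is only an illustration, not the wget code itself:

    #include <stdio.h>

    int
    main (void)
    {
      int leading_spaces = 8;

      /* "%*c" reads the field width from an int argument, so the single
         ' ' below is padded to eight columns, i.e. eight spaces, without
         an explicit loop over putchar (' ').  */
      printf ("%*c%s\n", leading_spaces, ' ', "continuation line");
      return 0;
    }
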
+ +2008-10-31 Alexander Drozdov + + * retr.c (fd_read_hunk): Make assert deal with maxsize == 0. + + * ftp-ls.c (clean_line): Prevent underflow on empty lines. + +2008-10-26 Gisle Vanem + + * main.c (format_and_print_line): Put variables on top of + blocks (not all compilers are C99). Add an extra '\n' if + SYSTEM_WGETRC isn't defined and printed. + +2008-09-09 Gisle Vanem + + * url.c (url_error): Use aprintf, not asprintf. + +2008-09-09 Micah Cowan + + * init.c (home_dir): Save the calculated value for home, + to avoid duplicated work on repeated calls. + (wgetrc_file_name) [WINDOWS]: Define and initialize home var. + + * build_info.c, main.c: Remove unnecessary extern vars + system_wgetrc and locale_dir. + + * main.c: Define program_name for lib/error.c. + +2008-09-02 Gisle Vanem + + * mswindows.h: Must ensure is included before + we redefine ?vsnprintf(). + 2008-08-08 Steven Schubiger * main.c, utils.h: Removed some dead conditional DEBUG_MALLOC code. diff --git a/src/build_info.c b/src/build_info.c index 542fed8a..532dccaf 100644 --- a/src/build_info.c +++ b/src/build_info.c @@ -33,9 +33,6 @@ as that of the covered work. */ #include "wget.h" #include -char *system_wgetrc = SYSTEM_WGETRC; -char *locale_dir = LOCALEDIR; - const char* (compiled_features[]) = { diff --git a/src/ftp-ls.c b/src/ftp-ls.c index 409996c3..181c8d45 100644 --- a/src/ftp-ls.c +++ b/src/ftp-ls.c @@ -75,6 +75,7 @@ clean_line(char *line) if (!len) return 0; if (line[len - 1] == '\n') line[--len] = '\0'; + if (!len) return 0; if (line[len - 1] == '\r') line[--len] = '\0'; for ( ; *line ; line++ ) if (*line == '\t') *line = ' '; @@ -849,7 +850,9 @@ ftp_index (const char *file, struct url *u, struct fileinfo *f) { FILE *fp; char *upwd; + char *htcldir; /* HTML-clean dir name */ char *htclfile; /* HTML-clean file name */ + char *urlclfile; /* URL-clean file name */ if (!output_stream) { @@ -877,12 +880,16 @@ ftp_index (const char *file, struct url *u, struct fileinfo *f) } else upwd = xstrdup (""); + + htcldir = html_quote_string (u->dir); + fprintf (fp, "\n"); fprintf (fp, "\n\n"); - fprintf (fp, _("Index of /%s on %s:%d"), u->dir, u->host, u->port); + fprintf (fp, _("Index of /%s on %s:%d"), htcldir, u->host, u->port); fprintf (fp, "\n\n\n
"); - fprintf (fp, _("Index of /%s on %s:%d"), u->dir, u->host, u->port); + fprintf (fp, _("Index of /%s on %s:%d"), htcldir, u->host, u->port); fprintf (fp, "

\n
\n
\n");
+
   while (f)
     {
       fprintf (fp, "  ");
@@ -922,13 +929,18 @@ ftp_index (const char *file, struct url *u, struct fileinfo *f)
           break;
         }
       htclfile = html_quote_string (f->name);
+      urlclfile = url_escape_unsafe_and_reserved (f->name);
       fprintf (fp, "host, u->port);
       if (*u->dir != '/')
         putc ('/', fp);
-      fprintf (fp, "%s", u->dir);
+      /* XXX: Should probably URL-escape dir components here, rather
+       * than just HTML-escape, for consistency with the next bit where
+       * we use urlclfile for the file component. Anyway, this is safer
+       * than what we had... */
+      fprintf (fp, "%s", htcldir);
       if (*u->dir)
         putc ('/', fp);
-      fprintf (fp, "%s", htclfile);
+      fprintf (fp, "%s", urlclfile);
       if (f->type == FT_DIRECTORY)
         putc ('/', fp);
       fprintf (fp, "\">%s", htclfile);
@@ -941,9 +953,11 @@ ftp_index (const char *file, struct url *u, struct fileinfo *f)
         fprintf (fp, "-> %s", f->linkto ? f->linkto : "(nil)");
       putc ('\n', fp);
       xfree (htclfile);
+      xfree (urlclfile);
       f = f->next;
     }
   fprintf (fp, "
\n\n\n"); + xfree (htcldir); xfree (upwd); if (!output_stream) fclose (fp); diff --git a/src/ftp.c b/src/ftp.c index 482651be..e4b90189 100644 --- a/src/ftp.c +++ b/src/ftp.c @@ -69,6 +69,7 @@ typedef struct struct url *proxy; /* FTWK-style proxy */ } ccon; +extern int numurls; /* Look for regexp "( *[0-9]+ *byte" (literal parenthesis) anywhere in the string S, and return the number converted to wgint, if found, 0 @@ -216,7 +217,7 @@ print_length (wgint size, wgint start, bool authoritative) logprintf (LOG_VERBOSE, " (%s)", human_readable (size)); if (start > 0) { - if (start >= 1024) + if (size - start >= 1024) logprintf (LOG_VERBOSE, _(", %s (%s) remaining"), number_to_static_string (size - start), human_readable (size - start)); @@ -1295,7 +1296,7 @@ ftp_loop_internal (struct url *u, struct fileinfo *f, ccon *con) number of bytes and files downloaded. */ { total_downloaded_bytes += len; - opt.numurls++; + numurls++; } /* Deletion of listing files is not controlled by --delete-after, but @@ -1310,7 +1311,7 @@ ftp_loop_internal (struct url *u, struct fileinfo *f, ccon *con) for instance, may want to know how many bytes and files they've downloaded through it. */ total_downloaded_bytes += len; - opt.numurls++; + numurls++; if (opt.delete_after) { diff --git a/src/http.c b/src/http.c index 589e18ee..9ed226cb 100644 --- a/src/http.c +++ b/src/http.c @@ -142,6 +142,8 @@ struct request { int hcount, hcapacity; }; +extern int numurls; + /* Create a new, empty request. At least request_set_method must be called before the request can be used. */ @@ -1496,9 +1498,10 @@ gethttp (struct url *u, struct http_stat *hs, int *dt, struct url *proxy, user = user ? user : (opt.http_user ? opt.http_user : opt.user); passwd = passwd ? passwd : (opt.http_passwd ? opt.http_passwd : opt.passwd); - if (user && passwd - && !u->user) /* We only do "site-wide" authentication with "global" - user/password values; URL user/password info overrides. */ + /* We only do "site-wide" authentication with "global" user/password + * values unless --auth-no-challange has been requested; URL user/password + * info overrides. */ + if (user && passwd && (!u->user || opt.auth_without_challenge)) { /* If this is a host for which we've already received a Basic * challenge, we'll go ahead and send Basic authentication creds. */ @@ -2159,11 +2162,15 @@ File %s already there; not retrieving.\n\n"), quote (hs->local_file)); } } - if (statcode == HTTP_STATUS_RANGE_NOT_SATISFIABLE) + if (statcode == HTTP_STATUS_RANGE_NOT_SATISFIABLE + || (hs->restval > 0 && statcode == HTTP_STATUS_OK + && contrange == 0 && hs->restval >= contlen) + ) { /* If `-c' is in use and the file has been fully downloaded (or the remote file has shrunk), Wget effectively requests bytes - after the end of file and the server response with 416. */ + after the end of file and the server response with 416 + (or 200 with a <= Content-Length. */ logputs (LOG_VERBOSE, _("\ \n The file is already fully retrieved; nothing to do.\n\n")); /* In case the caller inspects. */ @@ -2773,7 +2780,7 @@ Remote file exists.\n\n")); number_to_static_string (hstat.contlen), hstat.local_file, count); } - ++opt.numurls; + ++numurls; total_downloaded_bytes += hstat.len; /* Remember that we downloaded the file for later ".orig" code. */ @@ -2801,7 +2808,7 @@ Remote file exists.\n\n")); tms, u->url, number_to_static_string (hstat.len), hstat.local_file, count); } - ++opt.numurls; + ++numurls; total_downloaded_bytes += hstat.len; /* Remember that we downloaded the file for later ".orig" code. 
*/ diff --git a/src/init.c b/src/init.c index fd71a362..5ab0862c 100644 --- a/src/init.c +++ b/src/init.c @@ -335,6 +335,8 @@ defaults (void) opt.max_redirect = 20; + opt.waitretry = 10; + #ifdef ENABLE_IRI opt.enable_iri = true; #else @@ -349,35 +351,41 @@ defaults (void) char * home_dir (void) { - char *home = getenv ("HOME"); + static char buf[PATH_MAX]; + static char *home; if (!home) { + home = getenv ("HOME"); + if (!home) + { #if defined(MSDOS) - /* Under MSDOS, if $HOME isn't defined, use the directory where - `wget.exe' resides. */ - const char *_w32_get_argv0 (void); /* in libwatt.a/pcconfig.c */ - char *p, buf[PATH_MAX]; - - strcpy (buf, _w32_get_argv0 ()); - p = strrchr (buf, '/'); /* djgpp */ - if (!p) - p = strrchr (buf, '\\'); /* others */ - assert (p); - *p = '\0'; - home = buf; + /* Under MSDOS, if $HOME isn't defined, use the directory where + `wget.exe' resides. */ + const char *_w32_get_argv0 (void); /* in libwatt.a/pcconfig.c */ + char *p; + + strcpy (buf, _w32_get_argv0 ()); + p = strrchr (buf, '/'); /* djgpp */ + if (!p) + p = strrchr (buf, '\\'); /* others */ + assert (p); + *p = '\0'; + home = buf; #elif !defined(WINDOWS) - /* If HOME is not defined, try getting it from the password - file. */ - struct passwd *pwd = getpwuid (getuid ()); - if (!pwd || !pwd->pw_dir) - return NULL; - home = pwd->pw_dir; + /* If HOME is not defined, try getting it from the password + file. */ + struct passwd *pwd = getpwuid (getuid ()); + if (!pwd || !pwd->pw_dir) + return NULL; + strcpy (buf, pwd->pw_dir); + home = buf; #else /* !WINDOWS */ - /* Under Windows, if $HOME isn't defined, use the directory where - `wget.exe' resides. */ - home = ws_mypath (); + /* Under Windows, if $HOME isn't defined, use the directory where + `wget.exe' resides. */ + home = ws_mypath (); #endif /* WINDOWS */ + } } return home ? xstrdup (home) : NULL; @@ -403,12 +411,13 @@ wgetrc_env_file_name (void) } return NULL; } + /* Check for the existance of '$HOME/.wgetrc' and return it's path if it exists and is set. */ char * wgetrc_user_file_name (void) { - char *home = home_dir(); + char *home = home_dir (); char *file = NULL; if (home) file = aprintf ("%s/.wgetrc", home); @@ -422,6 +431,7 @@ wgetrc_user_file_name (void) } return file; } + /* Return the path to the user's .wgetrc. This is either the value of `WGETRC' environment variable, or `$HOME/.wgetrc'. @@ -430,10 +440,11 @@ wgetrc_user_file_name (void) char * wgetrc_file_name (void) { + char *home = NULL; char *file = wgetrc_env_file_name (); if (file && *file) return file; - + file = wgetrc_user_file_name (); #ifdef WINDOWS @@ -441,6 +452,7 @@ wgetrc_file_name (void) `wget.ini' in the directory where `wget.exe' resides; we do this for backward compatibility with previous versions of Wget. SYSTEM_WGETRC should not be defined under WINDOWS. 
*/ + home = home_dir (); if (!file || !file_exists_p (file)) { xfree_null (file); @@ -449,6 +461,7 @@ wgetrc_file_name (void) if (home) file = aprintf ("%s/wget.ini", home); } + xfree_null (home); #endif /* WINDOWS */ if (!file) diff --git a/src/main.c b/src/main.c index 414b62bc..a2d40888 100644 --- a/src/main.c +++ b/src/main.c @@ -72,8 +72,6 @@ extern char *system_getrc; extern char *link_string; /* defined in build_info.c */ extern char *compiled_features[]; -extern char *system_wgetrc; -extern char *locale_dir; /* Used for --version output in print_version */ static const int max_chars_per_line = 72; @@ -82,6 +80,9 @@ static void redirect_output_signal (int); #endif const char *exec_name; + +/* Number of successfully downloaded URLs */ +int numurls = 0; #ifndef TESTING /* Initialize I18N/L10N. That amounts to invoking setlocale, and @@ -711,20 +712,26 @@ prompt_for_password (void) and an appropriate number of spaces are added on subsequent lines.*/ static void -format_and_print_line (char* prefix, char* line, - int line_length) +format_and_print_line (const char *prefix, const char *line, + int line_length) { + int leading_spaces; + int remaining_chars; + char *line_dup, *token; + assert (prefix != NULL); assert (line != NULL); + line_dup = xstrdup (line); + if (line_length <= 0) line_length = max_chars_per_line; - const int leading_spaces = strlen (prefix); + leading_spaces = strlen (prefix); printf ("%s", prefix); - int remaining_chars = line_length - leading_spaces; + remaining_chars = line_length - leading_spaces; /* We break on spaces. */ - char* token = strtok (line, " "); + token = strtok (line_dup, " "); while (token != NULL) { /* If however a token is much larger than the maximum @@ -732,12 +739,7 @@ format_and_print_line (char* prefix, char* line, token on the next line. */ if (remaining_chars <= strlen (token)) { - printf ("\n"); - int j = 0; - for (j = 0; j < leading_spaces; j++) - { - printf (" "); - } + printf ("\n%*c", leading_spaces, ' '); remaining_chars = line_length - leading_spaces; } printf ("%s ", token); @@ -746,8 +748,8 @@ format_and_print_line (char* prefix, char* line, } printf ("\n"); - xfree (prefix); - xfree (line); + + xfree (line_dup); } static void @@ -760,13 +762,15 @@ print_version (void) const char *link_title = "Link : "; const char *prefix_spaces = " "; const int prefix_space_length = strlen (prefix_spaces); + char *line; + char *env_wgetrc, *user_wgetrc; + int i; printf ("GNU Wget %s\n", version_string); printf (options_title); /* compiled_features is a char*[]. We limit the characters per line to max_chars_per_line and prefix each line with a constant number of spaces for proper alignment. */ - int i =0; for (i = 0; compiled_features[i] != NULL; ) { int line_length = max_chars_per_line - prefix_space_length; @@ -785,31 +789,36 @@ print_version (void) /* Handle the case when $WGETRC is unset and $HOME/.wgetrc is absent. 
*/ printf (wgetrc_title); - char *env_wgetrc = wgetrc_env_file_name (); + env_wgetrc = wgetrc_env_file_name (); if (env_wgetrc && *env_wgetrc) { printf ("%s (env)\n%s", env_wgetrc, prefix_spaces); xfree (env_wgetrc); } - char *user_wgetrc = wgetrc_user_file_name (); + user_wgetrc = wgetrc_user_file_name (); if (user_wgetrc) { printf ("%s (user)\n%s", user_wgetrc, prefix_spaces); xfree (user_wgetrc); } - printf ("%s (system)\n", system_wgetrc); +#ifdef SYSTEM_WGETRC + printf ("%s (system)\n", SYSTEM_WGETRC); +#else + putchar ('\n'); +#endif - format_and_print_line (strdup (locale_title), - strdup (locale_dir), + format_and_print_line (locale_title, + LOCALEDIR, max_chars_per_line); - format_and_print_line (strdup (compile_title), - strdup (compilation_string), + format_and_print_line (compile_title, + compilation_string, max_chars_per_line); - format_and_print_line (strdup (link_title), - strdup (link_string), + format_and_print_line (link_title, + link_string, max_chars_per_line); + printf ("\n"); /* TRANSLATORS: When available, an actual copyright character (cirle-c) should be used in preference to "(C)". */ @@ -826,9 +835,13 @@ There is NO WARRANTY, to the extent permitted by law.\n"), stdout); stdout); fputs (_("Currently maintained by Micah Cowan .\n"), stdout); + fputs (_("Please send bug reports and questions to .\n"), + stdout); exit (0); } +char *program_name; /* Needed by lib/error.c. */ + int main (int argc, char **argv) { @@ -837,6 +850,8 @@ main (int argc, char **argv) int nurl, status; bool append_to_log = false; + program_name = argv[0]; + i18n_initialize (); /* Construct the name of the executable, without the directory part. */ @@ -1249,7 +1264,7 @@ WARNING: Can't reopen standard output in binary mode;\n\ logprintf (LOG_NOTQUIET, _("FINISHED --%s--\nDownloaded: %d files, %s in %s (%s)\n"), datetime_str (time (NULL)), - opt.numurls, + numurls, human_readable (total_downloaded_bytes), secs_to_human_time (total_download_time), retr_rate (total_downloaded_bytes, total_download_time)); diff --git a/src/mswindows.h b/src/mswindows.h index 54821a7c..71687278 100644 --- a/src/mswindows.h +++ b/src/mswindows.h @@ -78,6 +78,8 @@ as that of the covered work. */ # define strncasecmp strnicmp #endif +#include + /* The same for snprintf() and vsnprintf(). */ #define snprintf _snprintf #define vsnprintf _vsnprintf diff --git a/src/options.h b/src/options.h index 4574ab85..8dc7fee2 100644 --- a/src/options.h +++ b/src/options.h @@ -124,10 +124,6 @@ struct options SUM_SIZE_INT quota; /* Maximum file size to download and store. */ - int numurls; /* Number of successfully downloaded - URLs #### should be removed because - it's not a setting, but a global var */ - bool server_response; /* Do we print server response? */ bool save_headers; /* Do we save headers together with file? */ diff --git a/src/retr.c b/src/retr.c index fe4e3e76..1d9d7478 100644 --- a/src/retr.c +++ b/src/retr.c @@ -393,7 +393,7 @@ fd_read_hunk (int fd, hunk_terminator_t terminator, long sizehint, long maxsize) char *hunk = xmalloc (bufsize); int tail = 0; /* tail position in HUNK */ - assert (maxsize >= bufsize); + assert (!maxsize || maxsize >= bufsize); while (1) { diff --git a/src/url.c b/src/url.c index 8f067250..86d099a7 100644 --- a/src/url.c +++ b/src/url.c @@ -252,6 +252,15 @@ url_escape (const char *s) return url_escape_1 (s, urlchr_unsafe, false); } +/* URL-escape the unsafe and reserved characters (see urlchr_table) in + a given string, returning a freshly allocated string. 
*/ + +char * +url_escape_unsafe_and_reserved (const char *s) +{ + return url_escape_1 (s, urlchr_unsafe|urlchr_reserved, false); +} + /* URL-escape the unsafe characters (see urlchr_table) in a given string. If no characters are unsafe, S is returned. */ @@ -929,9 +938,9 @@ url_error (const char *url, int error_code) if ((p = strchr (scheme, ':'))) *p = '\0'; if (!strcasecmp (scheme, "https")) - asprintf (&error, _("HTTPS support not compiled in")); + error = aprintf (_("HTTPS support not compiled in")); else - asprintf (&error, _(parse_errors[error_code]), quote (scheme)); + error = aprintf (_(parse_errors[error_code]), quote (scheme)); xfree (scheme); return error; diff --git a/src/url.h b/src/url.h index 2fa8d51c..38eafca4 100644 --- a/src/url.h +++ b/src/url.h @@ -83,6 +83,7 @@ struct url /* Function declarations */ char *url_escape (const char *); +char *url_escape_unsafe_and_reserved (const char *); struct url *url_parse (const char *, int *, struct iri *iri, bool percent_encode); char *url_error (const char *, int); diff --git a/tests/ChangeLog b/tests/ChangeLog index 7751be64..16e7bd3b 100644 --- a/tests/ChangeLog +++ b/tests/ChangeLog @@ -1,3 +1,84 @@ +2008-11-26 Micah Cowan (not copyrightable) + + * Test-ftp-iri-disabled.px, Test-ftp-iri-fallback.px, + Test-ftp-iri.px, Test-idn-cmd.px, Test-idn-headers.px, + Test-idn-meta.px, Test-iri-disabled.px, + Test-iri-forced-remote.px, Test-iri-list.px, Test-iri.px: More + module-scope warnings. + +2008-11-25 Steven Schubiger + + * WgetTest.pm.in: Remove the magic interpreter line; + replace -w with lexical warnings. + +2008-11-13 Steven Schubiger + + * FTPServer.pm, FTPTest.pm, HTTPServer.pm, HTTPTest.pm, + WgetTest.pm.in: Clean up leftover whitespace. + +2008-11-12 Steven Schubiger + + * Test-auth-basic.px, Test-auth-no-challenge.px, + Test-auth-no-challenge-url.px, Test-c-full.px, + Test-c-partial.px, Test-c.px, Test-c-shorter.px, + Test-E-k-K.px, Test-E-k.px, Test-ftp.px, + Test-HTTP-Content-Disposition-1.px, + Test-HTTP-Content-Disposition-2.px, + Test-HTTP-Content-Disposition.px, Test-N-current.px, + Test-N-HTTP-Content-Disposition.px, + Test-N--no-content-disposition.px, + Test-N--no-content-disposition-trivial.px, + Test-N-no-info.px, Test--no-content-disposition.px, + Test--no-content-disposition-trivial.px, Test-N-old.px, + Test-nonexisting-quiet.px, Test-noop.px, Test-np.px, + Test-N.px, Test-N-smaller.px, + Test-O-HTTP-Content-Disposition.px, Test-O-nc.px, + Test-O--no-content-disposition.px, + Test-O--no-content-disposition-trivial.px, + Test-O-nonexisting.px, Test-O.px, + Test-proxy-auth-basic.px, Test-Restrict-Lowercase.px, + Test-Restrict-Uppercase.px, + Test--spider-fail.pxm, Test--spider.px, + Test--spider-r-HTTP-Content-Disposition.px, + Test--spider-r--no-content-disposition.px, + Test--spider-r--no-content-disposition-trivial.px, + Test--spider-r.px: Enforce lexically scoped warnings. + + * Test-proxied-https-auth.px, run-px: Place use strict + before use warnings. + +2008-11-12 Steven Schubiger + + * FTPServer.pm, FTPTest.pm, HTTPServer.pm, HTTPTest.pm: + Remove the magic interpreter line, because it cannot be + used fully. Substitute -w with use warnings. + +2008-11-11 Micah Cowan + + * HTTPServer.pm (handle_auth): Allow testing of + --auth-no-challenge. + + * Test-auth-no-challenge.px, Test-auth-no-challenge-url.px: + Added. + + * run-px: Add Test-auth-no-challenge.px, + Test-auth-no-challenge-url.px. 
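The url.c and url.h hunks above add url_escape_unsafe_and_reserved(), which the earlier ftp-ls.c change uses alongside html_quote_string(): the same file name is now URL-escaped inside the href attribute but HTML-escaped in the visible link text. A self-contained sketch of that distinction, using toy escapers rather than the wget implementations (the file name is made up):

    #include <stdio.h>
    #include <string.h>

    /* Toy stand-in for html_quote_string(): escape HTML metacharacters. */
    static void
    print_html_escaped (const char *s)
    {
      for (; *s; s++)
        switch (*s)
          {
          case '&':  fputs ("&amp;", stdout);  break;
          case '<':  fputs ("&lt;", stdout);   break;
          case '>':  fputs ("&gt;", stdout);   break;
          case '"':  fputs ("&quot;", stdout); break;
          default:   putchar (*s);
          }
    }

    /* Toy stand-in for url_escape_unsafe_and_reserved(): percent-encode
       a deliberately small set of unsafe and reserved characters. */
    static void
    print_url_escaped (const char *s)
    {
      for (; *s; s++)
        if (strchr (" \"#%&<>?", *s))
          printf ("%%%02X", (unsigned char) *s);
        else
          putchar (*s);
    }

    int
    main (void)
    {
      const char *name = "report #1 & notes.txt";

      fputs ("<a href=\"ftp://host/", stdout);
      print_url_escaped (name);     /* inside the href attribute */
      fputs ("\">", stdout);
      print_html_escaped (name);    /* the visible link text */
      fputs ("</a>\n", stdout);
      return 0;
    }

With the toy rules above this prints
<a href="ftp://host/report%20%231%20%26%20notes.txt">report #1 &amp; notes.txt</a>,
which is the shape the patched ftp_index() now produces for each listing entry.
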
+ +2008-11-07 Steven Schubiger + + * run-px: Use some colors for the summary part of the test + output to strengthen the distinction between a successful + or failing run. + +2008-11-06 Steven Schubiger + + * run-px: When executing test scripts, invoke them with the + current perl executable name as determined by env. + +2008-11-06 Micah Cowan + + * run-px: Use strict (thanks Steven Schubiger!). + 2008-09-09 Micah Cowan * Test-idn-cmd.px: Added. diff --git a/tests/FTPServer.pm b/tests/FTPServer.pm index 8c7cada7..edeb69dd 100644 --- a/tests/FTPServer.pm +++ b/tests/FTPServer.pm @@ -1,11 +1,10 @@ -#!/usr/bin/perl -w - # Part of this code was borrowed from Richard Jones's Net::FTPServer # http://www.annexia.org/freeware/netftpserver package FTPServer; use strict; +use warnings; use Cwd; use Socket; @@ -20,36 +19,36 @@ my $GOT_SIGURG = 0; # connection states my %_connection_states = ( - 'NEWCONN' => 0x01, - 'WAIT4PWD' => 0x02, + 'NEWCONN' => 0x01, + 'WAIT4PWD' => 0x02, 'LOGGEDIN' => 0x04, 'TWOSOCKS' => 0x08, ); # subset of FTP commands supported by these server and the respective # connection states in which they are allowed -my %_commands = ( +my %_commands = ( # Standard commands from RFC 959. 'CWD' => $_connection_states{LOGGEDIN} | - $_connection_states{TWOSOCKS}, + $_connection_states{TWOSOCKS}, # 'EPRT' => $_connection_states{LOGGEDIN}, -# 'EPSV' => $_connection_states{LOGGEDIN}, - 'LIST' => $_connection_states{TWOSOCKS}, +# 'EPSV' => $_connection_states{LOGGEDIN}, + 'LIST' => $_connection_states{TWOSOCKS}, # 'LPRT' => $_connection_states{LOGGEDIN}, -# 'LPSV' => $_connection_states{LOGGEDIN}, - 'PASS' => $_connection_states{WAIT4PWD}, - 'PASV' => $_connection_states{LOGGEDIN}, - 'PORT' => $_connection_states{LOGGEDIN}, +# 'LPSV' => $_connection_states{LOGGEDIN}, + 'PASS' => $_connection_states{WAIT4PWD}, + 'PASV' => $_connection_states{LOGGEDIN}, + 'PORT' => $_connection_states{LOGGEDIN}, 'PWD' => $_connection_states{LOGGEDIN} | - $_connection_states{TWOSOCKS}, + $_connection_states{TWOSOCKS}, 'QUIT' => $_connection_states{LOGGEDIN} | - $_connection_states{TWOSOCKS}, - 'REST' => $_connection_states{TWOSOCKS}, - 'RETR' => $_connection_states{TWOSOCKS}, + $_connection_states{TWOSOCKS}, + 'REST' => $_connection_states{TWOSOCKS}, + 'RETR' => $_connection_states{TWOSOCKS}, 'SYST' => $_connection_states{LOGGEDIN}, 'TYPE' => $_connection_states{LOGGEDIN} | $_connection_states{TWOSOCKS}, - 'USER' => $_connection_states{NEWCONN}, + 'USER' => $_connection_states{NEWCONN}, # From ftpexts Internet Draft. 'SIZE' => $_connection_states{LOGGEDIN} | $_connection_states{TWOSOCKS}, @@ -76,7 +75,7 @@ sub _CWD_command my @elems = split /\//, $path; foreach (@elems) { - if ($_ eq "" || $_ eq ".") { + if ($_ eq "" || $_ eq ".") { # Ignore these. next; } elsif ($_ eq "..") { @@ -117,7 +116,7 @@ sub _LIST_command $dir = "/"; $path =~ s,^/+,,; } - + # Parse the first elements of the path until we find the appropriate # working directory. my @elems = split /\//, $path; @@ -142,10 +141,10 @@ sub _LIST_command } $dir .= $_; } else { # It's the last element: check if it's a file, directory or wildcard. - if (-f $conn->{rootdir} . $dir . $_) { + if (-f $conn->{rootdir} . $dir . $_) { # It's a file. $filename = $_; - } elsif (-d $conn->{rootdir} . $dir . $_) { + } elsif (-d $conn->{rootdir} . $dir . $_) { # It's a directory. 
$dir .= $_; } elsif (/\*/ || /\?/) { @@ -158,9 +157,9 @@ sub _LIST_command } } } - + print STDERR "_LIST_command - dir is: $dir\n" if $log; - + print {$conn->{socket}} "150 Opening data connection for file listing.\r\n"; # Open a path back to the client. @@ -174,7 +173,7 @@ sub _LIST_command # If the path contains a directory name, extract it so that # we can prefix it to every filename listed. my $prefix = (($filename || $wildcard) && $path =~ /(.*\/).*/) ? $1 : ""; - + print STDERR "_LIST_command - prefix is: $prefix\n" if $log; # OK, we're either listing a full directory, listing a single @@ -191,7 +190,7 @@ sub _LIST_command __list_file ($sock, $prefix . $_); } } - + unless ($sock->close) { print {$conn->{socket}} "550 Error closing data connection: $!\r\n"; return; @@ -208,7 +207,7 @@ sub _PASS_command print STDERR "switching to LOGGEDIN state\n" if $log; $conn->{state} = $_connection_states{LOGGEDIN}; - + if ($conn->{username} eq "anonymous") { print {$conn->{socket}} "202 Anonymous user access is always granted.\r\n"; } else { @@ -219,7 +218,7 @@ sub _PASS_command sub _PASV_command { my ($conn, $cmd, $rest) = @_; - + # Open a listening socket - but don't actually accept on it yet. "0" =~ /(0)/; # Perl 5.7 / IO::Socket::INET bug workaround. my $sock = IO::Socket::INET->new (LocalHost => '127.0.0.1', @@ -246,7 +245,7 @@ sub _PASV_command my $p2 = $sockport % 256; $conn->{state} = $_connection_states{TWOSOCKS}; - + # We only accept connections from localhost. print {$conn->{socket}} "227 Entering Passive Mode (127,0,0,1,$p1,$p2)\r\n"; } @@ -294,7 +293,7 @@ sub _PORT_command sub _PWD_command { my ($conn, $cmd, $rest) = @_; - + # See RFC 959 Appendix II and draft-ietf-ftpext-mlst-11.txt section 6.2.1. my $pathname = $conn->{dir}; $pathname =~ s,/+$,, unless $pathname eq "/"; @@ -306,7 +305,7 @@ sub _PWD_command sub _REST_command { my ($conn, $cmd, $restart_from) = @_; - + unless ($restart_from =~ /^([1-9][0-9]*|0)$/) { print {$conn->{socket}} "501 REST command needs a numeric argument.\r\n"; return; @@ -320,7 +319,7 @@ sub _REST_command sub _RETR_command { my ($conn, $cmd, $path) = @_; - + my $dir = $conn->{dir}; # Absolute path? @@ -336,7 +335,7 @@ sub _RETR_command my $filename = pop @elems; foreach (@elems) { - if ($_ eq "" || $_ eq ".") { + if ($_ eq "" || $_ eq ".") { next # Ignore these. } elsif ($_ eq "..") { # Go to parent directory. @@ -354,14 +353,14 @@ sub _RETR_command unless (defined $filename && length $filename) { print {$conn->{socket}} "550 File or directory not found.\r\n"; - return; + return; } if ($filename eq "." || $filename eq "..") { print {$conn->{socket}} "550 RETR command is not supported on directories.\r\n"; - return; + return; } - + my $fullname = $conn->{rootdir} . $dir . $filename; unless (-f $fullname) { print {$conn->{socket}} "550 RETR command is only supported on plain files.\r\n"; @@ -483,7 +482,7 @@ sub _RETR_command sub _SIZE_command { my ($conn, $cmd, $path) = @_; - + my $dir = $conn->{dir}; # Absolute path? @@ -499,7 +498,7 @@ sub _SIZE_command my $filename = pop @elems; foreach (@elems) { - if ($_ eq "" || $_ eq ".") { + if ($_ eq "" || $_ eq ".") { next # Ignore these. } elsif ($_ eq "..") { # Go to parent directory. @@ -517,12 +516,12 @@ sub _SIZE_command unless (defined $filename && length $filename) { print {$conn->{socket}} "550 File or directory not found.\r\n"; - return; + return; } if ($filename eq "." 
|| $filename eq "..") { print {$conn->{socket}} "550 SIZE command is not supported on directories.\r\n"; - return; + return; } my $fullname = $conn->{rootdir} . $dir . $filename; @@ -551,14 +550,14 @@ sub _SIZE_command sub _SYST_command { my ($conn, $cmd, $dummy) = @_; - + print {$conn->{socket}} "215 UNIX Type: L8\r\n"; } sub _TYPE_command { my ($conn, $cmd, $type) = @_; - + # See RFC 959 section 5.3.2. if ($type =~ /^([AI])$/i) { $conn->{type} = 'A'; @@ -583,7 +582,7 @@ sub _USER_command print STDERR "switching to WAIT4PWD state\n" if $log; $conn->{state} = $_connection_states{WAIT4PWD}; - + if ($conn->{username} eq "anonymous") { print {$conn->{socket}} "230 Anonymous user access granted.\r\n"; } else { @@ -709,11 +708,11 @@ sub __get_file_list my @allfiles = readdir DIRHANDLE; my @filenames = (); - + if ($wildcard) { # Get rid of . and .. @allfiles = grep !/^\.{1,2}$/, @allfiles; - + # Convert wildcard to a regular expression. $wildcard = __wildcard_to_regex ($wildcard); @@ -752,7 +751,7 @@ sub __wildcard_to_regex _reuseAddr => 1, _rootDir => Cwd::getcwd(), ); - + sub _default_for { my ($self, $attr) = @_; @@ -795,7 +794,7 @@ sub new { } -sub run +sub run { my ($self, $synch_callback) = @_; my $initialized = 0; @@ -823,11 +822,11 @@ sub run # the accept loop while (my $client_addr = accept (my $socket, $server_sock)) - { + { # turn buffering off on $socket select((select($socket), $|=1)[0]); - - # find out who connected + + # find out who connected my ($client_port, $client_ip) = sockaddr_in ($client_addr); my $client_ipnum = inet_ntoa ($client_ip); @@ -845,8 +844,8 @@ sub run if (1) { # Child process. # install signals - $SIG{URG} = sub { - $GOT_SIGURG = 1; + $SIG{URG} = sub { + $GOT_SIGURG = 1; }; $SIG{PIPE} = sub { @@ -858,7 +857,7 @@ sub run print STDERR "Connection idle timeout expired. Closing server.\n"; exit; }; - + #$SIG{CHLD} = 'IGNORE'; @@ -872,7 +871,7 @@ sub run 'idle_timeout' => 60, # 1 minute timeout 'rootdir' => $self->{_rootDir}, }; - + print {$conn->{socket}} "220 GNU Wget Testing FTP Server ready.\r\n"; # command handling loop @@ -913,7 +912,7 @@ sub run print {$conn->{socket}} "530 Not logged in.\r\n"; next; } - + # Handle the QUIT command specially. if ($cmd eq "QUIT") { print {$conn->{socket}} "221 Goodbye. 
Service closing connection.\r\n"; @@ -926,7 +925,7 @@ sub run } else { # Father close $socket; } - } + } $/ = $old_ils; } diff --git a/tests/FTPTest.pm b/tests/FTPTest.pm index eed2eb89..81b8b008 100644 --- a/tests/FTPTest.pm +++ b/tests/FTPTest.pm @@ -1,8 +1,7 @@ -#!/usr/bin/perl -w - package FTPTest; use strict; +use warnings; use FTPServer; use WgetTest; @@ -14,7 +13,7 @@ my $VERSION = 0.01; { my %_attr_data = ( # DEFAULT ); - + sub _default_for { my ($self, $attr) = @_; @@ -28,7 +27,7 @@ my $VERSION = 0.01; ($self->SUPER::_standard_keys(), keys %_attr_data); } } - + sub _setup_server { my $self = shift; diff --git a/tests/HTTPServer.pm b/tests/HTTPServer.pm index 01c36957..5252b5b8 100644 --- a/tests/HTTPServer.pm +++ b/tests/HTTPServer.pm @@ -1,8 +1,7 @@ -#!/usr/bin/perl -w - package HTTPServer; use strict; +use warnings; use HTTP::Daemon; use HTTP::Status; @@ -23,7 +22,7 @@ sub run { if (!$initialized) { $synch_callback->(); $initialized = 1; - } + } my $con = $self->accept(); print STDERR "Accepted a new connection\n" if $log; while (my $req = $con->get_request) { @@ -46,14 +45,14 @@ sub run { if (exists($urls->{$url_path})) { print STDERR "Serving requested URL: ", $url_path, "\n" if $log; next unless ($req->method eq "HEAD" || $req->method eq "GET"); - + my $url_rec = $urls->{$url_path}; $self->send_response($req, $url_rec, $con); } else { print STDERR "Requested wrong URL: ", $url_path, "\n" if $log; $con->send_error($HTTP::Status::RC_FORBIDDEN); last; - } + } } print STDERR "Closing connection\n" if $log; $con->close; @@ -145,8 +144,7 @@ sub handle_auth { my $authhdr = $req->header('Authorization'); # Have we sent the challenge yet? - unless (defined $url_rec->{auth_challenged} - && $url_rec->{auth_challenged}) { + unless ($url_rec->{auth_challenged} || $url_rec->{auth_no_challenge}) { # Since we haven't challenged yet, we'd better not # have received authentication (for our testing purposes). if ($authhdr) { @@ -167,6 +165,9 @@ sub handle_auth { # failed it. $code = 400; $msg = "You didn't send auth after I sent challenge"; + if ($url_rec->{auth_no_challenge}) { + $msg = "--auth-no-challenge but no auth sent." 
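                # When a url_rec sets auth_no_challenge => 1, this test server
                # never issues a 401 challenge first, so the client is expected
                # to send credentials with its very first request -- the
                # behaviour wget provides via --auth-no-challenge.  If no
                # Authorization header arrives at all, the request fails with
                # the 400 code set above.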
+ } } else { my ($sent_method) = ($authhdr =~ /^(\S+)/g); unless ($sent_method eq $url_rec->{'auth_method'}) { diff --git a/tests/HTTPTest.pm b/tests/HTTPTest.pm index 0fdcb8f0..883213d1 100644 --- a/tests/HTTPTest.pm +++ b/tests/HTTPTest.pm @@ -1,8 +1,7 @@ -#!/usr/bin/perl -w - package HTTPTest; use strict; +use warnings; use HTTPServer; use WgetTest; @@ -14,7 +13,7 @@ my $VERSION = 0.01; { my %_attr_data = ( # DEFAULT ); - + sub _default_for { my ($self, $attr) = @_; @@ -22,13 +21,13 @@ my $VERSION = 0.01; return $self->SUPER::_default_for($attr); } - sub _standard_keys + sub _standard_keys { my ($self) = @_; ($self->SUPER::_standard_keys(), keys %_attr_data); } } - + sub _setup_server { my $self = shift; diff --git a/tests/Test--no-content-disposition-trivial.px b/tests/Test--no-content-disposition-trivial.px index 6a5b1def..43eb7bfd 100755 --- a/tests/Test--no-content-disposition-trivial.px +++ b/tests/Test--no-content-disposition-trivial.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test--no-content-disposition.px b/tests/Test--no-content-disposition.px index 4975913b..7736a2e5 100755 --- a/tests/Test--no-content-disposition.px +++ b/tests/Test--no-content-disposition.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test--spider-fail.px b/tests/Test--spider-fail.px index b30ef755..6e5c976d 100755 --- a/tests/Test--spider-fail.px +++ b/tests/Test--spider-fail.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test--spider-r--no-content-disposition-trivial.px b/tests/Test--spider-r--no-content-disposition-trivial.px index 1e850d40..0bd7d29e 100755 --- a/tests/Test--spider-r--no-content-disposition-trivial.px +++ b/tests/Test--spider-r--no-content-disposition-trivial.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test--spider-r--no-content-disposition.px b/tests/Test--spider-r--no-content-disposition.px index 4eba8579..78beb18d 100755 --- a/tests/Test--spider-r--no-content-disposition.px +++ b/tests/Test--spider-r--no-content-disposition.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test--spider-r-HTTP-Content-Disposition.px b/tests/Test--spider-r-HTTP-Content-Disposition.px index 09f93fa3..e79152f7 100755 --- a/tests/Test--spider-r-HTTP-Content-Disposition.px +++ b/tests/Test--spider-r-HTTP-Content-Disposition.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test--spider-r.px b/tests/Test--spider-r.px index a315d974..b32f792d 100755 --- a/tests/Test--spider-r.px +++ b/tests/Test--spider-r.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test--spider.px b/tests/Test--spider.px index dbc97135..6e8ba499 100755 --- a/tests/Test--spider.px +++ b/tests/Test--spider.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-E-k-K.px b/tests/Test-E-k-K.px index d71c39e5..4a2cf614 100755 --- a/tests/Test-E-k-K.px +++ b/tests/Test-E-k-K.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-E-k.px b/tests/Test-E-k.px index 4581ed71..40d6b6dc 100755 --- a/tests/Test-E-k.px +++ b/tests/Test-E-k.px @@ -1,6 
+1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-HTTP-Content-Disposition-1.px b/tests/Test-HTTP-Content-Disposition-1.px index 01fb0901..3d270143 100755 --- a/tests/Test-HTTP-Content-Disposition-1.px +++ b/tests/Test-HTTP-Content-Disposition-1.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-HTTP-Content-Disposition-2.px b/tests/Test-HTTP-Content-Disposition-2.px index 46c16a17..6550d36f 100755 --- a/tests/Test-HTTP-Content-Disposition-2.px +++ b/tests/Test-HTTP-Content-Disposition-2.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-HTTP-Content-Disposition.px b/tests/Test-HTTP-Content-Disposition.px index 3b6eb2c9..afc964a4 100755 --- a/tests/Test-HTTP-Content-Disposition.px +++ b/tests/Test-HTTP-Content-Disposition.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-N--no-content-disposition-trivial.px b/tests/Test-N--no-content-disposition-trivial.px index c58f451a..83f0e4ed 100755 --- a/tests/Test-N--no-content-disposition-trivial.px +++ b/tests/Test-N--no-content-disposition-trivial.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-N--no-content-disposition.px b/tests/Test-N--no-content-disposition.px index 78fe522f..f142d306 100755 --- a/tests/Test-N--no-content-disposition.px +++ b/tests/Test-N--no-content-disposition.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-N-HTTP-Content-Disposition.px b/tests/Test-N-HTTP-Content-Disposition.px index 32f87710..d33155e4 100755 --- a/tests/Test-N-HTTP-Content-Disposition.px +++ b/tests/Test-N-HTTP-Content-Disposition.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-N-current.px b/tests/Test-N-current.px index b8e05a94..0ef47289 100755 --- a/tests/Test-N-current.px +++ b/tests/Test-N-current.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-N-no-info.px b/tests/Test-N-no-info.px index 301a9101..9dec6eda 100755 --- a/tests/Test-N-no-info.px +++ b/tests/Test-N-no-info.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-N-old.px b/tests/Test-N-old.px index 6ae116e5..fe16dbfb 100755 --- a/tests/Test-N-old.px +++ b/tests/Test-N-old.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-N-smaller.px b/tests/Test-N-smaller.px index 71e34d96..e5dceae6 100755 --- a/tests/Test-N-smaller.px +++ b/tests/Test-N-smaller.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-N.px b/tests/Test-N.px index 2e235e08..2f139b51 100755 --- a/tests/Test-N.px +++ b/tests/Test-N.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-O--no-content-disposition-trivial.px b/tests/Test-O--no-content-disposition-trivial.px index 501fd44d..75a3e6f1 100755 --- a/tests/Test-O--no-content-disposition-trivial.px +++ b/tests/Test-O--no-content-disposition-trivial.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff 
--git a/tests/Test-O--no-content-disposition.px b/tests/Test-O--no-content-disposition.px index 592f0fec..3369ec42 100755 --- a/tests/Test-O--no-content-disposition.px +++ b/tests/Test-O--no-content-disposition.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-O-HTTP-Content-Disposition.px b/tests/Test-O-HTTP-Content-Disposition.px index 934f54aa..e18880a7 100755 --- a/tests/Test-O-HTTP-Content-Disposition.px +++ b/tests/Test-O-HTTP-Content-Disposition.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-O-nc.px b/tests/Test-O-nc.px index 08819e4b..530ac654 100755 --- a/tests/Test-O-nc.px +++ b/tests/Test-O-nc.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-O-nonexisting.px b/tests/Test-O-nonexisting.px index 89744fc8..60ef7c70 100755 --- a/tests/Test-O-nonexisting.px +++ b/tests/Test-O-nonexisting.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-O.px b/tests/Test-O.px index 1f4e8efe..552c6654 100755 --- a/tests/Test-O.px +++ b/tests/Test-O.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-Restrict-Lowercase.px b/tests/Test-Restrict-Lowercase.px index 2b35f1e4..e5d270dc 100755 --- a/tests/Test-Restrict-Lowercase.px +++ b/tests/Test-Restrict-Lowercase.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-Restrict-Uppercase.px b/tests/Test-Restrict-Uppercase.px index 14fa81f4..1175fbd2 100755 --- a/tests/Test-Restrict-Uppercase.px +++ b/tests/Test-Restrict-Uppercase.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-auth-basic.px b/tests/Test-auth-basic.px index 75013609..e60be4c7 100755 --- a/tests/Test-auth-basic.px +++ b/tests/Test-auth-basic.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-auth-no-challenge-url.px b/tests/Test-auth-no-challenge-url.px new file mode 100755 index 00000000..4b947ed2 --- /dev/null +++ b/tests/Test-auth-no-challenge-url.px @@ -0,0 +1,50 @@ +#!/usr/bin/perl + +use strict; +use warnings; + +use HTTPTest; + + +############################################################################### + +my $wholefile = "You're all authenticated.\n"; + +# code, msg, headers, content +my %urls = ( + '/needs-auth.txt' => { + auth_no_challenge => 1, + auth_method => 'Basic', + user => 'fiddle-dee-dee', + passwd => 'Dodgson', + code => "200", + msg => "You want fries with that?", + headers => { + "Content-type" => "text/plain", + }, + content => $wholefile, + }, +); + +my $cmdline = $WgetTest::WGETPATH . " --auth-no-challenge " + . 
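          # The credentials travel in the URL itself here (user fiddle-dee-dee,
          # password Dodgson); combined with --auth-no-challenge above, wget
          # sends them as Basic auth with the first request instead of waiting
          # for a 401, which is what the auth_no_challenge entry in %urls tells
          # the test server to expect.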
"http://fiddle-dee-dee:Dodgson\@localhost:{{port}}/needs-auth.txt"; + +my $expected_error_code = 0; + +my %expected_downloaded_files = ( + 'needs-auth.txt' => { + content => $wholefile, + }, +); + +############################################################################### + +my $the_test = HTTPTest->new (name => "Test-auth-no-challenge-url", + input => \%urls, + cmdline => $cmdline, + errcode => $expected_error_code, + output => \%expected_downloaded_files); +exit $the_test->run(); + +# vim: et ts=4 sw=4 + diff --git a/tests/Test-auth-no-challenge.px b/tests/Test-auth-no-challenge.px new file mode 100755 index 00000000..ec322844 --- /dev/null +++ b/tests/Test-auth-no-challenge.px @@ -0,0 +1,51 @@ +#!/usr/bin/perl + +use strict; +use warnings; + +use HTTPTest; + + +############################################################################### + +my $wholefile = "You're all authenticated.\n"; + +# code, msg, headers, content +my %urls = ( + '/needs-auth.txt' => { + auth_no_challenge => 1, + auth_method => 'Basic', + user => 'fiddle-dee-dee', + passwd => 'Dodgson', + code => "200", + msg => "You want fries with that?", + headers => { + "Content-type" => "text/plain", + }, + content => $wholefile, + }, +); + +my $cmdline = $WgetTest::WGETPATH . " --auth-no-challenge" + . " --user=fiddle-dee-dee --password=Dodgson" + . " http://localhost:{{port}}/needs-auth.txt"; + +my $expected_error_code = 0; + +my %expected_downloaded_files = ( + 'needs-auth.txt' => { + content => $wholefile, + }, +); + +############################################################################### + +my $the_test = HTTPTest->new (name => "Test-auth-no-challenge", + input => \%urls, + cmdline => $cmdline, + errcode => $expected_error_code, + output => \%expected_downloaded_files); +exit $the_test->run(); + +# vim: et ts=4 sw=4 + diff --git a/tests/Test-c-full.px b/tests/Test-c-full.px index f277a023..2d107453 100755 --- a/tests/Test-c-full.px +++ b/tests/Test-c-full.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-c-partial.px b/tests/Test-c-partial.px index 02234242..57095472 100755 --- a/tests/Test-c-partial.px +++ b/tests/Test-c-partial.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-c-shorter.px b/tests/Test-c-shorter.px index 432cab92..9823e746 100755 --- a/tests/Test-c-shorter.px +++ b/tests/Test-c-shorter.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-c.px b/tests/Test-c.px index 8c61eb06..2fb705f1 100755 --- a/tests/Test-c.px +++ b/tests/Test-c.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-ftp-iri-disabled.px b/tests/Test-ftp-iri-disabled.px index 14d849da..96122867 100755 --- a/tests/Test-ftp-iri-disabled.px +++ b/tests/Test-ftp-iri-disabled.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use FTPTest; diff --git a/tests/Test-ftp-iri-fallback.px b/tests/Test-ftp-iri-fallback.px index 8902e0f9..091fd008 100755 --- a/tests/Test-ftp-iri-fallback.px +++ b/tests/Test-ftp-iri-fallback.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use FTPTest; diff --git a/tests/Test-ftp-iri.px b/tests/Test-ftp-iri.px index d453669c..78e2622c 100755 --- a/tests/Test-ftp-iri.px +++ b/tests/Test-ftp-iri.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use 
warnings; use FTPTest; diff --git a/tests/Test-ftp.px b/tests/Test-ftp.px index 8cef5a55..a98d745f 100755 --- a/tests/Test-ftp.px +++ b/tests/Test-ftp.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use FTPTest; diff --git a/tests/Test-idn-cmd.px b/tests/Test-idn-cmd.px index a5c156a2..dba98183 100755 --- a/tests/Test-idn-cmd.px +++ b/tests/Test-idn-cmd.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-idn-headers.px b/tests/Test-idn-headers.px index 3289d5f5..f07621c3 100755 --- a/tests/Test-idn-headers.px +++ b/tests/Test-idn-headers.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-idn-meta.px b/tests/Test-idn-meta.px index 1397cf45..3d6e0563 100755 --- a/tests/Test-idn-meta.px +++ b/tests/Test-idn-meta.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-iri-disabled.px b/tests/Test-iri-disabled.px index 17e43361..02fc4d3a 100755 --- a/tests/Test-iri-disabled.px +++ b/tests/Test-iri-disabled.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-iri-forced-remote.px b/tests/Test-iri-forced-remote.px index 1acd03a7..8341d516 100755 --- a/tests/Test-iri-forced-remote.px +++ b/tests/Test-iri-forced-remote.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-iri-list.px b/tests/Test-iri-list.px index 51bb09fe..87cc33c8 100755 --- a/tests/Test-iri-list.px +++ b/tests/Test-iri-list.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-iri.px b/tests/Test-iri.px index ca6feddf..738c304a 100755 --- a/tests/Test-iri.px +++ b/tests/Test-iri.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-nonexisting-quiet.px b/tests/Test-nonexisting-quiet.px index 2766b5c5..04e11587 100755 --- a/tests/Test-nonexisting-quiet.px +++ b/tests/Test-nonexisting-quiet.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-noop.px b/tests/Test-noop.px index 14bd851c..1e0d1871 100755 --- a/tests/Test-noop.px +++ b/tests/Test-noop.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-np.px b/tests/Test-np.px index 28d13eec..f674193a 100755 --- a/tests/Test-np.px +++ b/tests/Test-np.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/Test-proxied-https-auth.px b/tests/Test-proxied-https-auth.px index a2efe5eb..4e3fb206 100755 --- a/tests/Test-proxied-https-auth.px +++ b/tests/Test-proxied-https-auth.px @@ -1,6 +1,7 @@ #!/usr/bin/perl -use warnings; + use strict; +use warnings; use WgetTest; # For $WGETPATH. diff --git a/tests/Test-proxy-auth-basic.px b/tests/Test-proxy-auth-basic.px index e3934d7d..033ce039 100755 --- a/tests/Test-proxy-auth-basic.px +++ b/tests/Test-proxy-auth-basic.px @@ -1,6 +1,7 @@ -#!/usr/bin/perl -w +#!/usr/bin/perl use strict; +use warnings; use HTTPTest; diff --git a/tests/WgetTest.pm.in b/tests/WgetTest.pm.in index 910ac80d..2e124e3c 100644 --- a/tests/WgetTest.pm.in +++ b/tests/WgetTest.pm.in @@ -1,5 +1,3 @@ -#!/usr/bin/perl -w - # WARNING! # WgetTest.pm is a generated file! Do not edit! 
Edit WgetTest.pm.in # instead. @@ -8,6 +6,7 @@ package WgetTest; $VERSION = 0.01; use strict; +use warnings; use Cwd; use File::Path; @@ -26,14 +25,14 @@ my @unexpected_downloads = (); _name => "", _output => {}, ); - + sub _default_for { my ($self, $attr) = @_; $_attr_data{$attr}; } - sub _standard_keys + sub _standard_keys { keys %_attr_data; } @@ -70,29 +69,29 @@ sub new { sub run { my $self = shift; my $result_message = "Test successful.\n"; - + printf "Running test $self->{_name}\n"; - - # Setup + + # Setup $self->_setup(); chdir ("$self->{_workdir}/$self->{_name}/input"); - + # Launch server my $pid = $self->_fork_and_launch_server(); - + # Call wget chdir ("$self->{_workdir}/$self->{_name}/output"); my $cmdline = $self->{_cmdline}; $cmdline = $self->_substitute_port($cmdline); print "Calling $cmdline\n"; - my $errcode = - ($cmdline =~ m{^/.*}) + my $errcode = + ($cmdline =~ m{^/.*}) ? system ($cmdline) : system ("$self->{_workdir}/../src/$cmdline"); # Shutdown server - # if we didn't explicitely kill the server, we would have to call - # waitpid ($pid, 0) here in order to wait for the child process to + # if we didn't explicitely kill the server, we would have to call + # waitpid ($pid, 0) here in order to wait for the child process to # terminate kill ('TERM', $pid); @@ -124,11 +123,11 @@ sub _setup { chdir ($self->{_name}); mkdir ("input"); mkdir ("output"); - + # Setup existing files chdir ("output"); foreach my $filename (keys %{$self->{_existing}}) { - open (FILE, ">$filename") + open (FILE, ">$filename") or return "Test failed: cannot open pre-existing file $filename\n"; my $file = $self->{_existing}->{$filename}; @@ -141,8 +140,8 @@ sub _setup { utime $file->{timestamp}, $file->{timestamp}, $filename or return "Test failed: cannot set timestamp on pre-existing file $filename\n"; } - } - + } + chdir ("../input"); $self->_setup_server(); @@ -162,15 +161,15 @@ sub _verify_download { my $self = shift; chdir ("$self->{_workdir}/$self->{_name}/output"); - + # use slurp mode to read file content my $old_input_record_separator = $/; undef $/; - + while (my ($filename, $filedata) = each %{$self->{_output}}) { - open (FILE, $filename) + open (FILE, $filename) or return "Test failed: file $filename not downloaded\n"; - + my $content = ; my $expected_content = $filedata->{'content'}; $expected_content = $self->_substitute_port($expected_content); @@ -181,20 +180,20 @@ sub _verify_download { my ($dev, $ino, $mode, $nlink, $uid, $gid, $rdev, $size, $atime, $mtime, $ctime, $blksize, $blocks) = stat FILE; - $mtime == $filedata->{'timestamp'} + $mtime == $filedata->{'timestamp'} or return "Test failed: wrong timestamp for file $filename\n"; } - + close (FILE); - } - - $/ = $old_input_record_separator; + } + + $/ = $old_input_record_separator; # make sure no unexpected files were downloaded chdir ("$self->{_workdir}/$self->{_name}/output"); __dir_walk('.', sub { push @unexpected_downloads, $_[0] unless (exists $self->{_output}{$_[0]}) }, sub { shift; return @_ } ); - if (@unexpected_downloads) { + if (@unexpected_downloads) { return "Test failed: unexpected downloaded files [" . join(', ', @unexpected_downloads) . 
"]\n"; } @@ -228,7 +227,7 @@ sub __dir_walk { } -sub _fork_and_launch_server +sub _fork_and_launch_server { my $self = shift; @@ -239,7 +238,7 @@ sub _fork_and_launch_server if ($pid < 0) { die "Cannot fork"; } elsif ($pid == 0) { - # child + # child close FROM_CHILD; $self->_launch_server(sub { print TO_PARENT "SYNC\n"; close TO_PARENT }); } else { diff --git a/tests/run-px b/tests/run-px index 38520714..3ab1c444 100755 --- a/tests/run-px +++ b/tests/run-px @@ -1,11 +1,19 @@ #!/usr/bin/env perl + +use 5.006; +use strict; use warnings; +use Term::ANSIColor ':constants'; +$Term::ANSIColor::AUTORESET = 1; + die "Please specify the top source directory.\n" if (!@ARGV); my $top_srcdir = shift @ARGV; my @tests = ( 'Test-auth-basic.px', + 'Test-auth-no-challenge.px', + 'Test-auth-no-challenge-url.px', 'Test-proxy-auth-basic.px', 'Test-proxied-https-auth.px', 'Test-N-HTTP-Content-Disposition.px', @@ -57,26 +65,56 @@ my @tests = ( 'Test--spider-r.px', ); -my @results; +my @tested; -for my $test (@tests) { +foreach my $test (@tests) { print "Running $test\n\n"; - system("$top_srcdir/tests/$test"); - push @results, $?; + system("$^X $top_srcdir/tests/$test"); + push @tested, { name => $test, result => $? }; } -for (my $i=0; $i != @tests; ++$i) { - if ($results[$i] == 0) { - print "pass: "; - } else { - print "FAIL: "; - } - print "$tests[$i]\n"; +print "\n"; +foreach my $test (@tested) { + ($test->{result} == 0) + ? print GREEN 'pass: ' + : print RED 'FAIL: '; + print $test->{name}, "\n"; } +my $count = sub +{ + return { + pass => sub { scalar grep $_->{result} == 0, @tested }, + fail => sub { scalar grep $_->{result} != 0, @tested }, + }->{$_[0]}->(); +}; + +my $summary = sub +{ + my @lines = ( + "${\scalar @tested} tests were run", + "${\$count->('pass')} PASS, ${\$count->('fail')} FAIL", + ); + my $len_longest = sub + { + local $_ = 0; + foreach my $line (@lines) { + if (length $line > $_) { + $_ = length $line; + } + } + return $_; + }->(); + return join "\n", + '=' x $len_longest, + @lines, + '=' x $len_longest; +}->(); + +print "\n"; +print $count->('fail') + ? RED $summary + : GREEN $summary; print "\n"; -print scalar(@results) . " tests were run\n"; -print scalar(grep $_ == 0, @results) . " PASS\n"; -print scalar(grep $_ != 0, @results) . " FAIL\n"; -exit scalar (grep $_ != 0, @results); +exit $count->('fail'); diff --git a/util/freeopts b/util/freeopts new file mode 100755 index 00000000..75f594a1 --- /dev/null +++ b/util/freeopts @@ -0,0 +1,48 @@ +#!/usr/bin/perl -n +# NOTE the use of -n above; this script is called in a loop. 
+use warnings; +use strict; + +our $scanning; +our %used_chars; +BEGIN { + $scanning = 0; + %used_chars = (); + + open STDIN, "../src/main.c" or die "main.c: $!\n"; +} + +if (/^static struct cmdline_option option_data/) { + $scanning = 1; +} +elsif (/[}];/) { + $scanning = 0; +} +elsif ( + $scanning && + /^[\t ]*\{ "[^"]*", '(.)', OPT_[A-Z0-9_]*, / +) { + $used_chars{$1} = 1; +} + +END { + my $cols = 0; + my $max_cols = 13; + my $opt_chars = + "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789"; + print "Free chars:\n\t"; + for (my $i = 0; $i < length $opt_chars; ++$i, ++$cols) { + if ($cols == $max_cols) { + $cols = 0; + print "\n\t"; + } + my $opt = substr($opt_chars,$i,1); + print ' '; + if (!$used_chars{ $opt }) { + print "-$opt"; + } else { + print ' '; + } + } + print "\n"; +} diff --git a/windows/ChangeLog b/windows/ChangeLog index 8890265a..fd8d8157 100644 --- a/windows/ChangeLog +++ b/windows/ChangeLog @@ -1,3 +1,11 @@ +2008-09-09 Gisle Vanem + + * config-compiler.h: MingW do have ; added HAVE_STDINT_H. + Added _CRT_SECURE_NO_WARNINGS to supress warnings in MSVC8+ about + using "old" ANSI-functions. + + * config.h: config-post.h is gone. SIZEOF_LONG_LONG is 8. + 2008-01-25 Micah Cowan * Makefile.am, Makefile.doc, Makefile.src, Makefile.top, diff --git a/windows/config-compiler.h b/windows/config-compiler.h index 3ee0e638..a71fb429 100644 --- a/windows/config-compiler.h +++ b/windows/config-compiler.h @@ -83,6 +83,7 @@ as that of the covered work. */ /* MinGW and GCC support some POSIX and C99 features. */ #define HAVE_INTTYPES_H 1 +#define HAVE_STDINT_H 1 #define HAVE__BOOL 1 #undef SIZEOF_LONG_LONG /* avoid redefinition warning */ @@ -128,6 +129,7 @@ as that of the covered work. */ #if _MSC_VER >= 1400 #pragma warning ( disable : 4996 ) #define _CRT_SECURE_NO_DEPRECATE +#define _CRT_SECURE_NO_WARNINGS #endif diff --git a/windows/config.h b/windows/config.h index 9736b196..20aa7724 100644 --- a/windows/config.h +++ b/windows/config.h @@ -158,7 +158,7 @@ #define SIZEOF_LONG 4 /* The size of a `long long', as computed by sizeof. */ -#define SIZEOF_LONG_LONG 0 +#define SIZEOF_LONG_LONG 8 /* The size of a `off_t', as computed by sizeof. */ #define SIZEOF_OFF_T 4 @@ -214,5 +214,3 @@ /* Include compiler-specific defines. */ #include "config-compiler.h" -#include "config-post.h" -
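
The new test scripts in this patch all follow the same shape, and adding
another one is mostly boilerplate: create a tests/Test-*.px file that
describes the URLs the stub server should serve, the wget command line, and
the files expected afterwards, then append the script's name to the @tests
list in tests/run-px (as this patch does for the two auth-no-challenge
tests).  A minimal sketch, modelled on the scripts shown above, with purely
illustrative names and content:

    #!/usr/bin/perl

    use strict;
    use warnings;

    use HTTPTest;

    # One served URL and its canned response.
    my $body = "Hello, world.\n";
    my %urls = (
        '/hello.txt' => {
            code    => "200",
            msg     => "Ok",
            headers => { "Content-type" => "text/plain" },
            content => $body,
        },
    );

    # {{port}} is replaced by the harness with the stub server's real port.
    my $cmdline = $WgetTest::WGETPATH . " http://localhost:{{port}}/hello.txt";

    my %expected_downloaded_files = (
        'hello.txt' => { content => $body },
    );

    my $the_test = HTTPTest->new (name    => "Test-hello",
                                  input   => \%urls,
                                  cmdline => $cmdline,
                                  errcode => 0,
                                  output  => \%expected_downloaded_files);
    exit $the_test->run();

Once listed in @tests, run-px executes the script through $^X, i.e. the same
perl interpreter that is running the suite itself, and folds its exit status
into the coloured pass/fail summary added above.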