This is Info file wget.info, produced by Makeinfo version 1.68 from the
input file ./wget.texi.

INFO-DIR-SECTION Net Utilities
INFO-DIR-SECTION World Wide Web
START-INFO-DIR-ENTRY
* Wget: (wget).         The non-interactive network downloader.
END-INFO-DIR-ENTRY

This file documents the GNU Wget utility for downloading network
data.

Copyright (C) 1996, 1997, 1998, 2000 Free Software Foundation, Inc.

Permission is granted to make and distribute verbatim copies of this
manual provided the copyright notice and this permission notice are
preserved on all copies.

Permission is granted to copy and distribute modified versions of
this manual under the conditions for verbatim copying, provided also
that the sections entitled "Copying" and "GNU General Public License"
are included exactly as in the original, and provided that the entire
resulting derived work is distributed under the terms of a permission
notice identical to this one.
File: wget.info,  Node: FTP Time-Stamping Internals,  Prev: HTTP Time-Stamping Internals,  Up: Time-Stamping

FTP Time-Stamping Internals
===========================
In theory, FTP time-stamping works much the same as HTTP, only FTP
has no headers--time-stamps must be received from the directory
listings.

For each directory from which files are to be retrieved, Wget will
use the `LIST' command to get the listing. It will try to analyze the
listing, assuming that it is a Unix `ls -l' listing, and extract the
time-stamps. The rest is exactly the same as for HTTP.
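For example, a time-stamped retrieval over FTP is requested with `-N',
just as for HTTP (the host and path below are illustrative, borrowed
from other examples in this manual):

     wget -N ftp://ftp.xemacs.org/pub/xemacs/README

Wget will fetch the listing of `/pub/xemacs', extract the time-stamp
of `README' from it, and download the file only if the remote copy is
newer or of a different size.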
The assumption that every directory listing is a Unix-style listing
may sound extremely constraining, but in practice it is not, as many
non-Unix FTP servers use the Unixoid listing format because most (all?)
of the clients understand it. Bear in mind that RFC959 defines no
standard way to get a file list, let alone the time-stamps. We can
only hope that a future standard will define this.
Another non-standard solution is the `MDTM' command, supported by
some FTP servers (including the popular `wu-ftpd'), which returns the
exact time of the specified file. Wget may support this command in the
future.
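As an illustration, such an exchange might look like the following
(the reply format shown is the one commonly used by servers; exact
behavior varies, and Wget does not currently issue this command):

     ==> MDTM welcome.msg
     213 19980729102030

The `213' reply carries the modification time of the file as
`YYYYMMDDhhmmss'.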
File: wget.info,  Node: Startup File,  Next: Examples,  Prev: Time-Stamping,  Up: Top

Startup File
************

Once you know how to change default settings of Wget through
command-line arguments, you may wish to make some of those settings
permanent. You can do that in a convenient way by creating the Wget
startup file--`.wgetrc'.

Besides `.wgetrc' being the "main" initialization file, it is
convenient to have a special facility for storing passwords. Thus Wget
reads and interprets the contents of `$HOME/.netrc', if it finds it.
You can find the description of the `.netrc' format in your system
manuals.
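For instance, a minimal `.netrc' entry might look like this (host name
and credentials are illustrative, reused from the examples later in
this manual):

     machine jagor.srce.hr
     login hniksic
     password mypassword

Since `.netrc' stores passwords in plain text, it should be readable
only by you, e.g. `chmod 600 $HOME/.netrc'.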
Wget reads `.wgetrc' upon startup, recognizing a limited set of
commands.

* Menu:

* Wgetrc Location::   Location of various wgetrc files.
* Wgetrc Syntax::     Syntax of wgetrc.
* Wgetrc Commands::   List of available commands.
* Sample Wgetrc::     A wgetrc example.
File: wget.info,  Node: Wgetrc Location,  Next: Wgetrc Syntax,  Prev: Startup File,  Up: Startup File

Wgetrc Location
---------------

When initializing, Wget will look for a "global" startup file,
`/usr/local/etc/wgetrc' by default (or some prefix other than
`/usr/local', if Wget was not installed there) and read commands from
there, if it exists.

Then it will look for the user's file. If the environment variable
`WGETRC' is set, Wget will try to load that file. Failing that, no
further attempts will be made.

If `WGETRC' is not set, Wget will try to load `$HOME/.wgetrc'.

The fact that the user's settings are loaded after the system-wide
ones means that in case of collision the user's wgetrc *overrides* the
system-wide wgetrc (in `/usr/local/etc/wgetrc' by default). Fascist
admins, beware!
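As a sketch of how this lookup order can be exploited, a one-off run
with an alternative startup file might look like this (the file name
is illustrative):

     WGETRC=$HOME/wgetrc-mirror wget -m ftp://ftp.xemacs.org/pub/xemacs/

Here `$HOME/.wgetrc' is not consulted at all, since `WGETRC' takes its
place; the global wgetrc is still read first.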
File: wget.info,  Node: Wgetrc Syntax,  Next: Wgetrc Commands,  Prev: Wgetrc Location,  Up: Startup File

Wgetrc Syntax
-------------

The syntax of a wgetrc command is simple:

     variable = value

The "variable" will also be called "command". Valid "values" are
different for different commands.

The commands are case-insensitive and underscore-insensitive. Thus
`DIr__PrefiX' is the same as `dirprefix'. Empty lines, lines beginning
with `#' and lines containing white-space only are discarded.
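To make the case- and underscore-insensitivity concrete, the following
lines (with an illustrative value) would all set the same command:

     dir_prefix = /tmp
     DIr__PrefiX = /tmp
     dirprefix = /tmp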
Commands that expect a comma-separated list will clear the list on an
empty command. So, if you wish to reset the rejection list specified in
a global `wgetrc', you can do it with:

     reject =
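For example, assuming the global `wgetrc' sets a rejection list (the
value shown is illustrative), a user can undo it like this:

     # In /usr/local/etc/wgetrc:
     reject = gif,jpg

     # In $HOME/.wgetrc, clear the list again:
     reject =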
File: wget.info,  Node: Wgetrc Commands,  Next: Sample Wgetrc,  Prev: Wgetrc Syntax,  Up: Startup File

Wgetrc Commands
---------------

The complete set of commands is listed below, the letter after `='
denoting the value the command takes. It is `on/off' for `on' or `off'
(which can also be `1' or `0'), STRING for any non-empty string or N
for a positive integer. For example, you may specify `use_proxy = off'
to disable use of proxy servers by default. You may use `inf' for
infinite values, where appropriate.

Most of the commands have their equivalent command-line option
(*Note Invoking::), except some more obscure or rarely used ones.
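Putting the value types together, a few illustrative wgetrc lines
might read:

     # An on/off value (`1' and `0' also work):
     use_proxy = off
     # A STRING value:
     dot_style = binary
     # An N value; `inf' is accepted where appropriate:
     tries = inf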
accept/reject = STRING
     Same as `-A'/`-R' (*Note Types of Files::).

add_hostdir = on/off
     Enable/disable host-prefixed file names. `-nH' disables it.

always_rest = on/off
     Enable/disable continuation of the retrieval, the same as `-c'
     (which enables it).

background = on/off
     Enable/disable going to background, the same as `-b' (which
     enables it).

backup_converted = on/off
     Enable/disable saving pre-converted files with the suffix
     `.orig'--the same as `-K' (which enables it).

base = STRING
     Set base for relative URLs, the same as `-B'.

cache = on/off
     When set to off, disallow server-caching. See the `-C' option.

convert_links = on/off
     Convert non-relative links locally. The same as `-k'.

cut_dirs = N
     Ignore N remote directory components.

debug = on/off
     Debug mode, same as `-d'.

delete_after = on/off
     Delete after download, the same as `--delete-after'.

dir_prefix = STRING
     Top of directory tree, the same as `-P'.

dirstruct = on/off
     Turning dirstruct on or off, the same as `-x' or `-nd',
     respectively.

domains = STRING
     Same as `-D' (*Note Domain Acceptance::).

dot_bytes = N
     Specify the number of bytes "contained" in a dot, as seen
     throughout the retrieval (1024 by default). You can postfix the
     value with `k' or `m', representing kilobytes and megabytes,
     respectively. With dot settings you can tailor the dot retrieval
     to suit your needs, or you can use the predefined "styles" (*Note
     Download Options::).

dots_in_line = N
     Specify the number of dots that will be printed in each line
     throughout the retrieval (50 by default).

dot_spacing = N
     Specify the number of dots in a single cluster (10 by default).

dot_style = STRING
     Specify the dot retrieval "style", as with `--dot-style'.
exclude_directories = STRING
     Specify a comma-separated list of directories you wish to exclude
     from download, the same as `-X' (*Note Directory-Based Limits::).

exclude_domains = STRING
     Same as `--exclude-domains' (*Note Domain Acceptance::).

follow_ftp = on/off
     Follow FTP links from HTML documents, the same as `-f'.

follow_tags = STRING
     Only follow certain HTML tags when doing a recursive retrieval,
     just like `--follow-tags'.

force_html = on/off
     If set to on, force the input filename to be regarded as an HTML
     document, the same as `-F'.

ftp_proxy = STRING
     Use STRING as FTP proxy, instead of the one specified in the
     environment.

glob = on/off
     Turn globbing on/off, the same as `-g'.

header = STRING
     Define an additional header, like `--header'.

http_passwd = STRING
     Set HTTP password.

http_proxy = STRING
     Use STRING as HTTP proxy, instead of the one specified in the
     environment.

http_user = STRING
     Set HTTP user to STRING.

ignore_length = on/off
     When set to on, ignore `Content-Length' header; the same as
     `--ignore-length'.

ignore_tags = STRING
     Ignore certain HTML tags when doing a recursive retrieval, just
     like `-G' / `--ignore-tags'.

include_directories = STRING
     Specify a comma-separated list of directories you wish to follow
     when downloading, the same as `-I'.

input = STRING
     Read the URLs from STRING, like `-i'.

kill_longer = on/off
     Consider data longer than specified in content-length header as
     invalid (and retry getting it). The default behaviour is to save
     as much data as there is, provided there is more than or equal to
     the value in `Content-Length'.

logfile = STRING
     Set logfile, the same as `-o'.

login = STRING
     Your user name on the remote machine, for FTP. Defaults to
     `anonymous'.

mirror = on/off
     Turn mirroring on/off. The same as `-m'.

netrc = on/off
     Turn reading netrc on or off.

noclobber = on/off
     Same as `-nc'.

no_parent = on/off
     Disallow retrieving outside the directory hierarchy, like
     `--no-parent' (*Note Directory-Based Limits::).

no_proxy = STRING
     Use STRING as the comma-separated list of domains to avoid in
     proxy loading, instead of the one specified in the environment.

output_document = STRING
     Set the output filename, the same as `-O'.

passive_ftp = on/off
     Set passive FTP, the same as `--passive-ftp'.

passwd = STRING
     Set your FTP password to PASSWORD. Without this setting, the
     password defaults to `username@hostname.domainname'.

proxy_user = STRING
     Set proxy authentication user name to STRING, like `--proxy-user'.

proxy_passwd = STRING
     Set proxy authentication password to STRING, like `--proxy-passwd'.
quiet = on/off
     Quiet mode, the same as `-q'.

quota = QUOTA
     Specify the download quota, which is useful to put in the global
     `wgetrc'. When download quota is specified, Wget will stop
     retrieving after the download sum has become greater than quota.
     The quota can be specified in bytes (default), kbytes (`k'
     appended) or mbytes (`m' appended). Thus `quota = 5m' will set
     the quota to 5 mbytes. Note that the user's startup file overrides
     the system wgetrc.

reclevel = N
     Recursion level, the same as `-l'.

recursive = on/off
     Recursive on/off, the same as `-r'.

relative_only = on/off
     Follow only relative links, the same as `-L' (*Note Relative
     Links::).

remove_listing = on/off
     If set to on, remove FTP listings downloaded by Wget. Setting it
     to off is the same as `-nr'.

retr_symlinks = on/off
     When set to on, retrieve symbolic links as if they were plain
     files; the same as `--retr-symlinks'.

robots = on/off
     Use (or not) the `/robots.txt' file (*Note Robots::). Be sure to
     know what you are doing before changing the default (which is
     `on').

server_response = on/off
     Choose whether or not to print the HTTP and FTP server responses,
     the same as `-S'.

simple_host_check = on/off
     Same as `-nh' (*Note Host Checking::).

timeout = N
     Set timeout value, the same as `-T'.

timestamping = on/off
     Turn timestamping on/off. The same as `-N' (*Note Time-Stamping::).

tries = N
     Set number of retries per URL, the same as `-t'.

use_proxy = on/off
     Turn proxy support on/off. The same as `-Y'.

verbose = on/off
     Turn verbose on/off, the same as `-v'/`-nv'.

wait = N
     Wait N seconds between retrievals, the same as `-w'.

waitretry = N
     Wait up to N seconds between retries of failed retrievals
     only--the same as `--waitretry'. Note that this is turned on by
     default in the global `wgetrc'.
File: wget.info,  Node: Sample Wgetrc,  Prev: Wgetrc Commands,  Up: Startup File

Sample Wgetrc
-------------

This is the sample initialization file, as given in the distribution.
It is divided in two sections--one for global usage (suitable for the
global startup file), and one for local usage (suitable for
`$HOME/.wgetrc'). Be careful about the things you change.

Note that almost all the lines are commented out. For a command to
have any effect, you must remove the `#' character at the beginning of
its line.
### Sample Wget initialization file .wgetrc

## You can use this file to change the default behaviour of wget or to
## avoid having to type many many command-line options. This file does
## not contain a comprehensive list of commands -- look at the manual
## to find out what you can put into this file.

## Wget initialization file can reside in /usr/local/etc/wgetrc
## (global, for all users) or $HOME/.wgetrc (for a single user).

## To use the settings in this file, you will have to uncomment them,
## as well as change them, in most cases, as the values on the
## commented-out lines are the default values (e.g. "off").

## Global settings (useful for setting up in /usr/local/etc/wgetrc).
## Think well before you change them, since they may reduce wget's
## functionality, and make it behave contrary to the documentation:

# You can set retrieve quota for beginners by specifying a value
# optionally followed by 'K' (kilobytes) or 'M' (megabytes). The
# default quota is unlimited.
#quota = inf

# You can lower (or raise) the default number of retries when
# downloading a file (default is 20).
#tries = 20

# Lowering the maximum depth of the recursive retrieval is handy to
# prevent newbies from going too "deep" when they unwittingly start
# the recursive retrieval. The default is 5.
#reclevel = 5

# Many sites are behind firewalls that do not allow initiation of
# connections from the outside. On these sites you have to use the
# `passive' feature of FTP. If you are behind such a firewall, you
# can turn this on to make Wget use passive FTP by default.
#passive_ftp = off

# The "wait" command below makes Wget wait between every connection.
# If, instead, you want Wget to wait only between retries of failed
# downloads, set waitretry to maximum number of seconds to wait (Wget
# will use "linear backoff", waiting 1 second after the first failure
# on a file, 2 seconds after the second failure, etc. up to this max).
#waitretry = 10

## Local settings (for a user to set in his $HOME/.wgetrc). It is
## *highly* undesirable to put these settings in the global file, since
## they are potentially dangerous to "normal" users.

## Even when setting up your own ~/.wgetrc, you should know what you
## are doing before doing so.

# Set this to on to use timestamping by default:
#timestamping = off

# It is a good idea to make Wget send your email address in a `From:'
# header with your request (so that server administrators can contact
# you in case of errors). Wget does *not* send `From:' by default.
#header = From: Your Name <username@site.domain>

# You can set up other headers, like Accept-Language. Accept-Language
# is *not* sent by default.
#header = Accept-Language: en

# You can set the default proxy for Wget to use. It will override the
# value in the environment.
#http_proxy = http://proxy.yoyodyne.com:18023/

# If you do not want to use a proxy at all, set this to off.
#use_proxy = on

# You can customize the retrieval outlook. Valid options are default,
# binary, mega and micro.
#dot_style = default

# Setting this to off makes Wget not download /robots.txt. Be sure to
# know *exactly* what /robots.txt is and how it is used before changing
# the default!
#robots = on

# It can be useful to make Wget wait between connections. Set this to
# the number of seconds you want Wget to wait.
#wait = 0

# You can force creating directory structure, even if a single file is
# being retrieved, by setting this to on.
#dirstruct = off

# You can turn on recursive retrieving by default (don't do this if
# you are not sure you know what it means) by setting this to on.
#recursive = off

# To always back up file X as X.orig before converting its links (due
# to -k / --convert-links / convert_links = on having been specified),
# set this variable to on:
#backup_converted = off

# To have Wget follow FTP links from HTML files by default, set this
# to on:
#follow_ftp = off
File: wget.info,  Node: Examples,  Next: Various,  Prev: Startup File,  Up: Top

Examples
********

The examples are classified into three sections, for the sake of
clarity. The first section is a tutorial for beginners. The second
section explains some of the more complex program features. The third
section contains advice for mirror administrators, as well as even more
complex features (that some would call perverted).

* Menu:

* Simple Usage::        Simple, basic usage of the program.
* Advanced Usage::      Advanced techniques of usage.
* Guru Usage::          Mirroring and the hairy stuff.
File: wget.info,  Node: Simple Usage,  Next: Advanced Usage,  Prev: Examples,  Up: Examples

Simple Usage
============

* Say you want to download a URL. Just type:

     wget http://fly.cc.fer.hr/

  The response will be something like:

     --13:30:45--  http://fly.cc.fer.hr:80/en/
                => `index.html'
     Connecting to fly.cc.fer.hr:80... connected!
     HTTP request sent, awaiting response... 200 OK
     Length: 4,694 [text/html]

     13:30:46 (23.75 KB/s) - `index.html' saved [4694/4694]

* But what will happen if the connection is slow, and the file is
  lengthy? The connection will probably fail before the whole file
  is retrieved, more than once. In this case, Wget will try getting
  the file until it either gets the whole of it, or exceeds the
  default number of retries (this being 20). It is easy to change
  the number of tries to 45, to ensure that the whole file will
  arrive safely:

     wget --tries=45 http://fly.cc.fer.hr/jpg/flyweb.jpg

* Now let's leave Wget to work in the background, and write its
  progress to log file `log'. It is tiring to type `--tries', so we
  shall use `-t'.

     wget -t 45 -o log http://fly.cc.fer.hr/jpg/flyweb.jpg &

  The ampersand at the end of the line makes sure that Wget works in
  the background. To unlimit the number of retries, use `-t inf'.

* The usage of FTP is as simple. Wget will take care of login and
  password.

     $ wget ftp://gnjilux.cc.fer.hr/welcome.msg
     --10:08:47--  ftp://gnjilux.cc.fer.hr:21/welcome.msg
                => `welcome.msg'
     Connecting to gnjilux.cc.fer.hr:21... connected!
     Logging in as anonymous ... Logged in!
     ==> TYPE I ... done.  ==> CWD not needed.
     ==> PORT ... done.    ==> RETR welcome.msg ... done.
     Length: 1,340 (unauthoritative)

     10:08:48 (1.28 MB/s) - `welcome.msg' saved [1340]

* If you specify a directory, Wget will retrieve the directory
  listing, parse it and convert it to HTML. Try:

     wget ftp://prep.ai.mit.edu/pub/gnu/
File: wget.info,  Node: Advanced Usage,  Next: Guru Usage,  Prev: Simple Usage,  Up: Examples

Advanced Usage
==============

* You would like to read the list of URLs from a file? Not a problem
  with the `-i' option:

     wget -i file

  If you specify `-' as file name, the URLs will be read from
  standard input.

* Create a mirror image of GNU WWW site (with the same directory
  structure the original has) with only one try per document, saving
  the log of the activities to `gnulog':

     wget -r -t1 http://www.gnu.ai.mit.edu/ -o gnulog

* Retrieve the first layer of yahoo links:

     wget -r -l1 http://www.yahoo.com/

* Retrieve the index.html of `www.lycos.com', showing the original
  server headers:

     wget -S http://www.lycos.com/

* Save the server headers with the file:

     wget -s http://www.lycos.com/
     more index.html

* Retrieve the first two levels of `wuarchive.wustl.edu', saving them
  to `/tmp':

     wget -P/tmp -l2 ftp://wuarchive.wustl.edu/

* You want to download all the GIFs from an HTTP directory. `wget
  http://host/dir/*.gif' doesn't work, since HTTP retrieval does not
  support globbing. In that case, use:

     wget -r -l1 --no-parent -A.gif http://host/dir/

  It is a bit of a kludge, but it works. `-r -l1' means to retrieve
  recursively (*Note Recursive Retrieval::), with maximum depth of 1.
  `--no-parent' means that references to the parent directory are
  ignored (*Note Directory-Based Limits::), and `-A.gif' means to
  download only the GIF files. `-A "*.gif"' would have worked too.

* Suppose you were in the middle of downloading, when Wget was
  interrupted. Now you do not want to clobber the files already
  present. It would be:

     wget -nc -r http://www.gnu.ai.mit.edu/

* If you want to encode your own username and password to HTTP or
  FTP, use the appropriate URL syntax (*Note URL Format::).

     wget ftp://hniksic:mypassword@jagor.srce.hr/.emacs

* If you do not like the default retrieval visualization (1K dots
  with 10 dots per cluster and 50 dots per line), you can customize
  it through dot settings (*Note Wgetrc Commands::). For example,
  many people like the "binary" style of retrieval, with 8K dots and
  384K lines:

     wget --dot-style=binary ftp://prep.ai.mit.edu/pub/gnu/README

  You can experiment with other styles, like:

     wget --dot-style=mega ftp://ftp.xemacs.org/pub/xemacs/xemacs-20.4/xemacs-20.4.tar.gz
     wget --dot-style=micro http://fly.cc.fer.hr/

  To make these settings permanent, put them in your `.wgetrc', as
  described before (*Note Sample Wgetrc::).
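As a sketch, the `.wgetrc' equivalent of the `--dot-style=binary'
example above is the single line:

     dot_style = binary

after which a plain `wget URL' uses the binary style without further
options.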
File: wget.info,  Node: Guru Usage,  Prev: Advanced Usage,  Up: Examples

Guru Usage
==========

* If you wish Wget to keep a mirror of a page (or FTP
  subdirectories), use `--mirror' (`-m'), which is the shorthand for
  `-r -N'. You can put Wget in the crontab file asking it to recheck
  a site each Sunday:

     0 0 * * 0 wget --mirror ftp://ftp.xemacs.org/pub/xemacs/ -o /home/me/weeklog

* You may wish to do the same with someone's home page. But you do
  not want to download all those images--you're only interested in
  HTML:

     wget --mirror -A.html http://www.w3.org/

* But what about mirroring the hosts networkologically close to you?
  It seems so awfully slow because of all that DNS resolving. Just
  use `-D' (*Note Domain Acceptance::).

     wget -rN -Dsrce.hr http://www.srce.hr/

  Now Wget will correctly find out that `regoc.srce.hr' is the same
  as `www.srce.hr', but will not even take into consideration the
  link to `www.mit.edu'.

* You have a presentation and would like the dumb absolute links to
  be converted to relative? Use `-k':

     wget -k -r URL

* You would like the output documents to go to standard output
  instead of to files? OK, but Wget will automatically shut up (turn
  on `--quiet') to prevent mixing of Wget output and the retrieved
  documents.

     wget -O - http://jagor.srce.hr/ http://www.srce.hr/

  You can also combine the two options and make weird pipelines to
  retrieve the documents from remote hotlists:

     wget -O - http://cool.list.com/ | wget --force-html -i -
File: wget.info,  Node: Various,  Next: Appendices,  Prev: Examples,  Up: Top

Various
*******

This chapter contains all the stuff that could not fit anywhere else.

* Menu:

* Proxies::             Support for proxy servers.
* Distribution::        Getting the latest version.
* Mailing List::        Wget mailing list for announcements and discussion.
* Reporting Bugs::      How and where to report bugs.
* Portability::         The systems Wget works on.
* Signals::             Signal-handling performed by Wget.
File: wget.info,  Node: Proxies,  Next: Distribution,  Prev: Various,  Up: Various

Proxies
=======

"Proxies" are special-purpose HTTP servers designed to transfer data
from remote servers to local clients. One typical use of proxies is
lightening network load for users behind a slow connection. This is
achieved by channeling all HTTP and FTP requests through the proxy,
which caches the transferred data. When a cached resource is requested
again, the proxy will return the data from its cache. Another use for
proxies is for companies that separate (for security reasons) their
internal networks from the rest of the Internet. In order to obtain
information from the Web, their users connect and retrieve remote data
using an authorized proxy.

Wget supports proxies for both HTTP and FTP retrievals. The
standard way to specify proxy location, which Wget recognizes, is using
the following environment variables:

`http_proxy'
     This variable should contain the URL of the proxy for HTTP
     connections.

`ftp_proxy'
     This variable should contain the URL of the proxy for FTP
     connections. It is quite common that HTTP_PROXY and FTP_PROXY
     are set to the same URL.

`no_proxy'
     This variable should contain a comma-separated list of domain
     extensions the proxy should *not* be used for. For instance, if
     the value of `no_proxy' is `.mit.edu', the proxy will not be used
     to retrieve documents from MIT.
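In a Bourne-compatible shell, these variables might be set like this
(the proxy URL reuses the illustrative one from the sample `.wgetrc';
the `no_proxy' value is the one from the paragraph above):

     export http_proxy=http://proxy.yoyodyne.com:18023/
     export ftp_proxy=$http_proxy
     export no_proxy=.mit.edu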
In addition to the environment variables, the proxy location and
settings may be specified from within Wget itself.

`-Y on/off'
`--proxy=on/off'
     This option may be used to turn the proxy support on or off. Proxy
     support is on by default, provided that the appropriate environment
     variables are set.

`http_proxy = URL'
`ftp_proxy = URL'
`no_proxy = STRING'
     These startup file variables allow you to override the proxy
     settings specified by the environment.

Some proxy servers require authorization to enable you to use them.
The authorization consists of "username" and "password", which must be
sent by Wget. As with HTTP authorization, several authentication
schemes exist. For proxy authorization only the `Basic' authentication
scheme is currently implemented.
You may specify your username and password either through the proxy
URL or through the command-line options. Assuming that the company's
proxy is located at `proxy.company.com' at port 8001, a proxy URL
containing authorization data might look like this:

     http://hniksic:mypassword@proxy.company.com:8001/

Alternatively, you may use the `--proxy-user' and `--proxy-passwd'
options, and the equivalent `.wgetrc' settings `proxy_user' and
`proxy_passwd' to set the proxy username and password.
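For example, the same credentials could instead be supplied on the
command line (user name, password and target URL are illustrative,
matching the proxy URL above):

     wget --proxy-user=hniksic --proxy-passwd=mypassword http://www.srce.hr/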
File: wget.info,  Node: Distribution,  Next: Mailing List,  Prev: Proxies,  Up: Various

Distribution
============

Like all GNU utilities, the latest version of Wget can be found at
the master GNU archive site prep.ai.mit.edu, and its mirrors. For
example, Wget 1.5.3+dev can be found at
`ftp://prep.ai.mit.edu/gnu/wget/wget-1.5.3+dev.tar.gz'
File: wget.info,  Node: Mailing List,  Next: Reporting Bugs,  Prev: Distribution,  Up: Various

Mailing List
============

Wget has its own mailing list at <wget@sunsite.auc.dk>, thanks to
Karsten Thygesen. The mailing list is for discussion of Wget features
and the web, reporting Wget bugs (those that you think may be of
interest to the public) and mailing announcements. You are welcome to
subscribe. The more people on the list, the better!

To subscribe, send mail to <wget-subscribe@sunsite.auc.dk>, with the
magic word `subscribe' in the subject line. Unsubscribe by mailing to
<wget-unsubscribe@sunsite.auc.dk>.

The mailing list is archived at `http://fly.cc.fer.hr/archive/wget'.
File: wget.info,  Node: Reporting Bugs,  Next: Portability,  Prev: Mailing List,  Up: Various

Reporting Bugs
==============

You are welcome to send bug reports about GNU Wget to
<bug-wget@gnu.org>. The bugs that you think are of interest to the
public (i.e. more people should be informed about them) can be Cc-ed to
the mailing list at <wget@sunsite.auc.dk>.

Before actually submitting a bug report, please try to follow a few
simple guidelines.

1. Please try to ascertain that the behaviour you see really is a
   bug. If Wget crashes, it's a bug. If Wget does not behave as
   documented, it's a bug. If things work strangely, but you are not
   sure about the way they are supposed to work, it might well be a
   bug.

2. Try to repeat the bug in as simple circumstances as possible.
   E.g. if Wget crashes on `wget -rLl0 -t5 -Y0 http://yoyodyne.com -o
   /tmp/log', you should try to see if it will crash with a simpler
   set of options.

   Also, while I will probably be interested to know the contents of
   your `.wgetrc' file, just dumping it into the debug message is
   probably a bad idea. Instead, you should first try to see if the
   bug repeats with `.wgetrc' moved out of the way. Only if it turns
   out that `.wgetrc' settings affect the bug, should you mail me the
   relevant parts of the file.

3. Please start Wget with the `-d' option and send the log (or the
   relevant parts of it). If Wget was compiled without debug support,
   recompile it. It is *much* easier to trace bugs with debug support
   on.

4. If Wget has crashed, try to run it in a debugger, e.g. `gdb `which
   wget` core' and type `where' to get the backtrace.

5. Find where the bug is, fix it and send me the patches. :-)
File: wget.info,  Node: Portability,  Next: Signals,  Prev: Reporting Bugs,  Up: Various

Portability
===========

Since Wget uses GNU Autoconf for building and configuring, and avoids
using "special" ultra-mega-cool features of any particular Unix, it
should compile (and work) on all common Unix flavors.

Various Wget versions have been compiled and tested under many kinds
of Unix systems, including Solaris, Linux, SunOS, OSF (aka Digital
Unix), Ultrix, *BSD, IRIX, and others; refer to the file `MACHINES' in
the distribution directory for a comprehensive list. If you compile it
on an architecture not listed there, please let me know so I can update
it.

Wget should also compile on the other Unix systems, not listed in
`MACHINES'. If it doesn't, please let me know.

Thanks to kind contributors, this version of Wget compiles and works
on Microsoft Windows 95 and Windows NT platforms. It has been compiled
successfully using MS Visual C++ 4.0, Watcom, and Borland C compilers,
with Winsock as networking software. Naturally, it lacks some of the
features available on Unix, but it should work as a substitute for
people stuck with Windows. Note that the Windows port is *neither
tested nor maintained* by me--all questions and problems should be
reported to the Wget mailing list at <wget@sunsite.auc.dk> where the
maintainers will look at them.
File: wget.info,  Node: Signals,  Prev: Portability,  Up: Various

Signals
=======

Since the purpose of Wget is background work, it catches the hangup
signal (`SIGHUP') and ignores it. If the output was on standard
output, it will be redirected to a file named `wget-log'. Otherwise,
`SIGHUP' is ignored. This is convenient when you wish to redirect the
output of Wget after having started it.

     $ wget http://www.ifi.uio.no/~larsi/gnus.tar.gz &
     $ kill -HUP %%   # Redirect the output to wget-log

Other than that, Wget will not try to interfere with signals in any
way. `C-c', `kill -TERM' and `kill -KILL' should kill it alike.
File: wget.info,  Node: Appendices,  Next: Copying,  Prev: Various,  Up: Top

Appendices
**********

This chapter contains some references I consider useful, like the
Robots Exclusion Standard specification, as well as a list of
contributors to GNU Wget.

* Menu:

* Robots::                    Wget as a WWW robot.
* Security Considerations::   Security with Wget.
* Contributors::              People who helped.
File: wget.info,  Node: Robots,  Next: Security Considerations,  Prev: Appendices,  Up: Appendices

Robots
======

Since Wget is able to traverse the web, it counts as one of the Web
"robots". Thus Wget understands the "Robots Exclusion Standard"
(RES)--contents of `/robots.txt', used by server administrators to
shield parts of their systems from wanderings of Wget.

Norobots support is turned on only when retrieving recursively, and
*never* for the first page. Thus, you may issue:

     wget -r http://fly.cc.fer.hr/

First the index of fly.cc.fer.hr will be downloaded. If Wget finds
anything worth downloading on the same host, only *then* will it load
the robots, and decide whether or not to load the links after all.
`/robots.txt' is loaded only once per host. Wget does not support the
robots META tag.

The description of the norobots standard was written, and is
maintained by Martijn Koster <m.koster@webcrawler.com>. With his
permission, I contribute a (slightly modified) TeXified version of the
RES.

* Menu:

* Introduction to RES::
* RES Format::
* User-Agent Field::
* Disallow Field::
* Norobots Examples::
File: wget.info,  Node: Introduction to RES,  Next: RES Format,  Prev: Robots,  Up: Robots

Introduction to RES
-------------------

"WWW Robots" (also called "wanderers" or "spiders") are programs
that traverse many pages in the World Wide Web by recursively
retrieving linked pages. For more information see the robots page.

In 1993 and 1994 there have been occasions where robots have visited
WWW servers where they weren't welcome for various reasons. Sometimes
these reasons were robot specific, e.g. certain robots swamped servers
with rapid-fire requests, or retrieved the same files repeatedly. In
other situations robots traversed parts of WWW servers that weren't
suitable, e.g. very deep virtual trees, duplicated information,
temporary information, or cgi-scripts with side-effects (such as
voting).

These incidents indicated the need for established mechanisms for
WWW servers to indicate to robots which parts of their server should
not be accessed. This standard addresses this need with an operational
solution.

This document represents a consensus on 30 June 1994 on the robots
mailing list (`robots@webcrawler.com'), between the majority of robot
authors and other people with an interest in robots. It has also been
open for discussion on the Technical World Wide Web mailing list
(`www-talk@info.cern.ch'). This document is based on a previous working
draft under the same title.

It is not an official standard backed by a standards body, or owned
by any commercial organization. It is not enforced by anybody, and
there is no guarantee that all current and future robots will use it.
Consider it a common facility the majority of robot authors offer the
WWW community to protect WWW servers against unwanted accesses by
their robots.

The latest version of this document can be found at
`http://info.webcrawler.com/mak/projects/robots/norobots.html'.
File: wget.info,  Node: RES Format,  Next: User-Agent Field,  Prev: Introduction to RES,  Up: Robots

RES Format
----------

The format and semantics of the `/robots.txt' file are as follows:

The file consists of one or more records separated by one or more
blank lines (terminated by `CR', `CR/NL', or `NL'). Each record
contains lines of the form:

     <field>:<optionalspace><value><optionalspace>

The field name is case insensitive.

Comments can be included in the file using UNIX Bourne shell
conventions: the `#' character is used to indicate that preceding space
(if any) and the remainder of the line up to the line termination is
discarded. Lines containing only a comment are discarded completely,
and therefore do not indicate a record boundary.

The record starts with one or more User-agent lines, followed by one
or more Disallow lines, as detailed below. Unrecognized headers are
ignored.

The presence of an empty `/robots.txt' file has no explicit
associated semantics; it will be treated as if it was not present,
i.e. all robots will consider themselves welcome.
File: wget.info,  Node: User-Agent Field,  Next: Disallow Field,  Prev: RES Format,  Up: Robots

User-Agent Field
----------------

The value of this field is the name of the robot the record is
describing access policy for.

If more than one User-agent field is present the record describes an
identical access policy for more than one robot. At least one field
needs to be present per record.

The robot should be liberal in interpreting this field. A
case-insensitive substring match of the name without version
information is recommended.

If the value is `*', the record describes the default access policy
for any robot that has not matched any of the other records. It is not
allowed to have multiple such records in the `/robots.txt' file.
File: wget.info,  Node: Disallow Field,  Next: Norobots Examples,  Prev: User-Agent Field,  Up: Robots

Disallow Field
--------------

The value of this field specifies a partial URL that is not to be
visited. This can be a full path, or a partial path; any URL that
starts with this value will not be retrieved. For example,
`Disallow: /help' disallows both `/help.html' and `/help/index.html',
whereas `Disallow: /help/' would disallow `/help/index.html' but allow
`/help.html'.

Any empty value indicates that all URLs can be retrieved. At least
one Disallow field needs to be present in a record.
File: wget.info,  Node: Norobots Examples,  Prev: Disallow Field,  Up: Robots

Norobots Examples
-----------------

The following example `/robots.txt' file specifies that no robots
should visit any URL starting with `/cyberworld/map/' or `/tmp/':

     # robots.txt for http://www.site.com/

     User-agent: *
     Disallow: /cyberworld/map/ # This is an infinite virtual URL space
     Disallow: /tmp/ # these will soon disappear

This example `/robots.txt' file specifies that no robots should
visit any URL starting with `/cyberworld/map/', except the robot called
`cybermapper':

     # robots.txt for http://www.site.com/

     User-agent: *
     Disallow: /cyberworld/map/ # This is an infinite virtual URL space

     # Cybermapper knows where to go.
     User-agent: cybermapper
     Disallow:

This example indicates that no robots should visit this site further:

     # go away
     User-agent: *
     Disallow: /
File: wget.info,  Node: Security Considerations,  Next: Contributors,  Prev: Robots,  Up: Appendices

Security Considerations
=======================

When using Wget, you must be aware that it sends unencrypted
passwords through the network, which may present a security problem.
Here are the main issues, and some solutions.

1. The passwords on the command line are visible using `ps'. If this
   is a problem, avoid putting passwords on the command line--e.g.
   you can use `.netrc' for this.

2. Using the insecure "basic" authentication scheme, unencrypted
   passwords are transmitted through the network routers and gateways.

3. The FTP passwords are also in no way encrypted. There is no good
   solution for this at the moment.

4. Although the "normal" output of Wget tries to hide the passwords,
   debugging logs show them, in all forms. This problem is avoided by
   being careful when you send debug logs (yes, even when you send
   them to me).
File: wget.info,  Node: Contributors,  Prev: Security Considerations,  Up: Appendices

Contributors
============

GNU Wget was written by Hrvoje Niksic <hniksic@iskon.hr>. However,
its development could never have gone as far as it has, were it not for
the help of many people, either with bug reports, feature proposals,
patches, or letters saying "Thanks!".

Special thanks goes to the following people (no particular order):

* Karsten Thygesen--donated the mailing list and the initial FTP
  space.

* Shawn McHorse--bug reports and patches.

* Kaveh R. Ghazi--on-the-fly `ansi2knr'-ization.

* Gordon Matzigkeit--`.netrc' support.

* Zlatko Calusic, Tomislav Vujec and Drazen Kacar--feature
  suggestions and "philosophical" discussions.

* Darko Budor--initial port to Windows.

* Antonio Rosella--help and suggestions, plus the Italian
  translation.

* Tomislav Petrovic, Mario Mikocevic--many bug reports and
  suggestions.

* Francois Pinard--many thorough bug reports and discussions.

* Karl Eichwalder--lots of help with internationalization and other
  things.

* Junio Hamano--donated support for Opie and HTTP `Digest'
  authentication.

* Brian Gough--a generous donation.

The following people have provided patches, bug/build reports, useful
suggestions, beta testing services, fan mail and all the other things
that make maintenance so much fun:

Tim Adam, Martin Baehr, Dieter Baron, Roger Beeman and the Gurus at
Cisco, Dan Berger, Mark Boyns, John Burden, Wanderlei Cavassin, Gilles
Cedoc, Tim Charron, Noel Cragg, Kristijan Conkas, Andrew Deryabin,
Damir Dzeko, Andrew Davison, Ulrich Drepper, Marc Duponcheel,
Aleksandar Erkalovic, Andy Eskilsson, Masashi Fujita, Howard Gayle,
Marcel Gerrits, Hans Grobler, Mathieu Guillaume, Dan Harkless, Heiko
Herold, Karl Heuer, HIROSE Masaaki, Gregor Hoffleit, Erik Magnus
Hulthen, Richard Huveneers, Simon Josefsson, Mario Juric, Goran
Kezunovic, Robert Kleine, Fila Kolodny, Alexander Kourakos, Martin
Kraemer, Simos KSenitellis, Hrvoje Lacko, Daniel S. Lewart, Dave Love,
Jordan Mendelson, Lin Zhe Min, Charlie Negyesi, Andrew Pollock, Steve
Pothier, Jan Prikryl, Marin Purgar, Keith Refson, Tobias Ringstrom,
Juan Jose Rodrigues, Edward J. Sabol, Heinz Salzmann, Robert Schmidt,
Toomas Soome, Tage Stabell-Kulo, Sven Sternberger, Markus Strasser,
Szakacsits Szabolcs, Mike Thomas, Russell Vincent, Charles G Waldman,
Douglas E. Wegscheid, Jasmin Zainul, Bojan Zdrnja, Kristijan Zimmer.

Apologies to all whom I accidentally left out, and many thanks to all
the subscribers of the Wget mailing list.