Hey Emacs, this is -*- outline -*- mode
This is the to-do list for Wget. There is no timetable of when we plan to
implement these features -- this is just a list of things it'd be nice to see
in Wget. Patches to implement any of these items would be gladly accepted. The
items are not listed in any particular order (except that recently-added items
may tend towards the top). Not all of these represent user-visible changes.
* Only normal link-following recursion should respect -np. Page-requisite
recursion should not. When -np -p is specified, Wget should still retrieve
requisite images and such on the server, even if they aren't in that directory
or a subdirectory of it. Likewise, -H -np -p should retrieve requisite files
from other hosts.
* Add a --range parameter allowing you to explicitly specify a range of bytes to
get from a file over HTTP (FTP only supports ranges ending at the end of the
file, though forcibly disconnecting from the server at the desired endpoint
would achieve the same effect).
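A minimal sketch of the HTTP mechanics such an option would rely on -- a Range
request header and the Content-Range field of the 206 reply. This is
illustrative Python, not Wget code; the function names are invented here.

```python
def range_header(first_byte, last_byte):
    """Build the Range header value for an inclusive byte range."""
    return "bytes=%d-%d" % (first_byte, last_byte)

def parse_content_range(value):
    """Parse a 'bytes FIRST-LAST/TOTAL' Content-Range reply value."""
    span, total = value.split(" ", 1)[1].split("/")
    first, last = span.split("-")
    return int(first), int(last), int(total)
```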
* RFC 1738 says that if logging on to an FTP server puts you in a directory
other than '/', the way to specify a file relative to '/' in a URL (let's use
"/bin/ls" in this example) is "ftp://host/%2Fbin/ls". Wget needs to support
this (and ideally not consider "ftp://host//bin/ls" to be equivalent, as that
would equate to the command "CWD " rather than "CWD /"). To accommodate people
used to broken FTP clients like Internet Explorer and Netscape, if
"ftp://host/bin/ls" doesn't exist, Wget should try again (perhaps under
control of an option), acting as if the user had typed "ftp://host/%2Fbin/ls".
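The distinction comes from decoding each path segment only after splitting on
'/', so that %2F yields a literal slash inside a single segment. A sketch in
Python (the function name is illustrative, not a Wget internal):

```python
from urllib.parse import unquote

def ftp_cwd_sequence(url_path):
    """Split an FTP URL path (the part after "ftp://host/") into the CWD
    arguments a client would send plus the final filename, decoding each
    segment only after splitting on '/'."""
    *dirs, filename = url_path.split("/")
    return [unquote(d) for d in dirs], unquote(filename)
```

With this scheme, "%2Fbin/ls" yields the single directory "/bin" (i.e.
"CWD /bin"), while "/bin/ls" -- the path of "ftp://host//bin/ls" -- yields an
empty first segment, the bare "CWD " the item above describes.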
* If multiple FTP URLs are specified that are on the same host, Wget should
re-use the connection rather than opening a new one for each file.

* Try to devise a scheme so that, when the password is unknown, Wget asks
the user for one.

* Limit the number of successive redirections to a maximum of 20 or so.

* If -c is used on a file that's already completely downloaded, don't
re-download it (unless normal --timestamping processing would cause you to do
so).

* If -c is used with -N, check to make sure a file hasn't changed on the server
before "continuing" to download it (preventing a bogus hybrid file).

* Take a look at
<http://info.webcrawler.com/mak/projects/robots/norobots-rfc.html>
and support the new directives.
* Generalize --html-extension to something like --mime-extensions and have it
look at mime.types/mimecap file for preferred extension. Non-HTML files with
filenames changed this way would be re-downloaded each time despite -N unless
.orig files were saved for them. Since .orig would contain the same data as
non-.orig, the latter could be just a link to the former. Another possibility
would be to implement a per-directory database called something like
.wget_url_mapping containing URLs and their corresponding filenames.

* When spanning hosts, there's no way to say that you are only interested in
files in a certain directory on _one_ of the hosts (-I and -X apply to all).
Perhaps -I and -X should take an optional hostname before the directory?
* Add an option to not encode special characters like ' ' and '~' when saving
local files. Would be good to have a mode that encodes all special characters
(as now), one that encodes none (as above), and one that only encodes a
character if it was encoded in the original URL (e.g. %20 but not %7E).
* --retr-symlinks should cause Wget to traverse links to directories too.

* Make Wget return a non-zero exit status in more situations, such as
incorrect HTTP authentication.
* Make -K compare X.orig to X and move the former on top of the latter if
they're the same, rather than leaving identical .orig files lying around.
* If CGI output is saved to a file, e.g. cow.cgi?param, -k needs to change the
'?' to a "%3F" in links to that file to avoid passing part of the filename as
a parameter.
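The escaping step could look like the following sketch (illustrative Python;
the function name is invented, and '%' must be escaped first so existing
escapes aren't doubled):

```python
def quote_local_name(filename):
    """Escape characters in a saved file's name that a browser would
    misparse in a relative link; '%' goes first to avoid double-escaping."""
    return filename.replace("%", "%25").replace("?", "%3F")
```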
* Make `-k' convert <base href=...> too.

* Make `-k' check for files that were downloaded in the past and convert links
to them in newly-downloaded documents.

* -k currently converts non-fully-qualified (i.e. relative or "hostless
absolute") links to files that weren't downloaded into fully-qualified URLs.
When files _are_ downloaded, though, absolute links to them (hostless or
with-host) should be converted to relative.
* Add an option to clobber existing file names (no `.N' suffixes).

* Introduce a concept of "boolean" options. For instance, every
boolean option `--foo' would have a `--no-foo' equivalent for
turning it off. Get rid of `--foo=no' stuff. Short options would
be handled as `-x' vs. `-nx'.
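The long-option half of this scheme can be sketched in a few lines
(illustrative Python, not the actual getopt-based parser):

```python
def parse_boolean_option(arg, flags):
    """Interpret --foo as foo=True and --no-foo as foo=False.
    `flags` is a dict of the known boolean option names."""
    name = arg[2:]                       # strip the leading "--"
    value = True
    if name.startswith("no-"):
        name, value = name[3:], False
    if name not in flags:
        raise ValueError("unknown option: " + arg)
    flags[name] = value
```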
* Implement a "thermometer" display (not all that hard; use an
alternative show_progress() if the output goes to a terminal).
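The rendering itself is simple; a sketch in Python (illustrative only, and
assuming total > 0 -- the real show_progress() would also have to handle
unknown lengths):

```python
def thermometer(received, total, width=40):
    """Render a fixed-width progress bar for `received` of `total` bytes."""
    filled = width * received // total
    percent = 100 * received // total
    return "[%s%s] %3d%%" % ("=" * filled, " " * (width - filled), percent)
```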
* Add an option to only list wildcard matches without doing the download.

* Add case-insensitivity as an option.

* Handle MIME types correctly. There should be an option to (not)
retrieve files based on MIME types, e.g. `--accept-types=image/*'.
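The matching could be shell-style wildcards against the Content-Type, with its
parameters stripped first. A sketch (illustrative Python; the function name is
invented):

```python
from fnmatch import fnmatch

def type_accepted(content_type, patterns):
    """Match a Content-Type header value against --accept-types style
    wildcard patterns, ignoring parameters such as '; charset=...'."""
    mime = content_type.split(";")[0].strip().lower()
    return any(fnmatch(mime, pat) for pat in patterns)
```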
* Implement "persistent" retrieving. In "persistent" mode Wget should
treat most errors as transient.

* Allow time-stamping by an arbitrary date.

* Fix the Unix directory parser to allow for spaces in file names.
* Allow a size limit on files (perhaps with an option to download oversize
files up through the limit or not at all, to get more functionality than
[u]limit).
* Implement breadth-first retrieval.

* Download to .in* when mirroring.

* Add an option to delete or move no-longer-existent files when mirroring.

* Implement a switch to avoid downloading multiple files (e.g. x and x.gz).

* Implement uploading (--upload URL?) in FTP and HTTP.

* Rewrite the FTP code to allow for easy addition of new commands. It
should probably be coded as a simple DFA engine.

* Make HTTP timestamping use the If-Modified-Since facility.
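Building the header from a local file's mtime is straightforward; a sketch in
Python (illustrative only -- Wget would format the date in C):

```python
from email.utils import formatdate

def if_modified_since(mtime):
    """Build an If-Modified-Since header from a local file's mtime
    (seconds since the epoch), in the RFC 1123 date format HTTP expects."""
    return "If-Modified-Since: " + formatdate(mtime, usegmt=True)
```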
* Implement better spider options.
* Add more protocols (e.g. gopher and news), implementing them in a
modular fashion.
* Implement a concept of "packages" a la mirror.

* Implement correct RFC 1808 URL parsing.

* Implement HTTP cookies.

* Implement more HTTP/1.1 bells and whistles (ETag, Content-MD5, etc.)
* Add a "rollback" option to have --continue throw away a configurable number
of bytes at the end of a file before resuming the download. Apparently, some
stupid proxies insert a "transfer interrupted" string we need to get rid of.
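The resulting restart offset is just arithmetic, clamped so a short partial
file doesn't go negative (illustrative Python; the name is invented):

```python
def resume_offset(local_size, rollback):
    """Offset at which to restart a --continue download after discarding
    the last `rollback` bytes of the partial local file."""
    return max(0, local_size - rollback)
```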
* When using --accept and --reject, you can end up with empty directories. Have
Wget delete any such directories at the end.