- Hey Emacs, this is -*- outline -*- mode
+ -*- outline -*-
-This is the to-do list for Wget. There is no timetable of when we
+This is the to-do list for GNU Wget. There is no timetable of when we
plan to implement these features -- this is just a list of features
we'd like to see in Wget, as well as a list of problems that need
-fixing. Patches to implement these items are likely to be accepted.
+fixing. Patches to implement these items are likely to be accepted,
+especially if they follow the coding convention outlined in PATCHES
+and if they patch the documentation as well.
+
The items are not listed in any particular order (except that
recently-added items may tend towards the top). Not all of these
represent user-visible changes.
-* Currently Wget mirrors remote FTP permissions whenever it retrieves
- the directory listing. This is undesirable for most users, as
- permissions like "664" are frequently used on the servers, which
- might not be what the user wants. Wget should be changed not to
- mirror remote FTP permissions by default. There should be a new
- option add an option that enables this back on.
-
-* Honor `Content-Disposition: XXX; filename="FILE"' when creating the
- file name. If possible, try not to break `-nc' and friends when
- doing that.
-
-* Should allow retries with multiple downloads when using -O on
- regular files. As the source comment says: "A possible solution to
- [rewind not working with multiple downloads] would be to remember
- the file position in the output document and to seek to that
- position, instead of rewinding."
-
- But the above won't work for -O/dev/stdout, when stdout is a pipe.
- An even better solution would be to simply keep writing to the same
- file descriptor each time, instead of reopening it in append mode.
+* Change the file name generation logic so that redirects can't dictate
+ file names (but redirects should still be followed). By default, file
+ names should be generated only from the URL the user provided. However,
+ with an appropriate flag, Wget will allow the remote server to specify
+ the file name, either through redirection (as is always the case now)
+ or via the increasingly popular header `Content-Disposition: XXX;
+ filename="FILE"'.
+
+ The file name should be generated and displayed *after* processing
+ the server's response, not before, as is done now. This will allow
+ a trivial implementation of -nc and of O_EXCL when opening the
+ file; --html-extension will stop being a horrible hack; and so on.
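The Content-Disposition handling above could look roughly like the following sketch (Python for brevity; Wget itself is C, and the helper name is hypothetical). The important detail is stripping path components so a hostile server cannot steer the file outside the download directory:

```python
import posixpath
import re

def filename_from_content_disposition(value):
    """Extract FILE from a header value like 'attachment; filename="FILE"'.

    Returns None when no usable name is present.  A real implementation
    must strip any path components so that a hostile server cannot make
    the client write outside the download directory.
    """
    m = re.search(r'filename\s*=\s*"([^"]*)"', value)
    if not m:
        m = re.search(r'filename\s*=\s*([^;\s]+)', value)
    if not m:
        return None
    # Keep only the final path component; reject empty results.
    name = posixpath.basename(m.group(1).replace("\\", "/"))
    return name or None
```

Note that the quoted and unquoted forms of the parameter both occur in practice, so both are handled.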
+
+* -O should be respected, with no exceptions. It should work in
+ conjunction with -N and -k. (This is hard to achieve in the current
+ code base.) Ancillary files, such as directory listings and such,
+ should be downloaded either directly to memory, or to /tmp.
+
+* Implement digest and NTLM authorization for proxies. This is harder
+ than it seems because it requires some rethinking of the HTTP code.
+
+* Rethink the interaction between recur.c (the recursive download code)
+ and HTTP/FTP code. Ideally, the downloading code should have a way
+ to retrieve a file and, optionally, to specify a list of URLs for
+ continuing the "recursive" download. FTP code will surely benefit
+ from such a restructuring because its current incarnation is way too
+ smart for its own good.
+
+* Both HTTP and FTP connections should be first-class objects that can
+ be reused after a download is done. Currently information about both
+ is kept implicitly on the stack, and forgotten after each download.
+
+* Restructure the FTP code to remove massive amounts of code duplication
+ and repetition. Remove all the "intelligence" and make it work as
+ outlined in the previous bullet.
+
+* Add support for SFTP. Teach Wget about newer features of more
+ recent FTP servers in general, such as receiving reliable checksums
+ and timestamps. This can be used to implement really robust
+ downloads.
* Wget shouldn't delete rejected files that were not downloaded, but
  just found on disk because of `-nc'. For example, `wget -r -nc
  -A.gif URL' should allow `-A' to decide which files get downloaded,
  without removing any of the existing HTML files.
* Be careful not to lose username/password information given for the
- URL on the command line.
+ URL on the command line. For example,
+ wget -r http://username:password@server/path/ should send that
+ username and password to all content under /path/ (this is apparently
+ what browsers do).
+
+* Don't send credentials using "Basic" authorization before the server
+ has a chance to tell us that it supports Digest or NTLM!
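A sketch of the intended challenge handling (Python for brevity; the helper name is hypothetical): Basic is selected only after inspecting the server's WWW-Authenticate challenges, and only when nothing stronger is offered.

```python
def pick_auth_scheme(www_authenticate_challenges):
    """Given the scheme names offered in WWW-Authenticate challenges,
    pick the strongest supported one.  Basic is chosen only when
    nothing better is offered, so credentials are never volunteered
    in the clear while Digest or NTLM is available."""
    preference = ["ntlm", "digest", "basic"]
    offered = {c.split()[0].lower() for c in www_authenticate_challenges if c.split()}
    for scheme in preference:
        if scheme in offered:
            return scheme
    return None
```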
* Add a --range parameter allowing you to explicitly specify a range
of bytes to get from a file over HTTP (FTP only supports ranges
ending at the end of the file, though forcibly disconnecting from
- the server at the desired endpoint might be workable).
+ the server at the desired endpoint would work). For example,
+ --range=n-m would specify inclusive range (a la the Range header),
+ and --range=n:m would specify exclusive range (a la Python's
+ slices). -c should work with --range by assuming the range is
+ partially downloaded on disk, and continuing from there (effectively
+ requesting a smaller range).
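The two proposed --range spellings, and their interaction with -c, can be sketched like this (Python for brevity; the option and helper are hypothetical):

```python
def http_range(spec, already_have=0):
    """Translate a hypothetical --range=SPEC into an HTTP Range header
    value.  'n-m' is inclusive (the Range header's own convention);
    'n:m' is exclusive (Python slice convention, so the last byte sent
    is m-1).  already_have models -c: that many bytes of the range are
    already on disk, so the request starts that much later.  Returns
    None when nothing remains to fetch."""
    if "-" in spec:
        first, last = map(int, spec.split("-"))
    else:
        first, stop = map(int, spec.split(":"))
        last = stop - 1
    first += already_have
    if first > last:
        return None
    return "bytes=%d-%d" % (first, last)
```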
* If multiple FTP URLs are specified that are on the same host, Wget should
re-use the connection rather than opening a new one for each file.
+ This should be easy after the restructuring of the FTP code proposed
+ above, which would make the FTP connection a first-class object.
* Try to devise a scheme so that, when password is unknown, Wget asks
- the user for one.
+ the user for one. This is harder than it seems because the password
+ may be requested by some page encountered long after the user has
+ left Wget to run.
* If -c used with -N, check to make sure a file hasn't changed on the server
before "continuing" to download it (preventing a bogus hybrid file).
* Generalize --html-extension to something like --mime-extensions and
- have it look at mime.types/mimecap file for preferred extension.
- Non-HTML files with filenames changed this way would be
- re-downloaded each time despite -N unless .orig files were saved for
- them. Since .orig would contain the same data as non-.orig, the
- latter could be just a link to the former. Another possibility
- would be to implement a per-directory database called something like
+ have it consult mime.types for the preferred extension. Non-HTML files
+ with filenames changed this way would be re-downloaded each time
+ despite -N unless .orig files were saved for them. (#### Why? The
+ HEAD request we use to implement -N would still be able to construct
+ the correct file name based on the declared Content-Type.)
+
+ Since .orig would contain the same data as non-.orig, the latter
+ could be just a link to the former. Another possibility would be to
+ implement a per-directory database called something like
.wget_url_mapping containing URLs and their corresponding filenames.
-* When spanning hosts, there's no way to say that you are only interested in
- files in a certain directory on _one_ of the hosts (-I and -X apply to all).
- Perhaps -I and -X should take an optional hostname before the directory?
-
-* Add an option to not encode special characters like ' ' and '~' when saving
- local files. Would be good to have a mode that encodes all special characters
- (as now), one that encodes none (as above), and one that only encodes a
- character if it was encoded in the original URL (e.g. %20 but not %7E).
+* When spanning hosts, there's no way to say that you are only
+ interested in files in a certain directory on _one_ of the hosts (-I
+ and -X apply to all). Perhaps -I and -X should take an optional
+ "hostname:" before the directory?
* --retr-symlinks should cause wget to traverse links to directories too.
* Make wget return non-zero status in more situations, like incorrect HTTP auth.
+ Create and document different exit statuses for different errors.
* Make -K compare X.orig to X and move the former on top of the latter if
they're the same, rather than leaving identical .orig files laying around.
* Make `-k' check for files that were downloaded in the past and convert links
to them in newly-downloaded documents.
-* Add option to clobber existing file names (no `.N' suffixes).
+* Devise a way for options to have effect on a per-URL basis. This is very
+ natural for some options, such as --post-data. It could be implemented
+ simply by having more than one struct options.
-* Introduce real "boolean" options. Every `--foo' setting should have
- a corresponding `--no-foo' that turns off. This is useful even for
- options turned off by default, because the default can be reversed
- in `.wgetrc'. Get rid of `--foo=no'. Short options would be
- handled as `-x' vs. `-nx'.
+* Add option to clobber existing file names (no `.N' suffixes).
-* Add option to only list wildcard matches without doing the download.
+* Add option to only list wildcard matches without doing the download. The same
+ could be generalized to support something like apt's --print-uri.
* Handle MIME types correctly. There should be an option to (not)
retrieve files based on MIME types, e.g. `--accept-types=image/*'.
+ This would work for FTP by translating file extensions to MIME types
+ using mime.types.
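The matching could work as in this sketch (Python for brevity; the table stands in for a real mime.types parser, and the helper name is hypothetical):

```python
from fnmatch import fnmatch

# Tiny stand-in for an /etc/mime.types lookup; a real implementation
# would parse that file instead of hard-coding entries.
EXTENSION_TYPES = {
    ".gif": "image/gif",
    ".jpg": "image/jpeg",
    ".html": "text/html",
}

def accepted(url_path, patterns):
    """True when the MIME type guessed from the file extension matches
    any --accept-types pattern such as 'image/*'."""
    for ext, mime in EXTENSION_TYPES.items():
        if url_path.endswith(ext):
            return any(fnmatch(mime, pat) for pat in patterns)
    return False  # unknown extension: type cannot be determined
```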
-* Allow time-stamping by arbitrary date.
+* Allow time-stamping by arbitrary date. For example,
+ wget --if-modified-after DATE URL.
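For HTTP this maps directly onto the If-Modified-Since header, as in this sketch (Python for brevity; the option name is the hypothetical one above):

```python
from email.utils import formatdate

def if_modified_since(epoch_seconds):
    """Render the cutoff date of a hypothetical --if-modified-after
    option as an If-Modified-Since header line (RFC 1123 date,
    always in GMT as HTTP requires)."""
    return "If-Modified-Since: " + formatdate(epoch_seconds, usegmt=True)
```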
-* Allow size limit to files (perhaps with an option to download oversize files
- up through the limit or not at all, to get more functionality than [u]limit.
+* Make quota apply to single files, preferably so that the download of an
+ oversized file is not attempted at all.
-* Download to .in* when mirroring.
+* When updating an existing mirror, download to temporary files (such as .in*)
+ and rename the file after the download is done.
* Add an option to delete or move no-longer-existent files when mirroring.
-* Implement uploading (--upload URL?) in FTP and HTTP.
-
-* Rewrite FTP code to allow for easy addition of new commands. It
- should probably be coded as a simple DFA engine.
+* Implement uploading (--upload=FILE URL?) in FTP and HTTP. A beginning of
+ this is available in the form of --post-file, but it should be expanded to
+ be really useful.
* Make HTTP timestamping use If-Modified-Since facility.
-* Implement better spider options.
-
-* Add more protocols (e.g. gopher and news), implementing them in a
- modular fashion.
-
-* Implement a concept of "packages" a la mirror.
+* Add more protocols (such as news or possibly some of the streaming
+ protocols), implementing them in a modular fashion.
* Add a "rollback" option to have continued retrieval throw away a
configurable number of bytes at the end of a file before resuming
download. Apparently, some stupid proxies insert a "transfer
interrupted" string we need to get rid of.
-
-* When using --accept and --reject, you can end up with empty directories. Have
- Wget any such at the end.