Hey Emacs, this is -*- outline -*- mode

This is the to-do list for GNU Wget. There is no timetable of when we
plan to implement these features -- this is just a list of features
we'd like to see in Wget, as well as a list of problems that need
fixing. Patches to implement these items are likely to be accepted,
especially if they follow the coding convention outlined in PATCHES
and if they patch the documentation as well.

The items are not listed in any particular order (except that
recently-added items may tend towards the top). Not all of these
represent user-visible changes.

* Honor `Content-Disposition: XXX; filename="FILE"' when creating the
  file name. If possible, try not to break `-nc' and friends when
  doing so.

* Should allow retries with multiple downloads when using -O on
  regular files. As the source comment says: "A possible solution to
  [rewind not working with multiple downloads] would be to remember
  the file position in the output document and to seek to that
  position, instead of rewinding."

  But the above won't work for -O /dev/stdout, when stdout is a pipe.
  An even better solution would be to simply keep writing to the same
  file descriptor each time, instead of reopening it in append mode.

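The offset-remembering approach could look roughly like this in C
(`begin_attempt' and `retry_attempt' are illustrative names, not
functions in Wget's source; a real patch would live in the retrieval
loop):

```c
#include <stdio.h>

/* Sketch only: record the output position before each download
   attempt, and seek back to it on retry instead of rewinding the
   whole document.  When the stream is not seekable (e.g. stdout is
   a pipe), ftell returns -1 and there is nothing to restore.  */

static long attempt_start = -1;  /* offset at start of current attempt */

void
begin_attempt (FILE *out)
{
  attempt_start = ftell (out);
}

int
retry_attempt (FILE *out)
{
  if (attempt_start < 0)
    return -1;                  /* not seekable: cannot roll back */
  return fseek (out, attempt_start, SEEK_SET);
}
```

Keeping one file descriptor open for the whole run, as suggested
above, would avoid even this bookkeeping for the regular-file case.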
* Wget shouldn't delete rejected files that were not downloaded, but
  just found on disk because of `-nc'. For example, `wget -r -nc
  -A.gif URL' should allow the user to get all the GIFs without
  removing any of the existing HTML files.

* Be careful not to lose username/password information given for the
  URL on the command line.

* Add a --range parameter allowing you to explicitly specify a range
  of bytes to get from a file over HTTP (FTP only supports ranges
  ending at the end of the file, though forcibly disconnecting from
  the server at the desired endpoint might be workable).

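On the HTTP side, such an option would boil down to emitting a
standard Range request header (byte ranges are inclusive on both
ends). A minimal sketch, with `format_range_header' as a hypothetical
helper -- neither it nor --range exists in Wget today:

```c
#include <stdio.h>
#include <string.h>

/* Sketch only: format an HTTP/1.1 Range header for an explicit
   byte range, as a hypothetical --range=FIRST-LAST option might
   send it.  Returns the number of characters written, as snprintf
   does.  */
int
format_range_header (char *buf, size_t bufsize,
                     long first_byte, long last_byte)
{
  return snprintf (buf, bufsize, "Range: bytes=%ld-%ld\r\n",
                   first_byte, last_byte);
}
```

The server answers 206 Partial Content when it honors the range, so
the response code would also need checking.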
* If multiple FTP URLs are specified that are on the same host, Wget
  should re-use the connection rather than opening a new one for each
  file.

* Try to devise a scheme so that, when password is unknown, Wget asks
  the user for it.

* If -c is used with -N, check to make sure a file hasn't changed on
  the server before "continuing" to download it (preventing a bogus
  hybrid file).

* Generalize --html-extension to something like --mime-extensions and
  have it look at the mime.types/mailcap files for the preferred
  extension. Non-HTML files with filenames changed this way would be
  re-downloaded each time despite -N unless .orig files were saved
  for them. Since .orig would contain the same data as non-.orig,
  the latter could be just a link to the former. Another possibility
  would be to implement a per-directory database called something
  like .wget_url_mapping containing URLs and their corresponding
  filenames.

* When spanning hosts, there's no way to say that you are only
  interested in files in a certain directory on _one_ of the hosts
  (-I and -X apply to all). Perhaps -I and -X should take an
  optional hostname before the directory?

* --retr-symlinks should cause wget to traverse links to directories
  too.

* Make wget return non-zero status in more situations, like incorrect
  HTTP auth.

* Make -K compare X.orig to X and move the former on top of the
  latter if they're the same, rather than leaving identical .orig
  files lying around.

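The mechanics are simple: a byte-wise comparison, then a rename.
A sketch under illustrative names (`collapse_backup' is not a Wget
function):

```c
#include <stdio.h>

/* Sketch only: if the .orig backup and the converted file have
   identical contents, fold the backup back over the file instead
   of keeping a redundant copy.  */

static int
files_identical (const char *a, const char *b)
{
  FILE *fa = fopen (a, "rb"), *fb = fopen (b, "rb");
  int same = 0;
  if (fa && fb)
    {
      int ca, cb;
      do
        {
          ca = getc (fa);
          cb = getc (fb);
        }
      while (ca == cb && ca != EOF);
      same = (ca == cb);        /* both hit EOF at the same point */
    }
  if (fa) fclose (fa);
  if (fb) fclose (fb);
  return same;
}

int
collapse_backup (const char *orig, const char *converted)
{
  if (!files_identical (orig, converted))
    return 1;                      /* contents differ: keep both */
  return rename (orig, converted); /* identical: fold .orig back in */
}
```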
* Make `-k' check for files that were downloaded in the past and
  convert links to them in newly-downloaded documents.

* Add option to clobber existing file names (no `.N' suffixes).

* Add option to only list wildcard matches without doing the
  download.

* Handle MIME types correctly. There should be an option to (not)
  retrieve files based on MIME types, e.g. `--accept-types=image/*'.

* Allow time-stamping by arbitrary date.

* Allow size limit to files (perhaps with an option to download
  oversize files up through the limit or not at all, to get more
  functionality than [u]limit).

* Download to .in* when mirroring.

* Add an option to delete or move no-longer-existent files when
  mirroring.

* Implement uploading (--upload URL?) in FTP and HTTP.

* Rewrite FTP code to allow for easy addition of new commands. It
  should probably be coded as a simple DFA engine.

* Make HTTP timestamping use If-Modified-Since facility.

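Concretely, that means sending the local file's mtime as an HTTP-date
and letting the server answer 304 Not Modified when the remote copy
is not newer. A sketch (`format_ims_header' is a hypothetical helper,
not existing Wget code; the date format is the RFC 1123 form HTTP
requires):

```c
#include <stdio.h>
#include <string.h>
#include <time.h>

/* Sketch only: format a file's modification time as an
   If-Modified-Since request header.  Returns -1 on formatting
   failure, else the number of characters written.  */
int
format_ims_header (char *buf, size_t bufsize, time_t mtime)
{
  struct tm *gmt = gmtime (&mtime);
  char date[64];
  if (!gmt || strftime (date, sizeof date,
                        "%a, %d %b %Y %H:%M:%S GMT", gmt) == 0)
    return -1;
  return snprintf (buf, bufsize, "If-Modified-Since: %s\r\n", date);
}
```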
* Add more protocols (e.g. gopher and news), implementing them in a
  modular fashion.

* Add a "rollback" option to have continued retrieval throw away a
  configurable number of bytes at the end of a file before resuming
  download. Apparently, some stupid proxies insert a "transfer
  interrupted" string we need to get rid of.