may tend towards the top). Not all of these represent user-visible
changes.
-* It would be nice to have a simple man page for wget that refers you to the
- .info files for more information. It could be as simple as the output of wget
- --help plus some boilerplate. This should stop wget re-packagers like RedHat
- who include the out-of-date 1.4.5 man page in order to have one. Perhaps we
- can automatically generate a man page from the .texi file like gcc does?
+* Only normal link-following recursion should respect -np. Page-requisite
+ recursion should not. When -np -p is specified, Wget should still retrieve
+ requisite images and such on the server, even if they aren't in that directory
+ or a subdirectory of it. Likewise, -H -np -p should retrieve requisite files
+ from other hosts.
+
+* Add a --range parameter allowing you to explicitly specify a range of bytes to
+ get from a file over HTTP (FTP only supports ranges ending at the end of the
+ file, though forcibly disconnecting from the server at the desired endpoint
+ might be workable).
+
+* RFC 1738 says that if logging on to an FTP server puts you in a directory
+ other than '/', the way to specify a file relative to '/' in a URL (let's use
+ "/bin/ls" in this example) is "ftp://host/%2Fbin/ls". Wget needs to support
+ this (and ideally not consider "ftp://host//bin/ls" to be equivalent, as that
+  would equate to the command "CWD " rather than "CWD /"). To accommodate people
+ used to broken FTP clients like Internet Explorer and Netscape, if
+ "ftp://host/bin/ls" doesn't exist, Wget should try again (perhaps under
+ control of an option), acting as if the user had typed "ftp://host/%2Fbin/ls".
+
+* If multiple FTP URLs are specified that are on the same host, Wget should
+ re-use the connection rather than opening a new one for each file.
* Try to devise a scheme so that, when a password is unknown, Wget asks
the user for one.
* Make wget return non-zero status in more situations, like incorrect HTTP auth.
-* Timestamps are sometimes not copied over on files retrieved by FTP.
-
* Make -K compare X.orig to X and move the former on top of the latter if
  they're the same, rather than leaving identical .orig files lying around.
* Make `-k' check for files that were downloaded in the past and convert links
to them in newly-downloaded documents.
+* -k currently converts non-fully-qualified (i.e. relative or "hostless
+ absolute") links to files that weren't downloaded into fully-qualified URLs.
+ When files _are_ downloaded, though, absolute links to them (hostless or
+ with-host) should be converted to relative.
+
* Add option to clobber existing file names (no `.N' suffixes).
* Introduce a concept of "boolean" options. For instance, every boolean
  option should have a negating counterpart (e.g. `--no-foo' for `--foo')
  for turning it off.
* Rewrite FTP code to allow for easy addition of new commands. It
should probably be coded as a simple DFA engine.
-* Recognize more FTP servers (VMS).
-
* Make HTTP timestamping use If-Modified-Since facility.
* Implement better spider options.