Wget has been designed for robustness over slow or unstable network connections; if a download fails due to a network problem, it will keep retrying until the whole file has been retrieved.

| Option | Description |
| ----------------------- | ----------- |
| `-e, --execute command` | Execute command as if it were a part of .wgetrc. A command thus invoked will be executed after the commands in .wgetrc, thus taking precedence over them. If you need to specify more than one wgetrc command, use multiple instances of -e. |
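
A handy use of `-e` is toggling a `.wgetrc` setting for a single run, e.g. the `robots` command (the URL below is a placeholder):

```sh
# disable robots.txt handling for this invocation only
wget -e robots=off --mirror https://example.com/docs/
```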
### Logging Options
| Option | Description |
| ----------------------------- | ----------- |
| `-o, --output-file=logfile` | Log all messages to logfile. The messages are normally reported to standard error. |
| `-a, --append-output=logfile` | Append to logfile. This is the same as -o, only it appends to logfile instead of overwriting the old [log file](../../../dev/Log.md). |
| `-q, --quiet` | Turn off Wget's output. |
| `-i, --input-file=file` | Read URLs from a local or external file. If - is specified as file, URLs are read from the standard input. (Use ./- to read from a file literally named -.) If this function is used, no URLs need be present on the command line. If there are URLs both on the command line and in an input file, those on the command line will be the first ones to be retrieved. If --force-html is not specified, then file should consist of a series of URLs, one per line. However, if you specify --force-html, the document will be regarded as [html](../../../internet/HTML.md). In that case you may have problems with relative links, which you can solve either by adding "\<base href="url">" to the documents or by specifying --base=url on the command line. If the file is an external one, the document will be automatically treated as [html](../../../internet/HTML.md) if the Content-Type matches text/html. Furthermore, the file's location will be implicitly used as base href if none was specified. |
| `-B, --base=URL` | Resolves relative links using URL as the point of reference, when reading links from an [HTML](../../../internet/HTML.md) file specified via the -i/--input-file option (together with --force-html, or when the input file was fetched remotely from a server describing it as [HTML](../../../internet/HTML.md)). This is equivalent to the presence of a "BASE" tag in the [HTML](../../../internet/HTML.md) input file, with URL as the value for the "href" attribute. |
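
A common logging setup is appending everything to one log while reading URLs from a list (file names here are placeholders):

```sh
# fetch every URL in urls.txt (one per line),
# appending all of wget's messages to wget.log
wget -a wget.log -i urls.txt
```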
### Download Options

| Option | Description |
| ---------------------------------- | ----------- |
| `-t, --tries=number` | Set number of tries to number. Specify 0 or inf for infinite retrying. The default is to retry 20 times, with the exception of fatal errors like "connection refused" or "not found" (404), which are not retried. |
| `-O, --output-document=file` | The documents will not be written to the appropriate files, but all will be concatenated together and written to file. If - is used as file, documents will be printed to standard output, disabling link conversion. (Use ./- to print to a file literally named -.) |
| `--backups=backups` | Before (over)writing a file, back up an existing file by adding a .1 suffix (\_1 on VMS) to the file name. Such backup files are rotated to .2, .3, and so on, up to `backups` (and lost beyond that). |
| `-c, --continue` | Continue getting a partially-downloaded file. This is useful when you want to finish up a download started by a previous instance of Wget, or by another program. |
| `--show-progress` | Force wget to display the progress bar in any verbosity. |
| `-w, --wait=seconds` | Wait the specified number of seconds between the retrievals. Use of this option is recommended, as it lightens the server load by making the requests less frequent. Instead of in seconds, the time can be specified in minutes using the "m" suffix, in hours using "h" suffix, or in days using "d" suffix. |
| `--waitretry=seconds` | If you don't want Wget to wait between every retrieval, but only between retries of failed downloads, you can use this option. Wget will use linear backoff, waiting 1 second after the first failure on a given file, then waiting 2 seconds after the second failure on that file, up to the maximum number of seconds you specify. |
| `--random-wait` | Some web sites may perform log analysis to identify retrieval programs such as Wget by looking for statistically significant similarities in the time between requests. This option causes the time between requests to vary between 0.5 and 1.5 * wait seconds, where wait was specified using the --wait option, in order to mask Wget's presence from such analysis. |
| `--user=user, --password=password` | Specify the username and password for both [FTP](../../../internet/FTP.md) and [HTTP](../../../internet/HTTP.md) file retrieval. |
| `--ask-password` | Prompt for a password for each connection established. |
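
Putting several of these together, resuming a large download politely (the URL is a placeholder):

```sh
# resume a partial download, retry indefinitely on transient errors,
# back off up to 30s between retries, and pause 1-3s between requests
wget -c --tries=inf --waitretry=30 --wait=2 --random-wait \
     https://example.com/pub/disk-image.iso
```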
### Directory Options

### HTTP Options

| Option | Description |
| ---------------------------------------------- | ----------- |
| `--no-cookies` | Disable the use of cookies. |
| `--load-cookies file` | Load cookies from file before the first [HTTP](../../../internet/HTTP.md) retrieval. file is a textual file in the format originally used by Netscape's cookies.txt file. |
| `--save-cookies file` | Save cookies to file before exiting. This will not save cookies that have expired or that have no expiry time (so-called "session cookies"), but also see --keep-session-cookies. |
| `--keep-session-cookies` | When specified, causes --save-cookies to also save session cookies. Session cookies are normally not saved because they are meant to be kept in memory and forgotten when you exit the browser. Saving them is useful on sites that require you to log in or to visit the home page before you can access some pages. With this option, multiple Wget runs are considered a single browser session as far as the site is concerned. |
| `--header=header-line` | Send header-line along with the rest of the headers in each [HTTP](../../../internet/HTTP.md) request. The supplied header is sent as-is, which means it must contain name and value separated by a colon, and must not contain newlines. |
| `--proxy-user=user, --proxy-password=password` | Specify the username user and password password for authentication on a proxy server. Wget will encode them using the "basic" authentication scheme. |
| `--referer=url` | Include 'Referer: url' header in [HTTP](../../../internet/HTTP.md) request. Useful for retrieving documents with server-side processing that assume they are always being retrieved by interactive web browsers and only come out properly when Referer is set to one of the pages that point to them. |
| `-U, --user-agent=agent-string` | Identify as `agent-string` to the [HTTP](../../../internet/HTTP.md) server. |
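
A typical cookie workflow, following the pattern from the wget manual (the URL, form fields, and file names below are placeholders; `--post-data` is a further HTTP option not listed above):

```sh
# log in once and persist the session cookie to disk
wget --save-cookies cookies.txt --keep-session-cookies \
     --post-data 'user=alice&password=secret' \
     https://example.com/login.php

# reuse the saved session for authenticated downloads
wget --load-cookies cookies.txt https://example.com/members/report.pdf
```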
### HTTPS Options

| Option | Description |
| ------ | ----------- |