restructure

JMARyA 2024-01-17 09:00:45 +01:00
parent ef7661245b
commit 598a10bc28
Signed by: jmarya
GPG key ID: 901B2ADDF27C2263
182 changed files with 342 additions and 336 deletions

---
obj: application
os:
- linux
website: https://aria2.github.io/
repo: https://github.com/aria2/aria2
---
# aria2
[aria2](https://aria2.github.io/) is a utility for downloading files. The supported protocols are [HTTP](../../../internet/HTTP.md)(S), [FTP](../../../internet/FTP.md), SFTP, [BitTorrent](../../../internet/BitTorrent.md), and Metalink. aria2 can download a file from multiple sources/protocols and tries to utilize your maximum download bandwidth. It supports downloading a file from [HTTP](../../../internet/HTTP.md)(S)/[FTP](../../../internet/FTP.md)/SFTP and [BitTorrent](../../../internet/BitTorrent.md) at the same time, while the data downloaded from [HTTP](../../../internet/HTTP.md)(S)/[FTP](../../../internet/FTP.md)/SFTP is uploaded to the [BitTorrent](../../../internet/BitTorrent.md) swarm. Using Metalink's chunk checksums, aria2 automatically validates chunks of data while downloading a file like [BitTorrent](../../../internet/BitTorrent.md). Aria2 can be used as a downloader by [yt-dlp](../../media/yt-dlp.md).
## Usage
```shell
aria2c [<OPTIONS>] [<URI>|<MAGNET>|<TORRENT_FILE>|<METALINK_FILE>]
```
### Options
| Option | Description |
| -------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `-d, --dir=<DIR>` | The directory to store the downloaded file. |
| `-i, --input-file=<FILE>` | Downloads the URIs listed in **FILE**. |
| `-l, --log=<LOG>` | The file name of the log file. If - is specified, log is written to stdout. If empty string("") is specified, or this option is omitted, no log is written to disk at all. |
| `-j, --max-concurrent-downloads=<N>` | Set the maximum number of parallel downloads for every queue item. |
| `-V, --check-integrity [true/false]` | Check file integrity by validating piece hashes or a hash of the entire file. This option has effect only in [BitTorrent](../../../internet/BitTorrent.md), Metalink downloads with checksums or [HTTP](../../../internet/HTTP.md)(S)/[FTP](../../../internet/FTP.md) downloads with the --checksum option. If piece hashes are provided, this option can detect damaged portions of a file and re-download them. If a hash of the entire file is provided, the hash check is only done once the file has already been downloaded. This is determined by file length. If the hash check fails, the file is re-downloaded from scratch. If both piece hashes and a hash of the entire file are provided, only piece hashes are used. Default: false |
| `-c, --continue [true/false]` | Continue downloading a partially downloaded file. Use this option to resume a download started by a web browser or another program which downloads files sequentially from the beginning. Currently this option is only applicable to [HTTP](../../../internet/HTTP.md)(S)/[FTP](../../../internet/FTP.md) downloads. |
| `--checksum=<TYPE>=<DIGEST>` | Set checksum. TYPE is hash type. The supported hash type is listed in Hash Algorithms in aria2c -v. DIGEST is hex digest. For example, setting sha-1 digest looks like this: `sha-1=0192ba11326fe2298c8cb4de616f4d4140213838` This option applies only to [HTTP](../../../internet/HTTP.md)(S)/[FTP](../../../internet/FTP.md) downloads. |
| `-x, --max-connection-per-server=<NUM>` | The maximum number of connections to one server for each download.  Default: **1** |
| `-k, --min-split-size=<SIZE>` | aria2 does not split byte ranges smaller than 2\*SIZE. For example, consider downloading a 20MiB file. If SIZE is 10M, aria2 can split the file into 2 ranges (0-10MiB) and (10MiB-20MiB) and download it using 2 sources (if --split >= 2, of course). If SIZE is 15M, since 2\*15M > 20MiB, aria2 does not split the file and downloads it using 1 source. You can append K or M (1K = 1024, 1M = 1024K). Possible Values: 1M-1024M Default: 20M |
| `-o, --out=<FILE>` | The file name of the downloaded file. It is always relative to the directory given in `--dir` option. |
| `-s, --split=<N>` | Download a file using N connections. If more than N URIs are given, first N URIs are used and remaining URIs are used for backup. If less than N URIs are given, those URIs are used more than once so that N connections total are made simultaneously. The number of connections to the same host is restricted by the `--max-connection-per-server` option. See also the `--min-split-size` option. Default: 5 |
| `-t, --timeout=<SEC>` | Set timeout in seconds. Default: 60 |
| `--check-certificate [true/false]` | Verify the peer using certificates specified in `--ca-certificate` option. Default: true |
| `--http-user=<USER>, --http-passwd=<PASSWD>` | Credentials for [HTTP](../../../internet/HTTP.md) Auth |
| `--header=<HEADER>` | Append HEADER to [HTTP](../../../internet/HTTP.md) request header. You can use this option repeatedly to specify more than one header:<br>`aria2c --header="X-A: b78" --header="X-B: 9J1" "http://host/file"` |
| `--load-cookies=<FILE>` | Load Cookies from FILE using the Firefox3 format (SQLite3), Chromium/Google Chrome (SQLite3) and the Mozilla/[Firefox](../../network/browsers/Firefox.md)(1.x/2.x)/Netscape format. |
| `--save-cookies=<FILE>` | Save Cookies to FILE in Mozilla/[Firefox](../../network/browsers/Firefox.md)(1.x/2.x)/ Netscape format. If FILE already exists, it is overwritten. Session Cookies are also saved and their expiry values are treated as 0. |
| `-U, --user-agent=<USER_AGENT>` | Set user agent for [HTTP](../../../internet/HTTP.md)(S) downloads. Default: `aria2/$VERSION`, $VERSION is replaced by package version. |
| `-S, --show-files [true/false]` | Print file listing of ".torrent", ".meta4" and ".metalink" file and exit. In case of ".torrent" file, additional information (infohash, piece length, etc) is also printed. |
| `--select-file=<INDEX>...` | (Torrent) Set file to download by specifying its index. You can find the file index using the `--show-files` option. Multiple indexes can be specified by using `,`, for example: `3,6`. You can also use `-` to specify a range: `1-5`. `,` and `-` can be used together: `1-5,8,9`. |
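
A couple of example invocations, using only the options documented above (the URL, directory and file names are placeholders):
```shell
# Download via 4 connections per server, split into 4 parts,
# saving to ~/Downloads under a custom name
aria2c -x 4 -s 4 -d ~/Downloads -o archive.tar.gz "https://example.com/archive.tar.gz"

# Resume a partial download and verify it against a known SHA-1 digest
aria2c -c --checksum=sha-1=0192ba11326fe2298c8cb4de616f4d4140213838 "https://example.com/archive.tar.gz"
```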

---
obj: application
os: linux
website: https://curl.se/
repo: https://github.com/curl/curl
---
# curl
cURL is a command-line tool and library for transferring data with URLs. It supports a wide range of protocols, making it a versatile tool for making [HTTP](../../../internet/HTTP.md) requests, downloading files, and more.
## Usage
To make a simple GET request: `curl https://example.com`
## Options
| Option | Description |
| -------------------------------------- | ------------------------------------------------------------------------------------- |
| `-C, --continue-at <offset>` | Continue/Resume a previous file transfer at the given offset. |
| `-c, --cookie-jar <filename>` | Specify to which file you want curl to write all cookies after a completed operation. |
| `-b, --cookie <data/filename>` | Pass the data to the [HTTP](../../../internet/HTTP.md) server in the [Cookie](../../../internet/Cookie.md) header. |
| `-d, --data <data>` | Sends the specified data in a POST request to the [HTTP](../../../internet/HTTP.md) server |
| `-F, --form <name=content>` | Specify multipart MIME data |
| `-k, --insecure` | Allow insecure server connections when using SSL |
| `-L, --location` | Follow redirects |
| `-o, --output <file>` | Write to file instead of stdout |
| `-x, --proxy [protocol://]host[:port]` | Use this proxy |
| `-X, --request <command>` | Specify request command to use |
| `-r, --range <range>` | Retrieve only the bytes within RANGE |
| `--retry <num>` | Retry request if transient problems occur |
| `-s, --silent` | Silent mode |
| `--retry-delay <seconds>` | Wait time between retries |
| `-u, --user <user:password>` | Server user and password |
| `-A, --user-agent <name>` | Send User-Agent \<name> to server |
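
A couple of sketches combining the flags above (URL and credentials are placeholders):
```shell
# Follow redirects and write the response body to a file instead of stdout
curl -L -o page.html "https://example.com"

# POST form-encoded data with basic auth, retrying up to 3 times on transient errors
curl -X POST -d "name=value" -u user:password --retry 3 "https://example.com/api"
```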

---
obj: application
wiki: https://en.wikipedia.org/wiki/Netcat
---
# netcat
The `nc` (or `netcat`) utility is used for just about anything under the sun involving [TCP](../../../internet/TCP.md), [UDP](../../../internet/UDP.md), or UNIX-domain sockets. It can open [TCP](../../../internet/TCP.md) connections, send [UDP](../../../internet/UDP.md) packets, listen on arbitrary [TCP](../../../internet/TCP.md) and [UDP](../../../internet/UDP.md) ports, do port scanning, and deal with both IPv4 and IPv6.
Common uses include:
- simple [TCP](../../../internet/TCP.md) proxies
- shell-script based [HTTP](../../../internet/HTTP.md) clients and servers
- network daemon testing
- a SOCKS or [HTTP](../../../internet/HTTP.md) ProxyCommand for [ssh](../../network/SSH.md)
## Options
| Option | Description |
| ------------------ | --------------------------------------------------------------------------------------------------- |
| `-4` | Use IPv4 addresses only |
| `-6` | Use IPv6 addresses only |
| `-b` | Allow broadcast |
| `-l` | Listen for an incoming connection rather than initiating a connection to a remote host |
| `-N`               | Shut down the network socket after EOF on the input. Some servers require this to finish their work  |
| `-p <source_port>` | Specify the source port `nc` should use, subject to privilege restrictions and availability |
## Examples
### Client/Server Model
On one console, start `nc` listening on a specific port for a connection. For example:
```shell
nc -l 1234
```
`nc` is now listening on port 1234 for a connection. On a second console (or a second machine), connect to the machine and port being listened on:
```shell
nc -N 127.0.0.1 1234
```
There should now be a connection between the ports. Anything typed at the second console will be concatenated to the first, and vice-versa. After the connection has been set up, `nc` does not really care which side is being used as a server and which side is being used as a client. The connection may be terminated using an `EOF` (`^D`), as the `-N` flag was given.
### Data Transfer
The example in the previous section can be expanded to build a basic data transfer model. Any information input into one end of the connection will be output to the other end, and input and output can be easily captured in order to emulate file transfer.
Start by using `nc` to listen on a specific port, with output captured into a file:
```shell
nc -l 1234 > filename.out
```
Using a second machine, connect to the listening `nc` process, feeding it the file which is to be transferred:
```shell
nc -N host.example.com 1234 < filename.in
```
### Talking to Servers
It is sometimes useful to talk to servers “by hand” rather than through a user interface. It can aid in troubleshooting, when it might be necessary to verify what data a server is sending in response to commands issued by the client. For example, to retrieve the home page of a web site:
```shell
printf "GET / HTTP/1.0\r\n\r\n" | nc host.example.com 80
```
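### Port Scanning
Since port scanning is mentioned above, here is a minimal sketch. Note that `-z` (scan without sending any data) and `-v` (verbose) are not part of the options table above and their availability varies between netcat implementations (OpenBSD netcat, GNU netcat, ncat):
```shell
# Report which TCP ports in the range 20-80 are listening on the target
nc -zv host.example.com 20-80
```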

---
obj: application
source: https://www.kali.org/tools/netdiscover
repo: https://github.com/netdiscover-scanner/netdiscover
---
# netdiscover
Netdiscover is an active/passive address reconnaissance tool, mainly developed for wireless networks without a [DHCP](../../../internet/DHCP.md) server, for example when wardriving. It can also be used on hub/switched networks.
Built on top of libnet and libpcap, it can passively detect online hosts, or search for them, by actively sending ARP requests.
Netdiscover can also be used to inspect your network ARP traffic, or find network addresses using auto scan mode, which will scan for common local networks.
Netdiscover uses the OUI table to show the vendor of each discovered MAC address and is very useful for security checks or in pentests.
## Options
| Option | Description |
| ------------ | -------------------------------------------------------------------------------------------- |
| `-i device` | network device used |
| `-r range`  | scan a given range instead of auto scan. 192.168.6.0/24,/16,/8 |
| `-l file`   | scan the list of ranges contained in the given file |
| `-p`        | passive mode, do not send anything, only sniff |
| `-m file`   | scan a list of known MACs and host names |
| `-F filter` | customize pcap filter expression (default: "arp") |
| `-s time`   | time to sleep between each ARP request (milliseconds) |
| `-c count`  | number of times to send each ARP request (for nets with packet loss) |
| `-n node`   | last source IP octet used for scanning (from 2 to 253) |
| `-d`        | ignore home config files for autoscan and fast mode |
| `-f`        | enable fastmode scan, saves a lot of time, recommended for auto |
| `-P`        | print results in a format suitable for parsing by another program and stop after active scan |
| `-L`        | similar to `-P` but continue listening after the active scan is completed |
| `-N`        | Do not print header. Only valid when `-P` or `-L` is enabled. |
| `-S`        | enable sleep time suppression between each request (hardcore mode) |
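
Example invocations using the options above (the interface name and range are placeholders; netdiscover generally needs root privileges):
```shell
# Actively ARP-scan a given range on a specific interface
sudo netdiscover -i wlan0 -r 192.168.1.0/24

# Passive mode: only sniff ARP traffic, never send anything
sudo netdiscover -p -i wlan0
```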

---
obj: application
website: https://nmap.org
repo: https://github.com/nmap/nmap
---
# nmap
Network exploration tool and security / port scanner
## Usage
Usage: `nmap [Scan Type(s)] [Options] {target specification}`
### Options
#### TARGET SPECIFICATION
Can pass hostnames, IP addresses, networks, etc.
Ex: scanme.nmap.org, 192.168.0.1; 10.0.0-255.1-254
| Option | Description |
| --------------------------------------- | --------------------------------- |
| `-iL <inputfilename>` | Input from list of hosts/networks |
| `--exclude <host1[,host2][,host3],...>` | Exclude hosts/networks |
| `--excludefile <exclude_file>` | Exclude list from file |
#### HOST DISCOVERY
| Option | Description |
| ----------------------------------- | --------------------------------------------------------------------------------------------------- |
| `-sL` | List Scan - simply list targets to scan |
| `-sn` | Ping Scan - disable port scan |
| `-PS/PA/PU/PY[portlist]` | [TCP](../../../internet/TCP.md) SYN/ACK, [UDP](../../../internet/UDP.md) or SCTP discovery to given ports |
| `-PE/PP/PM` | ICMP echo, timestamp, and netmask request discovery probes |
| `-n/-R` | Never do [DNS](../../../internet/DNS.md) resolution/Always resolve \[default: sometimes] |
| `--dns-servers <serv1[,serv2],...>` | Specify custom [DNS](../../../internet/DNS.md) servers |
| `--traceroute` | Trace hop path to each host |
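
For example, a discovery-only sweep (no port scan) of a local network could look like this, with the target range as a placeholder:
```shell
# Ping scan: ICMP echo plus TCP SYN probes to ports 22, 80 and 443
nmap -sn -PE -PS22,80,443 192.168.1.0/24
```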
#### SCAN TECHNIQUES
| Option | Description |
| --------------------- | ------------------------------------------------------------------ |
| `-sS/sT/sA/sW/sM` | [TCP](../../../internet/TCP.md) SYN/Connect()/ACK/Window/Maimon scans |
| `-sU` | [UDP](../../../internet/UDP.md) Scan |
| `-sN/sF/sX` | [TCP](../../../internet/TCP.md) Null, FIN, and Xmas scans |
| `--scanflags <flags>` | Customize [TCP](../../../internet/TCP.md) scan flags |
| `-sO` | IP protocol scan |
#### PORT SPECIFICATION AND SCAN ORDER
| Option | Description |
| ------------------------------- | --------------------------------------------------------------------------------------------- |
| `-p <port ranges>` | Only scan specified ports. Ex: `-p22`; `-p1-65535`; `-p U:53,111,137,T:21-25,80,139,8080,S:9` |
| `--exclude-ports <port ranges>` | Exclude the specified ports from scanning |
| `-F` | Fast mode - Scan fewer ports than the default scan |
| `-r` | Scan ports sequentially - don't randomize |
| `--top-ports <number>`          | Scan \<number> most common ports |
#### SERVICE/VERSION DETECTION
| Option | Description |
| ----------------------------- | -------------------------------------------------- |
| `-sV` | Probe open ports to determine service/version info |
| `--version-intensity <level>` | Set from 0 (light) to 9 (try all probes) |
| `--version-light` | Limit to most likely probes (intensity 2) |
| `--version-all` | Try every single probe (intensity 9) |
#### SCRIPT SCAN
| Option | Description |
| ------------------------ | ------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `-sC` | equivalent to `--script=default` |
| `--script=<Lua scripts>` | \<Lua scripts> is a comma separated list of directories, script-files or script-categories. The scripts are commonly found at `/usr/share/nmap/scripts` |
| `--script-updatedb` | Update the script database. |
#### OS DETECTION
| Option | Description |
| ---------------- | --------------------------------------- |
| `-O` | Enable OS detection |
| `--osscan-limit` | Limit OS detection to promising targets |
| `--osscan-guess` | Guess OS more aggressively |
#### TIMING AND PERFORMANCE
Options which take \<time> are in seconds, or append 'ms' (milliseconds), 's' (seconds), 'm' (minutes), or 'h' (hours) to the value (e.g. 30m).
| Option | Descriptions |
| -------------------------------------------------------------- | ------------------------------------------------ |
| `-T<0-5>` | Set timing template (higher is faster) |
| `--min-hostgroup/max-hostgroup <size>` | Parallel host scan group sizes |
| `--min-parallelism/max-parallelism <numprobes>` | Probe parallelization |
| `--min-rtt-timeout/max-rtt-timeout/initial-rtt-timeout <time>` | Specifies probe round trip time. |
| `--max-retries <tries>` | Caps number of port scan probe retransmissions. |
| `--host-timeout <time>` | Give up on target after this long |
| `--scan-delay/--max-scan-delay <time>` | Adjust delay between probes |
| `--min-rate <number>` | Send packets no slower than \<number> per second |
| `--max-rate <number>` | Send packets no faster than \<number> per second |
#### FIREWALL/IDS EVASION AND SPOOFING
| Option | Description |
| ---------------------------------------------- | ------------------------------------------------------------------------------------------------- |
| `-f; --mtu <val>` | fragment packets (optionally w/given MTU) |
| `-D <decoy1,decoy2[,ME],...>` | Cloak a scan with IP decoys |
| `-S <IP_Address>` | Spoof source address |
| `-e <iface>` | Use specified interface |
| `-g/--source-port <portnum>` | Use given port number |
| `--proxies <url1,[url2],...>` | Relay connections through [HTTP](../../../internet/HTTP.md)/SOCKS4 proxies |
| `--data <hex string>` | Append a custom payload to sent packets |
| `--data-string <string>` | Append a custom [ASCII](../../../files/ASCII.md) string to sent packets |
| `--data-length <num>` | Append random data to sent packets |
| `--ip-options <options>` | Send packets with specified ip options |
| `--ttl <val>` | Set IP time-to-live field |
| `--spoof-mac <mac address/prefix/vendor name>` | Spoof your MAC address |
| `--badsum` | Send packets with a bogus [TCP](../../../internet/TCP.md)/[UDP](../../../internet/UDP.md)/SCTP checksum |
#### OUTPUT
| Option | Description |
| ------------------------- | -------------------------------------------------------------------------------------------------------------------------- |
| `-oN/-oX/-oS/-oG <file>` | Output scan in normal, [XML](../../../files/XML.md), scrIpt kIddi3, and Grepable format, respectively, to the given filename. |
| `-oA <basename>` | Output in the three major formats at once |
| `-v` | Increase verbosity level (use `-vv` or more for greater effect) |
| `--open` | Only show open (or possibly open) ports |
| `--append-output` | Append to rather than clobber specified output files |
| `--resume <filename>` | Resume an aborted scan |
| `--stylesheet <path/URL>` | XSL stylesheet to transform [XML](../../../files/XML.md) output to [HTML](../../../internet/HTML.md) |
| `--webxml` | Reference stylesheet from Nmap.Org for more portable [XML](../../../files/XML.md) |
| `--no-stylesheet` | Prevent associating of XSL stylesheet w/[XML](../../../files/XML.md) output |
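
Putting several of these together (target and output basename are placeholders; raw-socket scans like `-sS` and OS detection require root):
```shell
# SYN scan with service/version detection, OS detection, default scripts,
# and all three major output formats written to scan.*
sudo nmap -sS -sV -O -sC -oA scan 10.0.0.0/24
```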

---
obj: application
arch-wiki: https://wiki.archlinux.org/title/Wget
wiki: https://en.wikipedia.org/wiki/Wget
repo: https://git.savannah.gnu.org/cgit/wget.git
---
# wget
GNU Wget is a free utility for non-interactive download of files from the Web. It supports [HTTP](../../../internet/HTTP.md), HTTPS, and [FTP](../../../internet/FTP.md) protocols, as well as retrieval through [HTTP](../../../internet/HTTP.md) proxies.
Wget is non-interactive, meaning that it can work in the background while the user is not logged on. This allows you to start a retrieval and disconnect from the system, letting Wget finish the work. By contrast, most Web browsers require the user's constant presence, which can be a great hindrance when transferring a lot of data.
Wget can follow links in [HTML](../../../internet/HTML.md), XHTML, and [CSS](../../../internet/CSS.md) pages, to create local versions of remote web sites, fully recreating the directory structure of the original site. This is sometimes referred to as "recursive downloading." While doing that, Wget respects the Robot Exclusion Standard (/robots.txt). Wget can be instructed to convert the links in downloaded files to point at the local files, for offline viewing.
Wget has been designed for robustness over slow or unstable network connections; if a download fails due to a network problem, it will keep retrying until the whole file has been retrieved. If the server supports regetting, it will instruct the server to continue the download from where it left off.
## Options
| Option | Description |
| ----------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `-b, --background` | Go to background immediately after startup. If no output file is specified via the -o, output is redirected to wget-log. |
| `-e, --execute command` | Execute  command  as  if  it were a part of .wgetrc.  A command thus invoked will be executed after the commands in .wgetrc, thus taking precedence over them.  If you need to specify more than one wgetrc command, use multiple instances of -e. |
### Logging Options
| Option | Description |
| ----------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `-o, --output-file=logfile` | Log all messages to logfile. The messages are normally reported to standard error. |
| `-a, --append-output=logfile` | Append to logfile. This is the same as -o, only it appends to logfile instead of overwriting the old log file. |
| `-q, --quiet` | Turn off Wget's output. |
| `-i, --input-file=file` | Read URLs from a local or external file.  If - is specified as file, URLs are read from the standard input.  (Use ./- to read from a file literally named -.). If this function is used, no URLs need be present on the command line.  If there are URLs both on the command line and in an input file, those on the command lines will be the first ones to be retrieved. If --force-html is not specified, then file should consist of a series of URLs, one per line. However, if you specify --force-html, the document will be regarded as [html](../../../internet/HTML.md).  In that case you may have problems with relative links, which you can solve either by adding "\<base href="url">" to the documents or by specifying --base=url on the command line. If the file is an external one, the document will be automatically treated as [html](../../../internet/HTML.md) if the Content-Type matches text/html. Furthermore, the file's location will be implicitly used as base href if none was specified. |
| `-B, --base=URL` | Resolves relative links using URL as the point of reference, when reading links from an [HTML](../../../internet/HTML.md) file specified via the -i/--input-file option (together with --force-html, or when the input file was fetched remotely from a server describing it as [HTML](../../../internet/HTML.md)). This is equivalent to the presence of a "BASE" tag in the [HTML](../../../internet/HTML.md) input file, with URL as the value for the "href" attribute. |
### Download Options
| Option | Description |
| ---------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `-t, --tries=number` | Set number of tries to number. Specify 0 or inf for infinite retrying.  The default is to retry 20 times, with the exception of fatal errors like "connection refused" or "not found" (404), which are not retried. |
| `-O, --output-document=file` | The documents will not be written to the appropriate files, but all will be concatenated together and written to file.  If - is used as file, documents will be printed to standard output, disabling link conversion.  (Use ./- to print to a file literally named -.) |
| `--backups=backups` | Before (over)writing a file, back up an existing file by adding a .1 suffix (\_1 on VMS) to the file name.  Such backup files are rotated to .2, .3, and so on, up to `backups` (and lost beyond that) |
| `-c, --continue` | Continue  getting  a  partially-downloaded file.  This is useful when you want to finish up a download started by a previous instance of Wget, or by another program. |
| `--show-progress` | Force wget to display the progress bar in any verbosity. |
| `-T, --timeout=seconds` | Set the network timeout to `seconds` seconds. |
| `--limit-rate=amount` | Limit the download speed to amount bytes per second.  Amount may be expressed in bytes, kilobytes with the k suffix, or megabytes with the  m  suffix. For example, --limit-rate=20k will limit the retrieval rate to 20KB/s. This is useful when, for whatever reason, you don't want Wget to consume the entire available bandwidth. |
| `-w, --wait=seconds` | Wait the specified number of seconds between the retrievals. Use of this option is recommended, as it lightens the server load by making the requests less frequent. Instead of in seconds, the time can be specified in minutes using the "m" suffix, in hours using "h" suffix, or in days using "d" suffix. |
| `--waitretry=seconds` | If you don't want Wget to wait between every retrieval, but only between retries of failed downloads, you can use this option. Wget  will use linear backoff, waiting 1 second after the first failure on a given file, then waiting 2 seconds after the second failure on  that file, up to the maximum number of seconds you specify. |
| `--random-wait` | Some web sites may perform log analysis to identify retrieval programs such as Wget by looking for statistically significant  similarities in the time between requests. This option causes the time between requests to vary between 0.5 and 1.5 * wait seconds,  where wait was specified using the --wait option, in order to mask Wget's presence from such analysis. |
| `--user=user, --password=password` | Specify the username and password for both [FTP](../../../internet/FTP.md) and [HTTP](../../../internet/HTTP.md) file retrieval. |
| `--ask-password` | Prompt for a password for each connection established. |
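
For instance (the URL is a placeholder), resuming an interrupted download with a bandwidth cap and extra retries:
```shell
# Resume a partial download, cap the rate at 500 KB/s, retry up to 10 times,
# and always show the progress bar
wget -c --limit-rate=500k -t 10 --show-progress "https://example.com/large-file.iso"
```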
### Directory Options
| Option | Description |
| ------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `-nH, --no-host-directories` | Disable generation of host-prefixed directories. By default, invoking Wget with -r http://fly.srk.fer.hr/ will create a structure of  directories beginning with fly.srk.fer.hr/. This option disables such behavior. |
| `--cut-dirs=number` | Ignore number directory components. This is useful for getting a fine-grained control over the directory where recursive retrieval will be saved. |
| `-P, --directory-prefix=prefix` | Set directory prefix to prefix. The directory prefix is the directory where all other files and subdirectories will be saved to, i.e.  the top of the retrieval tree. The default is . (the current directory). |
### HTTP Options
| Option | Description |
| ---------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `--no-cookies` | Disable the use of cookies. |
| `--load-cookies file` | Load cookies from file before the first [HTTP](../../../internet/HTTP.md) retrieval.  file is a textual file in the format originally used by Netscape's  cookies.txt file. |
| `--save-cookies file` | Save cookies to file before exiting. This will not save cookies that have expired or that have no expiry time (so-called "session cookies"), but also see --keep-session-cookies. |
| `--keep-session-cookies` | When specified, causes --save-cookies to also save session cookies. Session cookies are normally not saved because they are meant to be  kept in memory and forgotten when you exit the browser. Saving them is useful on sites that require you to log in or to visit the home  page before you can access some pages. With this option, multiple Wget runs are considered a single browser session as far as the site  is concerned. |
| `--header=header-line` | Send header-line along with the rest of the headers in each [HTTP](../../../internet/HTTP.md) request. The supplied header is sent as-is, which means it must  contain name and value separated by colon, and must not contain newlines. |
| `--proxy-user=user, --proxy-password=password` | Specify the username user and password password for authentication on a proxy server. Wget will encode them using the "basic"  authentication scheme. |
| `--referer=url` | Include 'Referer: url' header in [HTTP](../../../internet/HTTP.md) request. Useful for retrieving documents with server-side processing that assume they are always  being retrieved by interactive web browsers and only come out properly when Referer is set to one of the pages that point to them. |
| `-U, --user-agent=agent-string` | Identify as `agent-string` to the [HTTP](../../../internet/HTTP.md) server. |
### HTTPS Options
| Option | Description |
| -------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| `--no-check-certificate` | Don't check the server certificate against the available certificate authorities. Also don't require the URL host name to match the  common name presented by the certificate. |
| `--ca-certificate=file` | Use file as the file with the bundle of certificate authorities ("CA") to verify the peers. The certificates must be in PEM format. |
| `--ca-directory=directory` | Specifies directory containing CA certificates in PEM format. |
### Recursive Retrieval Options
| Option | Description |
| ----------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| `-r, --recursive` | Turn on recursive retrieving. The default maximum depth is 5. |
| `-l, --level=depth`     | Set the maximum recursion depth; Wget will recurse into at most `depth` levels of subdirectories. |
| `-k, --convert-links` | After the download is complete, convert the links in the document to make them suitable for local viewing. This affects not only the  visible hyperlinks, but any part of the document that links to external content, such as embedded images, links to style sheets,  hyperlinks to non-[HTML](../../../internet/HTML.md) content, etc. |
| `-p, --page-requisites` | This option causes Wget to download all the files that are necessary to properly display a given [HTML](../../../internet/HTML.md) page. This includes such things  as inlined images, sounds, and referenced stylesheets. |