restructure

This commit is contained in:
JMARyA 2024-01-17 09:00:45 +01:00
parent ef7661245b
commit 598a10bc28
Signed by: jmarya
GPG key ID: 901B2ADDF27C2263
182 changed files with 342 additions and 336 deletions

View file

@@ -4,7 +4,7 @@ os: ["macos", "linux", "windows"]
repo: https://github.com/sharkdp/bat
---
# bat
-bat is a cat rewrite in [Rust](../../programming/languages/Rust.md)
+bat is a cat rewrite in [Rust](../../dev/programming/languages/Rust.md)
## Usage
Flags:
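A quick illustration of everyday use (file names are placeholders):

```sh
# print a file with syntax highlighting, line numbers and git markers
bat src/main.rs

# plain mode: no decorations, safe to use in pipes
bat -p src/main.rs | grep TODO
```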

View file

@@ -19,6 +19,6 @@ choose -f" " 0:3 # Choose element 0 to 3 separated by " "
## Options
| Option | Description |
| ------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------- |
-| `-f, --field-separator <field-separator>` | Specify field separator other than whitespace, using [Rust](../../programming/languages/Rust.md) `regex` syntax |
+| `-f, --field-separator <field-separator>` | Specify field separator other than whitespace, using [Rust](../../dev/programming/languages/Rust.md) `regex` syntax |
| `-i, --input <input>` | Input file |
| `-o, --output-field-separator <output-field-separator>` | Specify output field separator |
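A short sketch of how these fit together (input text is made up):

```sh
# print the second whitespace-separated field ("two")
echo 'one two three' | choose 1

# fields 0 through 3 of each passwd entry, ':' as the separator
choose -f ':' -i /etc/passwd 0:3
```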

View file

@@ -4,7 +4,7 @@ os: linux
aliases: ["7zip", "7z"]
---
# 7Zip
-7-Zip is a file archiver with the highest compression ratio. The program supports 7z (that implements LZMA compression algorithm), [ZIP](../../files/ZIP.md), CAB, ARJ, GZIP, BZIP2, [TAR](tar.md), CPIO, RPM and DEB formats. Compression ratio in the new 7z format is 30-50% better than ratio in [ZIP](../../files/ZIP.md) format.
+7-Zip is a file archiver with the highest compression ratio. The program supports 7z (that implements LZMA compression algorithm), [ZIP](../../../files/ZIP.md), CAB, ARJ, GZIP, BZIP2, [TAR](compression/tar.md), CPIO, RPM and DEB formats. Compression ratio in the new 7z format is 30-50% better than ratio in [ZIP](../../../files/ZIP.md) format.
## Usage
Add file/directory to the archive (or create a new one):
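For instance (archive and file names are placeholders):

```sh
# add notes.txt and the docs/ directory to archive.7z, creating it if needed
7z a archive.7z notes.txt docs/
```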

View file

@@ -20,4 +20,4 @@ Usage: `crunch <min-len> <max-len> [<charset string>] [options]`
| `-o wordlist.txt` | Specifies the file to write the output to, eg: wordlist.txt |
| `-s startblock` | Specifies a starting string, eg: 03god22fs |
| `-t @,%^` | Specifies a pattern, eg: @@god@@@@ where only the @'s, ,'s, %'s, and ^'s will change.<br>`@` will insert lower case characters<br>`,` will insert upper case characters<br>`%` will insert numbers<br>`^` will insert symbols |
-| `-z gzip, bzip2, lzma, and 7z` | Compresses the output from the -o option. Valid parameters are gzip, bzip2, lzma, and [7z](p7zip.md). |
+| `-z gzip, bzip2, lzma, and 7z` | Compresses the output from the -o option. Valid parameters are gzip, bzip2, lzma, and [7z](compression/p7zip.md). |
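For example, combining the options above (the output path is a placeholder; the pattern length must match max-len):

```sh
# 9-character candidates matching @@god@@@@, written to a gzip-compressed wordlist
crunch 9 9 -t @@god@@@@ -o wordlist.txt -z gzip
```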

View file

@@ -5,7 +5,7 @@ repo: https://github.com/casey/intermodal
---
# Intermodal
[Repo](https://github.com/casey/intermodal)
-Intermodal is a user-friendly and featureful command-line [BitTorrent](../../tools/BitTorrent.md) metainfo utility. The binary is called `imdl` and runs on [Linux](../../linux/Linux.md), [Windows](../../windows/Windows.md), and [macOS](../../macos/macOS.md).
+Intermodal is a user-friendly and featureful command-line [BitTorrent](../../internet/BitTorrent.md) metainfo utility. The binary is called `imdl` and runs on [Linux](../../linux/Linux.md), [Windows](../../windows/Windows.md), and [macOS](../../macos/macOS.md).
## Usage
### Create torrent file:
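A sketch, assuming the `imdl torrent` subcommands behave as in the project README:

```sh
# create foo.torrent from the directory foo
imdl torrent create foo

# inspect the resulting metainfo
imdl torrent show foo.torrent
```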

View file

@@ -3,4 +3,4 @@ obj: application
os: linux
---
# Jless
-[`jless`](https://jless.io/) is a command-line [JSON](../../files/JSON.md) viewer. Use it as a replacement for whatever combination of `less`, `jq`, `cat` and your editor you currently use for viewing [JSON](../../files/JSON.md) files. It is written in [Rust](../../programming/languages/Rust.md) and can be installed as a single standalone binary.
+[`jless`](https://jless.io/) is a command-line [JSON](../../files/JSON.md) viewer. Use it as a replacement for whatever combination of `less`, `jq`, `cat` and your editor you currently use for viewing [JSON](../../files/JSON.md) files. It is written in [Rust](../../dev/programming/languages/Rust.md) and can be installed as a single standalone binary.
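Typical invocations (file name and URL are placeholders):

```sh
jless data.json                                  # open a file
curl -s https://api.example.com/items | jless    # or read from stdin
```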

View file

@@ -5,4 +5,4 @@ repo: https://github.com/kamiyaa/joshuto
---
# Joshuto
[Repo](https://github.com/kamiyaa/joshuto)
-Joshuto is a [ranger](https://github.com/ranger/ranger)-like terminal file manager written in [Rust](../../programming/languages/Rust.md).
+Joshuto is a [ranger](https://github.com/ranger/ranger)-like terminal file manager written in [Rust](../../dev/programming/languages/Rust.md).

View file

@@ -144,7 +144,7 @@ Joins the array of elements given as input, using the argument as separator. It
`if A then B end` is the same as `if A then B else . end`. That is, the `else` branch is optional, and if absent is the same as `.`. This also applies to `elif` with absent ending `else` branch.
-Checking for false or null is a simpler notion of "truthiness" than is found in JavaScript or [Python](../../programming/languages/Python.md), but it means that you'll sometimes have to be more explicit about the condition you want. You can't test whether, e.g. a string is empty using `if .name then A else B end`; you'll need something like `if .name == "" then A else B end` instead.
+Checking for false or null is a simpler notion of "truthiness" than is found in JavaScript or [Python](../../dev/programming/languages/Python.md), but it means that you'll sometimes have to be more explicit about the condition you want. You can't test whether, e.g. a string is empty using `if .name then A else B end`; you'll need something like `if .name == "" then A else B end` instead.
If the condition `A` produces multiple results, then `B` is evaluated once for each result that is not false or null, and `C` is evaluated once for each false or null.
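For instance, the emptiness check from above as a runnable one-liner:

```sh
# "" is truthy in jq, so the comparison has to be explicit
echo '{"name":""}' | jq 'if .name == "" then "unnamed" else .name end'
# => "unnamed"
```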

View file

@@ -531,8 +531,8 @@ These functions can fail, for example if a path does not have an extension, whic
- `error(message)` - Abort execution and report error `message` to user.
#### UUID and Hash Generation
-- `sha256(string)` - Return the [SHA](../../Cryptography/SHA.md)-256 hash of `string` as a hexadecimal string.
-- `sha256_file(path)` - Return the [SHA](../../Cryptography/SHA.md)-256 hash of the file at `path` as a hexadecimal string.
+- `sha256(string)` - Return the [SHA](../../cryptography/SHA.md)-256 hash of `string` as a hexadecimal string.
+- `sha256_file(path)` - Return the [SHA](../../cryptography/SHA.md)-256 hash of the file at `path` as a hexadecimal string.
- `uuid()` - Return a randomly generated UUID.
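A minimal sketch of these helpers in use (recipe and file names are made up):

```sh
# write a throwaway justfile and run its recipe
cat > justfile <<'EOF'
hashes:
    @echo {{ sha256('hello') }}
    @echo {{ uuid() }}
EOF
just hashes
```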
### Recipe Attributes

View file

@@ -7,7 +7,7 @@ website: https://micro-editor.github.io/
# micro
**micro** is a terminal-based text editor that aims to be easy to use and intuitive, while also taking advantage of the capabilities of modern terminals. It comes as a single, batteries-included, static binary with no dependencies; you can download and use it right now!
-As its name indicates, micro aims to be somewhat of a successor to the nano editor by being easy to install and use. It strives to be enjoyable as a full-time editor for people who prefer to work in a terminal, or those who regularly edit files over [SSH](../SSH.md).
+As its name indicates, micro aims to be somewhat of a successor to the nano editor by being easy to install and use. It strives to be enjoyable as a full-time editor for people who prefer to work in a terminal, or those who regularly edit files over [SSH](../network/SSH.md).
![Screenshot][Screenshot]
@@ -392,7 +392,7 @@ Here are the available options:
reading with `clipboard_control` setting), iTerm2 (only copying),
st, rxvt-unicode and xterm if enabled (see `> help copypaste` for
details). Note that Gnome-terminal does not support this feature. With
-this setting, copy-paste **will** work over [ssh](../SSH.md). See `> help copypaste`
+this setting, copy-paste **will** work over [ssh](../network/SSH.md). See `> help copypaste`
for details.
* `internal`: micro will use an internal clipboard.
@@ -548,7 +548,7 @@ Here are the available options:
* `mouse`: mouse support. When mouse support is disabled,
usually the terminal will be able to access mouse events which can be useful
-if you want to copy from the terminal instead of from micro (if over [ssh](../SSH.md) for
+if you want to copy from the terminal instead of from micro (if over [ssh](../network/SSH.md) for
example, because the terminal has access to the local clipboard and micro
does not).
@@ -700,7 +700,7 @@ Here are the available options:
default value: `true`
* `sucmd`: specifies the super user command. On most systems this is "sudo" but
-on BSD it can be "[doas](doas.md)." This option can be customized and is only used when
+on BSD it can be "[doas](system/doas.md)." This option can be customized and is only used when
saving with su.
default value: `sudo`
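Options like these can also be overridden for a single session from the command line (a sketch, assuming micro's `-option value` flag syntax):

```sh
# use the terminal clipboard and doas instead of sudo for this one session
micro -clipboard terminal -sucmd doas /etc/hosts
```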

View file

@@ -0,0 +1,6 @@
+---
+obj: application
+---
+# nano
+#wip

View file

@@ -6,7 +6,7 @@ website: https://aria2.github.io/
repo: https://github.com/aria2/aria2
---
# aria2
-[aria2](https://aria2.github.io/) is a utility for downloading files. The supported protocols are [HTTP](../../internet/HTTP.md)(S), [FTP](../../internet/FTP.md), SFTP, [BitTorrent](../../tools/BitTorrent.md), and Metalink. aria2 can download a file from multiple sources/protocols and tries to utilize your maximum download bandwidth. It supports downloading a file from [HTTP](../../internet/HTTP.md)(S)/[FTP](../../internet/FTP.md)/SFTP and [BitTorrent](../../tools/BitTorrent.md) at the same time, while the data downloaded from [HTTP](../../internet/HTTP.md)(S)/[FTP](../../internet/FTP.md)/SFTP is uploaded to the [BitTorrent](../../tools/BitTorrent.md) swarm. Using Metalink's chunk checksums, aria2 automatically validates chunks of data while downloading a file like [BitTorrent](../../tools/BitTorrent.md). Aria2 can be used as a downloader by [yt-dlp](../media/yt-dlp.md).
+[aria2](https://aria2.github.io/) is a utility for downloading files. The supported protocols are [HTTP](../../../internet/HTTP.md)(S), [FTP](../../../internet/FTP.md), SFTP, [BitTorrent](../../../internet/BitTorrent.md), and Metalink. aria2 can download a file from multiple sources/protocols and tries to utilize your maximum download bandwidth. It supports downloading a file from [HTTP](../../../internet/HTTP.md)(S)/[FTP](../../../internet/FTP.md)/SFTP and [BitTorrent](../../../internet/BitTorrent.md) at the same time, while the data downloaded from [HTTP](../../../internet/HTTP.md)(S)/[FTP](../../../internet/FTP.md)/SFTP is uploaded to the [BitTorrent](../../../internet/BitTorrent.md) swarm. Using Metalink's chunk checksums, aria2 automatically validates chunks of data while downloading a file like [BitTorrent](../../../internet/BitTorrent.md). Aria2 can be used as a downloader by [yt-dlp](../../media/yt-dlp.md).
## Usage
@@ -21,19 +21,19 @@ aria2c [<OPTIONS>] [<URI>|<MAGNET>|<TORRENT_FILE>|<METALINK_FILE>]
| `-i, --input-file=<FILE>` | Downloads the URIs listed in **FILE**. |
| `-l, --log=<LOG>` | The file name of the log file. If - is specified, log is written to stdout. If empty string("") is specified, or this option is omitted, no log is written to disk at all. |
| `-j, --max-concurrent-downloads=<N>` | Set the maximum number of parallel downloads for every queue item. |
-| `-V, --check-integrity [true/false]` | Check file integrity by validating piece hashes or a hash of entire file. This option has effect only in [BitTorrent](../../tools/BitTorrent.md), Metalink downloads with checksums or [HTTP](../../internet/HTTP.md)(S)/[FTP](../../internet/FTP.md) downloads with --checksum option. If piece hashes are provided, this option can detect damaged portions of a file and re-download them. If a hash of entire file is provided, hash check is only done when the file has already been downloaded. This is determined by file length. If hash check fails, file is re-downloaded from scratch. If both piece hashes and a hash of entire file are provided, only piece hashes are used. Default: false |
-| `-c, --continue [true/false]` | Continue downloading a partially downloaded file. Use this option to resume a download started by a web browser or another program which downloads files sequentially from the beginning. Currently this option is only applicable to [HTTP](../../internet/HTTP.md)(S)/[FTP](../../internet/FTP.md) downloads. |
-| `--checksum=<TYPE>=<DIGEST>` | Set checksum. TYPE is hash type. The supported hash type is listed in Hash Algorithms in aria2c -v. DIGEST is hex digest. For example, setting sha-1 digest looks like this: `sha-1=0192ba11326fe2298c8cb4de616f4d4140213838` This option applies only to [HTTP](../../internet/HTTP.md)(S)/[FTP](../../internet/FTP.md) downloads. |
+| `-V, --check-integrity [true/false]` | Check file integrity by validating piece hashes or a hash of entire file. This option has effect only in [BitTorrent](../../../internet/BitTorrent.md), Metalink downloads with checksums or [HTTP](../../../internet/HTTP.md)(S)/[FTP](../../../internet/FTP.md) downloads with --checksum option. If piece hashes are provided, this option can detect damaged portions of a file and re-download them. If a hash of entire file is provided, hash check is only done when the file has already been downloaded. This is determined by file length. If hash check fails, file is re-downloaded from scratch. If both piece hashes and a hash of entire file are provided, only piece hashes are used. Default: false |
+| `-c, --continue [true/false]` | Continue downloading a partially downloaded file. Use this option to resume a download started by a web browser or another program which downloads files sequentially from the beginning. Currently this option is only applicable to [HTTP](../../../internet/HTTP.md)(S)/[FTP](../../../internet/FTP.md) downloads. |
+| `--checksum=<TYPE>=<DIGEST>` | Set checksum. TYPE is hash type. The supported hash type is listed in Hash Algorithms in aria2c -v. DIGEST is hex digest. For example, setting sha-1 digest looks like this: `sha-1=0192ba11326fe2298c8cb4de616f4d4140213838` This option applies only to [HTTP](../../../internet/HTTP.md)(S)/[FTP](../../../internet/FTP.md) downloads. |
| `-x, --max-connection-per-server=<NUM>` | The maximum number of connections to one server for each download.  Default: **1** |
| `-k, --min-split-size=<SIZE>` | aria2 does not split ranges smaller than 2\*SIZE bytes. For example, consider downloading a 20MiB file. If SIZE is 10M, aria2 can split the file into 2 ranges (0-10MiB) and (10MiB-20MiB) and download it using 2 sources (if --split >= 2, of course). If SIZE is 15M, since 2\*15M > 20MiB, aria2 does not split the file and downloads it using 1 source. You can append K or M (1K = 1024, 1M = 1024K). Possible Values: 1M-1024M Default: 20M |
| `-o, --out=<FILE>` | The file name of the downloaded file. It is always relative to the directory given in `--dir` option. |
| `-s, --split=<N>` | Download a file using N connections. If more than N URIs are given, first N URIs are used and remaining URIs are used for backup. If less than N URIs are given, those URIs are used more than once so that N connections total are made simultaneously. The number of connections to the same host is restricted by the `--max-connection-per-server` option. See also the `--min-split-size` option. Default: 5 |
| `-t, --timeout=<SEC>` | Set timeout in seconds. Default: 60 |
| `--check-certificate [true/false]` | Verify the peer using certificates specified in `--ca-certificate` option. Default: true |
-| `--http-user=<USER>, --http-passwd=<PASSWD>` | Credentials for [HTTP](../../internet/HTTP.md) Auth |
-| `--header=<HEADER>` | Append HEADER to [HTTP](../../internet/HTTP.md) request header. You can use this option repeatedly to specify more than one header:<br>`aria2c --header="X-A: b78" --header="X-B: 9J1" "http://host/file"` |
-| `--load-cookies=<FILE>` | Load Cookies from FILE using the Firefox3 format (SQLite3), Chromium/Google Chrome (SQLite3) and the Mozilla/[Firefox](../network/browsers/Firefox.md)(1.x/2.x)/Netscape format. |
-| `--save-cookies=<FILE>` | Save Cookies to FILE in Mozilla/[Firefox](../network/browsers/Firefox.md)(1.x/2.x)/ Netscape format. If FILE already exists, it is overwritten. Session Cookies are also saved and their expiry values are treated as 0. |
-| `-U, --user-agent=<USER_AGENT>` | Set user agent for [HTTP](../../internet/HTTP.md)(S) downloads. Default: `aria2/$VERSION`, $VERSION is replaced by package version. |
+| `--http-user=<USER>, --http-passwd=<PASSWD>` | Credentials for [HTTP](../../../internet/HTTP.md) Auth |
+| `--header=<HEADER>` | Append HEADER to [HTTP](../../../internet/HTTP.md) request header. You can use this option repeatedly to specify more than one header:<br>`aria2c --header="X-A: b78" --header="X-B: 9J1" "http://host/file"` |
+| `--load-cookies=<FILE>` | Load Cookies from FILE using the Firefox3 format (SQLite3), Chromium/Google Chrome (SQLite3) and the Mozilla/[Firefox](../../network/browsers/Firefox.md)(1.x/2.x)/Netscape format. |
+| `--save-cookies=<FILE>` | Save Cookies to FILE in Mozilla/[Firefox](../../network/browsers/Firefox.md)(1.x/2.x)/ Netscape format. If FILE already exists, it is overwritten. Session Cookies are also saved and their expiry values are treated as 0. |
+| `-U, --user-agent=<USER_AGENT>` | Set user agent for [HTTP](../../../internet/HTTP.md)(S) downloads. Default: `aria2/$VERSION`, $VERSION is replaced by package version. |
| `-S, --show-files [true/false]` | Print file listing of ".torrent", ".meta4" and ".metalink" file and exit. In case of ".torrent" file, additional information (infohash, piece length, etc) is also printed. |
| `--select-file=<INDEX>...` | (Torrent) Set file to download by specifying its index. You can find the file index using the `--show-files` option. Multiple indexes can be specified by using `,`, for example: `3,6`. You can also use `-` to specify a range: `1-5`. `,` and `-` can be used together: `1-5,8,9`. |
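Combining a few of the options above (URLs and file names are placeholders):

```sh
# resume a download with 4 connections to the server and 4 parallel splits
aria2c -c -x 4 -s 4 -o image.iso https://example.com/image.iso

# process a list of URLs, two downloads at a time
aria2c -i urls.txt -j 2
```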

View file

@@ -5,7 +5,7 @@ website: https://curl.se/
repo: https://github.com/curl/curl
---
# curl
-cURL is a command-line tool and library for transferring data with URLs. It supports a wide range of protocols, making it a versatile tool for making [HTTP](../../internet/HTTP.md) requests, downloading files, and more.
+cURL is a command-line tool and library for transferring data with URLs. It supports a wide range of protocols, making it a versatile tool for making [HTTP](../../../internet/HTTP.md) requests, downloading files, and more.
## Usage
To make a simple GET request: `curl https://example.com`
@@ -15,8 +15,8 @@ To make a simple GET request: `curl https://example.com`
| -------------------------------------- | ------------------------------------------------------------------------------------- |
| `-C, --continue-at <offset>` | Continue/Resume a previous file transfer at the given offset. |
| `-c, --cookie-jar <filename>` | Specify to which file you want curl to write all cookies after a completed operation. |
-| `-b, --cookie <data/filename>` | Pass the data to the [HTTP](../../internet/HTTP.md) server in the [Cookie](../../internet/Cookie.md) header. |
-| `-d, --data <data>` | Sends the specified data in a POST request to the [HTTP](../../internet/HTTP.md) server |
+| `-b, --cookie <data/filename>` | Pass the data to the [HTTP](../../../internet/HTTP.md) server in the [Cookie](../../../internet/Cookie.md) header. |
+| `-d, --data <data>` | Sends the specified data in a POST request to the [HTTP](../../../internet/HTTP.md) server |
| `-F, --form <name=content>` | Specify multipart MIME data |
| `-k, --insecure` | Allow insecure server connections when using SSL |
| `-L, --location` | Follow redirects |
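A couple of combined examples (URLs and file names are placeholders):

```sh
# follow redirects and resume a partial download into out.bin
curl -L -C - -o out.bin https://example.com/file.bin

# POST form data, storing and reusing cookies across calls
curl -c jar.txt -b jar.txt -d 'user=alice' https://example.com/login
```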

View file

@@ -4,13 +4,13 @@ wiki: https://en.wikipedia.org/wiki/Netcat
---
# netcat
-The `nc` (or `netcat`) utility is used for just about anything under the sun involving [TCP](../../internet/TCP.md), [UDP](../../internet/UDP.md), or UNIX-domain sockets. It can open [TCP](../../internet/TCP.md) connections, send [UDP](../../internet/UDP.md) packets, listen on arbitrary [TCP](../../internet/TCP.md) and [UDP](../../internet/UDP.md) ports, do port scanning, and deal with both IPv4 and IPv6.
+The `nc` (or `netcat`) utility is used for just about anything under the sun involving [TCP](../../../internet/TCP.md), [UDP](../../../internet/UDP.md), or UNIX-domain sockets. It can open [TCP](../../../internet/TCP.md) connections, send [UDP](../../../internet/UDP.md) packets, listen on arbitrary [TCP](../../../internet/TCP.md) and [UDP](../../../internet/UDP.md) ports, do port scanning, and deal with both IPv4 and IPv6.
Common uses include:
-- simple [TCP](../../internet/TCP.md) proxies
-- shell-script based [HTTP](../../internet/HTTP.md) clients and servers
+- simple [TCP](../../../internet/TCP.md) proxies
+- shell-script based [HTTP](../../../internet/HTTP.md) clients and servers
- network daemon testing
-- a SOCKS or [HTTP](../../internet/HTTP.md) ProxyCommand for [ssh](../SSH.md)
+- a SOCKS or [HTTP](../../../internet/HTTP.md) ProxyCommand for [ssh](../../network/SSH.md)
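Two classic examples (hosts and ports are placeholders; flag details vary slightly between netcat flavors):

```sh
# listen on TCP port 8080 and print whatever arrives
nc -l 8080

# act as a bare-bones HTTP client
printf 'GET / HTTP/1.0\r\n\r\n' | nc example.com 80
```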
## Options
| Option | Description |

View file

@@ -5,7 +5,7 @@ repo: https://github.com/netdiscover-scanner/netdiscover
---
# netdiscover
-Netdiscover is an active/passive address reconnaissance tool, mainly developed for wireless networks without a [dhcp](../../internet/DHCP.md) server, e.g. when you are wardriving. It can also be used on hub/switched networks.
+Netdiscover is an active/passive address reconnaissance tool, mainly developed for wireless networks without a [dhcp](../../../internet/DHCP.md) server, e.g. when you are wardriving. It can also be used on hub/switched networks.
Built on top of libnet and libpcap, it can passively detect online hosts, or search for them, by actively sending ARP requests.
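For example (interface and range are placeholders):

```sh
# actively ARP-scan a range
netdiscover -r 192.168.1.0/24

# passive mode: only sniff on an interface, send nothing
netdiscover -p -i wlan0
```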

View file

@@ -26,19 +26,19 @@ Ex: scanme.nmap.org, 192.168.0.1; 10.0.0-255.1-254
| ----------------------------------- | --------------------------------------------------------------------------------------------------- |
| `-sL` | List Scan - simply list targets to scan |
| `-sn` | Ping Scan - disable port scan |
-| `-PS/PA/PU/PY[portlist]` | [TCP](../../internet/TCP.md) SYN/ACK, [UDP](../../internet/UDP.md) or SCTP discovery to given ports |
+| `-PS/PA/PU/PY[portlist]` | [TCP](../../../internet/TCP.md) SYN/ACK, [UDP](../../../internet/UDP.md) or SCTP discovery to given ports |
| `-PE/PP/PM` | ICMP echo, timestamp, and netmask request discovery probes |
-| `-n/-R` | Never do [DNS](../../internet/DNS.md) resolution/Always resolve \[default: sometimes] |
-| `--dns-servers <serv1[,serv2],...>` | Specify custom [DNS](../../internet/DNS.md) servers |
+| `-n/-R` | Never do [DNS](../../../internet/DNS.md) resolution/Always resolve \[default: sometimes] |
+| `--dns-servers <serv1[,serv2],...>` | Specify custom [DNS](../../../internet/DNS.md) servers |
| `--traceroute` | Trace hop path to each host |
#### SCAN TECHNIQUES
| Option | Description |
| --------------------- | ------------------------------------------------------------------ |
-| `-sS/sT/sA/sW/sM` | [TCP](../../internet/TCP.md) SYN/Connect()/ACK/Window/Maimon scans |
-| `-sU` | [UDP](../../internet/UDP.md) Scan |
-| `-sN/sF/sX` | [TCP](../../internet/TCP.md) Null, FIN, and Xmas scans |
-| `--scanflags <flags>` | Customize [TCP](../../internet/TCP.md) scan flags |
+| `-sS/sT/sA/sW/sM` | [TCP](../../../internet/TCP.md) SYN/Connect()/ACK/Window/Maimon scans |
+| `-sU` | [UDP](../../../internet/UDP.md) Scan |
+| `-sN/sF/sX` | [TCP](../../../internet/TCP.md) Null, FIN, and Xmas scans |
+| `--scanflags <flags>` | Customize [TCP](../../../internet/TCP.md) scan flags |
| `-sO` | IP protocol scan |
#### PORT SPECIFICATION AND SCAN ORDER
@@ -95,27 +95,27 @@ Ex: scanme.nmap.org, 192.168.0.1; 10.0.0-255.1-254
| `-S <IP_Address>` | Spoof source address |
| `-e <iface>` | Use specified interface |
| `-g/--source-port <portnum>` | Use given port number |
-| `--proxies <url1,[url2],...>` | Relay connections through [HTTP](../../internet/HTTP.md)/SOCKS4 proxies |
+| `--proxies <url1,[url2],...>` | Relay connections through [HTTP](../../../internet/HTTP.md)/SOCKS4 proxies |
| `--data <hex string>` | Append a custom payload to sent packets |
-| `--data-string <string>` | Append a custom [ASCII](../../files/ASCII.md) string to sent packets |
+| `--data-string <string>` | Append a custom [ASCII](../../../files/ASCII.md) string to sent packets |
| `--data-length <num>` | Append random data to sent packets |
| `--ip-options <options>` | Send packets with specified ip options |
| `--ttl <val>` | Set IP time-to-live field |
| `--spoof-mac <mac address/prefix/vendor name>` | Spoof your MAC address |
-| `--badsum` | Send packets with a bogus [TCP](../../internet/TCP.md)/[UDP](../../internet/UDP.md)/SCTP checksum |
+| `--badsum` | Send packets with a bogus [TCP](../../../internet/TCP.md)/[UDP](../../../internet/UDP.md)/SCTP checksum |
#### OUTPUT
| Option | Description |
| ------------------------- | -------------------------------------------------------------------------------------------------------------------------- |
-| `-oN/-oX/-oS/-oG <file>` | Output scan in normal, [XML](../../files/XML.md), scrIpt kIddi3, and Grepable format, respectively, to the given filename. |
+| `-oN/-oX/-oS/-oG <file>` | Output scan in normal, [XML](../../../files/XML.md), scrIpt kIddi3, and Grepable format, respectively, to the given filename. |
| `-oA <basename>` | Output in the three major formats at once |
| `-v` | Increase verbosity level (use `-vv` or more for greater effect) |
| `--open` | Only show open (or possibly open) ports |
| `--append-output` | Append to rather than clobber specified output files |
| `--resume <filename>` | Resume an aborted scan |
-| `--stylesheet <path/URL>` | XSL stylesheet to transform [XML](../../files/XML.md) output to [HTML](../../internet/HTML.md) |
-| `--webxml` | Reference stylesheet from Nmap.Org for more portable [XML](../../files/XML.md) |
-| `--no-stylesheet` | Prevent associating of XSL stylesheet w/[XML](../../files/XML.md) output |
+| `--stylesheet <path/URL>` | XSL stylesheet to transform [XML](../../../files/XML.md) output to [HTML](../../../internet/HTML.md) |
+| `--webxml` | Reference stylesheet from Nmap.Org for more portable [XML](../../../files/XML.md) |
+| `--no-stylesheet` | Prevent associating of XSL stylesheet w/[XML](../../../files/XML.md) output |
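Putting several of these together (targets and file names are placeholders):

```sh
# ping-scan a subnet to find live hosts
nmap -sn 192.168.0.0/24

# SYN-scan the first 1024 ports, saving normal and XML output
nmap -sS -p 1-1024 -oN scan.txt -oX scan.xml 192.168.0.1
```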

View file

@@ -5,11 +5,11 @@ wiki: https://en.wikipedia.org/wiki/Wget
repo: https://git.savannah.gnu.org/cgit/wget.git
---
# wget
-GNU Wget is a free utility for non-interactive download of files from the Web. It supports [HTTP](../../internet/HTTP.md), HTTPS, and [FTP](../../internet/FTP.md) protocols, as well as retrieval through [HTTP](../../internet/HTTP.md) proxies.
+GNU Wget is a free utility for non-interactive download of files from the Web. It supports [HTTP](../../../internet/HTTP.md), HTTPS, and [FTP](../../../internet/FTP.md) protocols, as well as retrieval through [HTTP](../../../internet/HTTP.md) proxies.
Wget is non-interactive, meaning that it can work in the background, while the user is not logged on. This allows you to start a retrieval and disconnect from the system, letting Wget finish the work. By contrast, most Web browsers require the user's constant presence, which can be a great hindrance when transferring a lot of data.
-Wget can follow links in [HTML](../../internet/HTML.md), XHTML, and [CSS](../../internet/CSS.md) pages, to create local versions of remote web sites, fully recreating the directory structure of the original site. This is sometimes referred to as "recursive downloading." While doing that, Wget respects the Robot Exclusion Standard (/robots.txt). Wget can be instructed to convert the links in downloaded files to point at the local files, for offline viewing.
+Wget can follow links in [HTML](../../../internet/HTML.md), XHTML, and [CSS](../../../internet/CSS.md) pages, to create local versions of remote web sites, fully recreating the directory structure of the original site. This is sometimes referred to as "recursive downloading." While doing that, Wget respects the Robot Exclusion Standard (/robots.txt). Wget can be instructed to convert the links in downloaded files to point at the local files, for offline viewing.
Wget has been designed for robustness over slow or unstable network connections; if a download fails due to a network problem, it will keep retrying until the whole file has been retrieved. If the server supports regetting, it will instruct the server to continue the download from where it left off.
@@ -25,8 +25,8 @@ Wget has been designed for robustness over slow or unstable network connections;
| `-o, --output-file=logfile` | Log all messages to logfile. The messages are normally reported to standard error. |
| `-a, --append-output=logfile` | Append to logfile. This is the same as -o, only it appends to logfile instead of overwriting the old log file. |
| `-q, --quiet` | Turn off Wget's output. |
-| `-i, --input-file=file` | Read URLs from a local or external file.  If - is specified as file, URLs are read from the standard input.  (Use ./- to read from a file literally named -.). If this function is used, no URLs need be present on the command line.  If there are URLs both on the command line and in an input file, those on the command lines will be the first ones to be retrieved. If --force-html is not specified, then file should consist of a series of URLs, one per line. However, if you specify --force-html, the document will be regarded as [html](../../internet/HTML.md).  In that case you may have problems with relative links, which you can solve either by adding "\<base href="url">" to the documents or by specifying --base=url on the command line. If the file is an external one, the document will be automatically treated as [html](../../internet/HTML.md) if the Content-Type matches text/html. Furthermore, the file's location will be implicitly used as base href if none was specified. |
-| `-B, --base=URL` | Resolves relative links using URL as the point of reference, when reading links from an [HTML](../../internet/HTML.md) file specified via the -i/--input-file option (together with --force-html, or when the input file was fetched remotely from a server describing it as [HTML](../../internet/HTML.md)). This is equivalent to the presence of a "BASE" tag in the [HTML](../../internet/HTML.md) input file, with URL as the value for the "href" attribute. |
+| `-i, --input-file=file` | Read URLs from a local or external file.  If - is specified as file, URLs are read from the standard input.  (Use ./- to read from a file literally named -.). If this function is used, no URLs need be present on the command line.  If there are URLs both on the command line and in an input file, those on the command lines will be the first ones to be retrieved. If --force-html is not specified, then file should consist of a series of URLs, one per line. However, if you specify --force-html, the document will be regarded as [html](../../../internet/HTML.md).  In that case you may have problems with relative links, which you can solve either by adding "\<base href="url">" to the documents or by specifying --base=url on the command line. If the file is an external one, the document will be automatically treated as [html](../../../internet/HTML.md) if the Content-Type matches text/html. Furthermore, the file's location will be implicitly used as base href if none was specified. |
+| `-B, --base=URL` | Resolves relative links using URL as the point of reference, when reading links from an [HTML](../../../internet/HTML.md) file specified via the -i/--input-file option (together with --force-html, or when the input file was fetched remotely from a server describing it as [HTML](../../../internet/HTML.md)). This is equivalent to the presence of a "BASE" tag in the [HTML](../../../internet/HTML.md) input file, with URL as the value for the "href" attribute. |
### Download Options
| Option | Description |
@@ -41,7 +41,7 @@ Wget has been designed for robustness over slow or unstable network connections;
| `-w, --wait=seconds` | Wait the specified number of seconds between the retrievals. Use of this option is recommended, as it lightens the server load by making the requests less frequent. Instead of in seconds, the time can be specified in minutes using the "m" suffix, in hours using "h" suffix, or in days using "d" suffix. |
| `--waitretry=seconds` | If you don't want Wget to wait between every retrieval, but only between retries of failed downloads, you can use this option. Wget  will use linear backoff, waiting 1 second after the first failure on a given file, then waiting 2 seconds after the second failure on  that file, up to the maximum number of seconds you specify. |
| `--random-wait` | Some web sites may perform log analysis to identify retrieval programs such as Wget by looking for statistically significant  similarities in the time between requests. This option causes the time between requests to vary between 0.5 and 1.5 * wait seconds,  where wait was specified using the --wait option, in order to mask Wget's presence from such analysis. |
-| `--user=user, --password=password` | Specify the username and password for both [FTP](../../internet/FTP.md) and [HTTP](../../internet/HTTP.md) file retrieval. |
+| `--user=user, --password=password` | Specify the username and password for both [FTP](../../../internet/FTP.md) and [HTTP](../../../internet/HTTP.md) file retrieval. |
| `--ask-password` | Prompt for a password for each connection established. |
### Directory Options
@@ -55,13 +55,13 @@ Wget has been designed for robustness over slow or unstable network connections;
| Option | Description |
| ---------------------------------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `--no-cookies` | Disable the use of cookies. |
-| `--load-cookies file` | Load cookies from file before the first [HTTP](../../internet/HTTP.md) retrieval.  file is a textual file in the format originally used by Netscape's  cookies.txt file. |
+| `--load-cookies file` | Load cookies from file before the first [HTTP](../../../internet/HTTP.md) retrieval.  file is a textual file in the format originally used by Netscape's  cookies.txt file. |
| `--save-cookies file` | Save cookies to file before exiting. This will not save cookies that have expired or that have no expiry time (so-called "session cookies"), but also see --keep-session-cookies. |
| `--keep-session-cookies` | When specified, causes --save-cookies to also save session cookies. Session cookies are normally not saved because they are meant to be  kept in memory and forgotten when you exit the browser. Saving them is useful on sites that require you to log in or to visit the home  page before you can access some pages. With this option, multiple Wget runs are considered a single browser session as far as the site  is concerned. |
-| `--header=header-line` | Send header-line along with the rest of the headers in each [HTTP](../../internet/HTTP.md) request. The supplied header is sent as-is, which means it must  contain name and value separated by colon, and must not contain newlines. |
+| `--header=header-line` | Send header-line along with the rest of the headers in each [HTTP](../../../internet/HTTP.md) request. The supplied header is sent as-is, which means it must  contain name and value separated by colon, and must not contain newlines. |
| `--proxy-user=user, --proxy-password=password` | Specify the username user and password password for authentication on a proxy server. Wget will encode them using the "basic"  authentication scheme. |
-| `--referer=url` | Include 'Referer: url' header in [HTTP](../../internet/HTTP.md) request. Useful for retrieving documents with server-side processing that assume they are always  being retrieved by interactive web browsers and only come out properly when Referer is set to one of the pages that point to them. |
-| `-U, --user-agent=agent-string` | Identify as `agent-string` to the [HTTP](../../internet/HTTP.md) server. |
+| `--referer=url` | Include 'Referer: url' header in [HTTP](../../../internet/HTTP.md) request. Useful for retrieving documents with server-side processing that assume they are always  being retrieved by interactive web browsers and only come out properly when Referer is set to one of the pages that point to them. |
+| `-U, --user-agent=agent-string` | Identify as `agent-string` to the [HTTP](../../../internet/HTTP.md) server. |
### HTTPS Options
| Option | Description |
@@ -75,5 +75,5 @@ Wget has been designed for robustness over slow or unstable network connections;
| ----------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| `-r, --recursive` | Turn on recursive retrieving. The default maximum depth is 5. |
| `-l, --level=depth` | Set the maximum number of subdirectories that Wget will recurse into to depth. |
-| `-k, --convert-links` | After the download is complete, convert the links in the document to make them suitable for local viewing. This affects not only the  visible hyperlinks, but any part of the document that links to external content, such as embedded images, links to style sheets,  hyperlinks to non-[HTML](../../internet/HTML.md) content, etc. |
-| `-p, --page-requisites` | This option causes Wget to download all the files that are necessary to properly display a given [HTML](../../internet/HTML.md) page. This includes such things  as inlined images, sounds, and referenced stylesheets. |
+| `-k, --convert-links` | After the download is complete, convert the links in the document to make them suitable for local viewing. This affects not only the  visible hyperlinks, but any part of the document that links to external content, such as embedded images, links to style sheets,  hyperlinks to non-[HTML](../../../internet/HTML.md) content, etc. |
+| `-p, --page-requisites` | This option causes Wget to download all the files that are necessary to properly display a given [HTML](../../../internet/HTML.md) page. This includes such things  as inlined images, sounds, and referenced stylesheets. |
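A couple of combined invocations (URLs are placeholders):

```sh
# mirror two levels deep, rewrite links and fetch page assets for offline viewing
wget -r -l 2 -k -p https://example.com/docs/

# fetch every URL listed in a file, appending messages to a log
wget -i urls.txt -a wget.log
```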

View file

@@ -7,7 +7,7 @@ Pueue is a command-line task management tool for sequential and parallel executi
Simply put, it's a tool that **p**rocesses a q**ueue** of [shell](Shell.md) commands. On top of that, there are a lot of convenient features and abstractions.
-Since Pueue is not bound to any terminal, you can control your tasks from any terminal on the same machine. The queue will be continuously processed, even if you no longer have any active [ssh](../SSH.md) sessions.
+Since Pueue is not bound to any terminal, you can control your tasks from any terminal on the same machine. The queue will be continuously processed, even if you no longer have any active [ssh](../network/SSH.md) sessions.
## Start the Daemon
Before you can use the `pueue` client, you have to start the daemon.
@@ -18,7 +18,7 @@ Before you can use the `pueue` client, you have to start the daemon.
The daemon can always be shut down using the client command `pueue shutdown`.
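A minimal round trip (the queued command is a placeholder):

```sh
pueued -d                              # start the daemon in the background
pueue add -- rsync -a /data /backup    # enqueue a long-running task
pueue status                           # watch the queue
pueue shutdown                         # stop the daemon again
```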
### Systemd
-[Systemd](../../linux/Systemd.md) user services allow every user to start/enable their own session on [Linux](../../linux/Linux.md) operating system distributions.
+[Systemd](../../linux/systemd/Systemd.md) user services allow every user to start/enable their own session on [Linux](../../linux/Linux.md) operating system distributions.
If you didn't install Pueue with a package manager, follow these instructions first:
1. download `pueued.service` from the GitHub Releases page;

View file

@@ -8,7 +8,7 @@ wiki: https://en.wikipedia.org/wiki/GNU_Core_Utilities
The GNU Core Utilities or coreutils is a package of GNU software containing implementations for many of the basic tools, such as cat, ls, and rm, which are used on Unix-like operating systems.
## base64
-[base64](../../files/Base64.md) encode/decode data and print to standard output
+[base64](../../../files/Base64.md) encode/decode data and print to standard output
Usage: `base64 [OPTION]... [FILE]`
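For example:

```sh
printf 'hello' | base64          # -> aGVsbG8=
printf 'aGVsbG8=' | base64 -d    # -> hello
```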
### Flags
@@ -189,7 +189,7 @@ Usage: `echo [OPTION]... [STRING]...`
## env
run a program in a modified environment
-Print [Environment Variables](../../linux/Environment%20Variables.md) with only `env`
+Print [Environment Variables](../../../linux/Environment%20Variables.md) with only `env`
Usage: `env [OPTION]... [-] [NAME=VALUE]... [COMMAND [ARG]...]`
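For example (the server command is a placeholder):

```sh
# run a command with one extra variable, without exporting it to the shell
env LOG_LEVEL=debug ./server

# start from an empty environment (the lone "-"), then set PATH explicitly
env - PATH=/usr/bin printenv
```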
### Options

View file

@@ -6,7 +6,7 @@ arch-wiki: https://wiki.archlinux.org/title/Doas
---
# doas
-doas is a program to execute commands as another user. The system administrator can configure it to give specified users privileges to execute specified commands. It is free and open-source under the ISC license and available in Unix and Unix-like operating systems ([FreeBSD](../../bsd/FreeBSD.md), [OpenBSD](../../bsd/OpenBSD.md), [Linux](../../linux/Linux.md)).
+doas is a program to execute commands as another user. The system administrator can configure it to give specified users privileges to execute specified commands. It is free and open-source under the ISC license and available in Unix and Unix-like operating systems ([FreeBSD](../../../bsd/FreeBSD.md), [OpenBSD](../../../bsd/OpenBSD.md), [Linux](../../../linux/Linux.md)).
## Usage
To use doas, simply prefix a command and its arguments with doas and a space:
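For example:

```sh
doas pacman -Syu         # run a command as root
doas -u postgres psql    # run a command as another user
```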

View file

@@ -5,7 +5,7 @@ wiki: https://en.wikipedia.org/wiki/File_(command)
website: https://darwinsys.com/file
---
# file
-determine file / [MIME](../../files/MIME.md) type
+determine file / [MIME](../../../files/MIME.md) type
Usage: `file [OPTION] [FILE]...`
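For example (the file name is a placeholder):

```sh
file archive.bin       # human-readable description
file -i archive.bin    # MIME type, e.g. application/gzip
```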
## Options

View file

@@ -4,7 +4,7 @@ repo: git://git.kernel.org/pub/scm/utils/util-linux/util-linux.git
---
# losetup
-set up and control [loop devices](../../linux/Loop%20Device.md)
+set up and control [loop devices](../../../linux/Loop%20Device.md)
## Usage
Get info:
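A short sketch (image and device names are placeholders):

```sh
losetup -a              # list all active loop devices
losetup -fP disk.img    # attach to the first free device, scanning partitions
losetup -d /dev/loop0   # detach
```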