add gallery-dl

parent 6985cf84e8
commit bfc7f37453
1 changed file with 126 additions and 0 deletions

technology/applications/media/gallery-dl.md (new file, 126 additions)

---
obj: application
repo: https://github.com/mikf/gallery-dl
rev: 2024-04-15
---

# gallery-dl
gallery-dl is a command-line program for downloading image galleries and collections from a wide range of websites. Entire galleries can be fetched with a few simple commands. For videos, see [yt-dlp](yt-dlp.md).

## Command-Line Options

Usage: `gallery-dl [OPTION]... URL...`

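In its simplest form, a single gallery URL is enough (the URL below is only a placeholder):

```sh
# Download everything the matching extractor finds at this (placeholder) URL
gallery-dl "https://example.com/user/someartist/gallery"
```
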
### General Options:

| Option | Description |
| ------ | ----------- |
| `-f, --filename FORMAT` | Filename format string for downloaded files ('/O' for "original" filenames) |
| `-d, --destination PATH` | Target location for file downloads |
| `-D, --directory PATH` | Exact location for file downloads |
| `-X, --extractors PATH` | Load external extractors from PATH |
| `--proxy URL` | Use the specified proxy |
| `--source-address IP` | Client-side IP address to bind to |
| `--user-agent UA` | User-Agent request header |
| `--clear-cache MODULE` | Delete cached login sessions, cookies, etc. for MODULE (ALL to delete everything) |

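A few of these general options cover most everyday use, for example choosing a destination directory and keeping original filenames; the URL and proxy address below are placeholders:

```sh
# Save into ~/Pictures and keep the site's original filenames
gallery-dl -d ~/Pictures -f "/O" "https://example.com/gallery/12345"

# Route all requests through a local proxy and set a custom User-Agent
gallery-dl --proxy "http://127.0.0.1:8118" --user-agent "Mozilla/5.0" "https://example.com/gallery/12345"
```
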
### Input Options:

| Option | Description |
| ------ | ----------- |
| `-i, --input-file FILE` | Download URLs found in FILE ('-' for stdin). More than one `--input-file` can be specified. |
| `-I, --input-file-comment FILE` | Download URLs found in FILE. Comment them out after they were downloaded successfully. |
| `-x, --input-file-delete FILE` | Download URLs found in FILE. Delete them after they were downloaded successfully. |

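For batch jobs, URLs are usually kept in a plain text file, one per line; `urls.txt` is just an example name:

```sh
# Download every URL in urls.txt and comment out the ones that finished successfully
gallery-dl -I urls.txt

# Read URLs from stdin instead of a file
cat urls.txt | gallery-dl -i -
```
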
### Output Options:

| Option | Description |
| ------ | ----------- |
| `-q, --quiet` | Activate quiet mode |
| `-v, --verbose` | Print various debugging information |
| `-g, --get-urls` | Print URLs instead of downloading |
| `-G, --resolve-urls` | Print URLs instead of downloading; resolve intermediary URLs |
| `-j, --dump-json` | Print [JSON](../../files/JSON.md) information |
| `-s, --simulate` | Simulate data extraction; do not download anything |
| `-E, --extractor-info` | Print extractor defaults and settings |
| `-K, --list-keywords` | Print a list of available keywords and example values for the given URLs |
| `-e, --error-file FILE` | Add input URLs which returned an error to FILE |
| `--list-modules` | Print a list of available extractor modules |
| `--list-extractors` | Print a list of extractor classes with description, (sub)category and example [URL](../../internet/URL.md) |
| `--write-log FILE` | Write [logging](../../dev/Log.md) output to FILE |
| `--write-unsupported FILE` | Write URLs, which get emitted by other extractors but cannot be handled, to FILE |
| `--write-pages` | Write downloaded intermediary pages to files in the current directory to debug problems |

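The output options are useful for a dry run before the actual download, e.g. to check which keywords and file URLs an extractor produces (placeholder URL):

```sh
# Show the metadata keywords available for --filename and --filter
gallery-dl -K "https://example.com/gallery/12345"

# Print the direct file URLs without downloading anything
gallery-dl -g "https://example.com/gallery/12345"
```
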
### Downloader Options:

| Option | Description |
| ------ | ----------- |
| `-r, --limit-rate RATE` | Maximum download rate (e.g. 500k or 2.5M) |
| `-R, --retries N` | Maximum number of retries for failed [HTTP](../../internet/HTTP.md) requests or -1 for infinite retries (default: 4) |
| `--http-timeout SECONDS` | Timeout for [HTTP](../../internet/HTTP.md) connections (default: 30.0) |
| `--sleep SECONDS` | Number of seconds to wait before each download. This can be either a constant value or a range (e.g. 2.7 or 2.0-3.5) |
| `--sleep-request SECONDS` | Number of seconds to wait between [HTTP](../../internet/HTTP.md) requests during data extraction |
| `--sleep-extractor SECONDS` | Number of seconds to wait before starting data extraction for an input [URL](../../internet/URL.md) |
| `--filesize-min SIZE` | Do not download files smaller than SIZE (e.g. 500k or 2.5M) |
| `--filesize-max SIZE` | Do not download files larger than SIZE (e.g. 500k or 2.5M) |
| `--chunk-size SIZE` | Size of in-memory data chunks (default: 32k) |
| `--no-part` | Do not use .part files |
| `--no-skip` | Do not skip downloads; overwrite existing files |
| `--no-mtime` | Do not set file modification times according to Last-Modified [HTTP](../../internet/HTTP.md) response headers |
| `--no-download` | Do not download any files |
| `--no-postprocessors` | Do not run any post processors |
| `--no-check-certificate` | Disable HTTPS certificate validation |

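To stay polite towards a site, rate limiting, retries and sleep intervals can be combined; the values and URL below are only examples:

```sh
# Limit the download rate to 500k, retry failed requests up to 10 times,
# wait 2.0-3.5 s before each download and 1 s between HTTP requests
gallery-dl -r 500k -R 10 --sleep 2.0-3.5 --sleep-request 1 "https://example.com/gallery/12345"
```
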
### Configuration Options:

| Option | Description |
| ------ | ----------- |
| `-o, --option KEY=VALUE` | Additional options. Example: `-o browser=firefox` |
| `-c, --config FILE` | Additional configuration files |
| `--config-yaml FILE` | Additional configuration files in [YAML](../../files/YAML.md) format |
| `--config-toml FILE` | Additional configuration files in [TOML](../../files/TOML.md) format |
| `--config-create` | Create a basic configuration file |
| `--config-ignore` | Do not read default configuration files |

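One-off settings can be passed with `-o`, while `--config-create` writes a starter configuration file (by default gallery-dl reads a [JSON](../../files/JSON.md) config, typically `~/.config/gallery-dl/config.json` on Linux); the extra config path and URL below are placeholders:

```sh
# Create a basic configuration file that can be edited afterwards
gallery-dl --config-create

# Pass single options on the command line and load an extra config file
gallery-dl -o browser=firefox -c ./extra-config.json "https://example.com/gallery/12345"
```
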
### Authentication Options:

| Option | Description |
| ------ | ----------- |
| `-u, --username USER` | Username to log in with |
| `-p, --password PASS` | Password belonging to the given username |
| `--netrc` | Enable .netrc authentication data |

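Sites that need an account can be handled with explicit credentials or a `.netrc` entry; username, password and URL below are placeholders:

```sh
# Log in with explicit credentials
gallery-dl -u myuser -p mypassword "https://example.com/private/gallery"

# Or let gallery-dl read the credentials from ~/.netrc
gallery-dl --netrc "https://example.com/private/gallery"
```
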
### Cookie Options:

| Option | Description |
| ------ | ----------- |
| `-C, --cookies FILE` | File to load additional cookies from |
| `--cookies-export FILE` | Export session cookies to FILE |
| `--cookies-from-browser BROWSER[/DOMAIN][+KEYRING][:PROFILE][::CONTAINER]` | Name of the browser to load cookies from, with optional domain prefixed with '/', keyring name prefixed with '+', profile prefixed with ':', and container prefixed with '::' ('none' for no container) |

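Reusing an existing browser session is often the quickest way past a login; the browser, domain and file names below are examples:

```sh
# Load the Firefox cookies for example.com (domain is prefixed with '/')
gallery-dl --cookies-from-browser firefox/example.com "https://example.com/gallery/12345"

# Or use a previously exported cookies file
gallery-dl -C cookies.txt "https://example.com/gallery/12345"
```
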
### Selection Options:

| Option | Description |
| ------ | ----------- |
| `--download-archive FILE` | Record all downloaded or skipped files in FILE and skip downloading any file already in it |
| `-A, --abort N` | Stop current extractor run after N consecutive file downloads were skipped |
| `-T, --terminate N` | Stop current and parent extractor runs after N consecutive file downloads were skipped |
| `--range RANGE` | Index range(s) specifying which files to download. These can be either a constant value, range, or slice (e.g. '5', '8-20', or '1:24:3') |
| `--chapter-range RANGE` | Like '--range', but applies to manga chapters and other delegated URLs |
| `--filter EXPR` | [Python](../../dev/programming/languages/Python.md) expression controlling which files to download. Files for which the expression evaluates to False are ignored. Available keys are the filename-specific ones listed by '-K'. Example: `--filter "image_width >= 1000 and rating in ('s', 'q')"` |
| `--chapter-filter EXPR` | Like '--filter', but applies to manga chapters and other delegated URLs |

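Combined with a download archive, the selection options allow incremental and filtered downloads; the file names, values and URL below are examples, and the available filter keys depend on the site (check with `-K`):

```sh
# Skip files already recorded in the archive and stop after 5 consecutive skips
gallery-dl --download-archive ./archive.db -A 5 "https://example.com/gallery/12345"

# Download only the first 20 files that are at least 1000 pixels wide
gallery-dl --range 1-20 --filter "image_width >= 1000" "https://example.com/gallery/12345"
```
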
### Post-processing Options:

| Option | Description |
| ------ | ----------- |
| `-P, --postprocessor NAME` | Activate the specified post processor |
| `-O, --postprocessor-option KEY=VALUE` | Additional post processor options |
| `--write-metadata` | Write metadata to separate [JSON](../../files/JSON.md) files |
| `--write-info-json` | Write gallery metadata to an info.json file |
| `--write-tags` | Write image tags to separate text files |
| `--zip` | Store downloaded files in a [ZIP](../../files/ZIP.md) archive |
| `--cbz` | Store downloaded files in a [CBZ](../../files/Comic%20Book%20Archive.md) archive |
| `--mtime NAME` | Set file modification times according to metadata selected by NAME. Examples: 'date' or 'status\[date]' |
| `--ugoira FORMAT` | Convert Pixiv Ugoira to FORMAT using [FFmpeg](ffmpeg.md). Supported formats are 'webm', 'mp4', 'gif', 'vp8', 'vp9', 'vp9-lossless', 'copy'. |
| `--exec CMD` | Execute CMD for each downloaded file. Supported replacement fields are {} or {_path}, {_directory}, {_filename}. Example: `--exec "convert {} {}.png && rm {}"` |
| `--exec-after CMD` | Execute CMD after all files were downloaded. Example: `--exec-after "cd {_directory} && convert * ../doc.pdf"` |

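Post processors run after the download itself, e.g. to keep metadata next to the files or to bundle a chapter; the URLs below are placeholders:

```sh
# Save per-file metadata and tags alongside the downloaded images
gallery-dl --write-metadata --write-tags "https://example.com/gallery/12345"

# Pack a downloaded manga chapter into a CBZ archive
gallery-dl --cbz "https://example.com/manga/some-series/chapter-1"
```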