SCRAPY(1) General Commands Manual SCRAPY(1)

NAME
       scrapy - the Scrapy command-line tool

SYNOPSIS
       scrapy [command] [OPTIONS] ...

DESCRIPTION
       Scrapy is controlled through the scrapy command-line tool. The script provides several commands for different purposes; each command supports its own particular syntax, that is, its own set of arguments and options.
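
       For example, to list the available commands and to show the help of a single command (fetch is used here only as an illustration):

              scrapy -h
              scrapy fetch -h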

OPTIONS
   fetch [OPTION] URL
       Fetch a URL using the Scrapy downloader.

       --headers
              Print response HTTP headers instead of body
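
       For example (the URL is only an illustration):

              scrapy fetch --headers https://example.org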

   runspider [OPTION] spiderfile
       Run a spider

       --output=FILE
              Store scraped items to FILE in XML format
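
       For example, assuming a spider defined in a file named myspider.py (both file names are illustrative):

              scrapy runspider myspider.py --output=items.xml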

   settings [OPTION]
       Query Scrapy settings

       --get=SETTING
              Print raw setting value

       --getbool=SETTING
              Print setting value, interpreted as a boolean

       --getint=SETTING
              Print setting value, interpreted as an integer

       --getfloat=SETTING
              Print setting value, interpreted as a float

       --getlist=SETTING
              Print setting value, interpreted as a list

       --init
              Print initial setting value (before loading extensions and spiders)
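
       For example, using a few built-in setting names as illustrations:

              scrapy settings --get BOT_NAME
              scrapy settings --getbool HTTPCACHE_ENABLED
              scrapy settings --getfloat DOWNLOAD_DELAY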

   shell [URL]
       Launch the interactive scraping console
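
       For example (the URL is only an illustration):

              scrapy shell "https://example.org"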

   startproject projectname
       Create a new project with an initial project template
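
       For example, to create a project named myproject (the name is illustrative):

              scrapy startproject myproject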

   Universal options
       --help, -h
              Print command help and options

       --logfile=FILE
              Log file. If omitted, stderr will be used

       --loglevel=LEVEL, -L LEVEL
              Log level (default: None)

       --nolog
              Disable logging completely

       --spider=SPIDER
              Always use this spider when arguments are URLs

       --profile=FILE
              Write Python cProfile stats to FILE

       --lsprof=FILE
              Write lsprof profiling stats to FILE

       --pidfile=FILE
              Write process ID to FILE

       --set=NAME=VALUE, -s NAME=VALUE
              Set/override setting (may be repeated)
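
       For example, combining several universal options with the runspider command (the file names and the setting value are illustrative):

              scrapy runspider myspider.py --logfile=scrapy.log --loglevel=INFO --set=DOWNLOAD_DELAY=2.0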

AUTHOR
       Scrapy was written by the Scrapy Developers.

       This manual page was written by Ignace Mouzannar <mouzannar@gmail.com>, for the Debian project (but may be used by others).

October 17, 2009