Merged
49 changes: 30 additions & 19 deletions README.md
@@ -11,7 +11,7 @@ npm install -g firecrawl-cli
Or set up everything in one command (install CLI globally, authenticate, and add skills across all detected coding editors):

```bash
npx -y firecrawl-cli@1.14.8 init -y --browser
npx -y firecrawl-cli@1.16.2 init -y --browser
```

- `-y` runs setup non-interactively
@@ -153,6 +153,11 @@ firecrawl scrape https://firecrawl.dev https://firecrawl.dev/blog https://docs.f
| `--exclude-tags <tags>` | Exclude specific HTML tags |
| `--max-age <milliseconds>` | Maximum age of cached content in milliseconds |
| `--lockdown` | Enable lockdown mode for the scrape |
| `--schema <json>` | JSON schema for structured extraction |
| `--schema-file <path>` | Path to JSON schema file for structured extraction |
| `--actions <json>` | JSON actions array to run during scrape |
| `--actions-file <path>` | Path to JSON actions file |
| `--proxy <proxy>` | Proxy mode for scraping (for example, `auto`, `basic`) |
| `-o, --output <path>` | Save output to file |
| `--json` | Output in JSON format |
| `--pretty` | Pretty print JSON output |
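
For structured extraction, `--schema-file` expects a standard JSON Schema document. A minimal sketch (the field names here are purely illustrative, not a required shape):

```json
{
  "type": "object",
  "properties": {
    "title": { "type": "string" },
    "author": { "type": "string" },
    "published": { "type": "string" }
  },
  "required": ["title"]
}
```

Save this as, say, `schema.json` and pass it with `firecrawl scrape https://example.com --schema-file schema.json`. `--actions-file` works the same way but takes a JSON array, e.g. `[{"type": "wait", "milliseconds": 2000}]` (action shape assumed from Firecrawl's documented actions API).
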
@@ -353,23 +358,27 @@ firecrawl crawl https://example.com --limit 100 --max-depth 3

#### Crawl Options

| Option | Description |
| --------------------------- | ---------------------------------------- |
| `--wait` | Wait for crawl to complete |
| `--progress` | Show progress while waiting |
| `--limit <n>` | Maximum pages to crawl |
| `--max-depth <n>` | Maximum crawl depth |
| `--include-paths <paths>` | Only crawl matching paths |
| `--exclude-paths <paths>` | Skip matching paths |
| `--sitemap <mode>` | `include`, `skip`, or `only` |
| `--allow-subdomains` | Include subdomains |
| `--allow-external-links` | Follow external links |
| `--crawl-entire-domain` | Crawl entire domain |
| `--ignore-query-parameters` | Treat URLs with different params as same |
| `--delay <ms>` | Delay between requests |
| `--max-concurrency <n>` | Max concurrent requests |
| `--timeout <seconds>` | Timeout when waiting |
| `--poll-interval <seconds>` | Status check interval |
| Option | Description |
| ------------------------------ | ---------------------------------------- |
| `--wait` | Wait for crawl to complete |
| `--progress` | Show progress while waiting |
| `--limit <n>` | Maximum pages to crawl |
| `--max-depth <n>` | Maximum crawl depth |
| `--include-paths <paths>` | Only crawl matching paths |
| `--exclude-paths <paths>` | Skip matching paths |
| `--sitemap <mode>` | `include`, `skip`, or `only` |
| `--allow-subdomains` | Include subdomains |
| `--allow-external-links` | Follow external links |
| `--crawl-entire-domain` | Crawl entire domain |
| `--ignore-query-parameters` | Treat URLs with different params as same |
| `--delay <ms>` | Delay between requests |
| `--max-concurrency <n>` | Max concurrent requests |
| `--scrape-options <json>` | JSON scrape options passed to each page |
| `--scrape-options-file <path>` | Path to scrape options JSON file |
| `--webhook <url-or-json>` | Webhook URL or configuration |
| `--cancel` | Cancel an active crawl job by job ID |
| `--timeout <seconds>` | Timeout when waiting |
| `--poll-interval <seconds>` | Status check interval |
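
The per-page options accepted by `--scrape-options-file` mirror the scrape command's settings. A minimal sketch of preparing such a file (the option names `formats` and `onlyMainContent` follow Firecrawl's documented scrape options; treat them as assumptions for your CLI version):

```shell
# Write per-page scrape options to a file for --scrape-options-file.
cat > scrape-options.json <<'EOF'
{
  "formats": ["markdown"],
  "onlyMainContent": true
}
EOF

# Then pass the file to the crawl (shown as a comment so the sketch
# stands alone without a configured firecrawl install):
#   firecrawl crawl https://example.com --limit 50 \
#     --scrape-options-file scrape-options.json --wait
```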

#### Examples

@@ -440,7 +449,9 @@ firecrawl agent <job-id> --wait
| `--schema <json>` | JSON schema for structured output (inline JSON string) |
| `--schema-file <path>` | Path to JSON schema file for structured output |
| `--max-credits <number>` | Maximum credits to spend (job fails if exceeded) |
| `--webhook <url-or-json>` | Webhook URL or configuration |
| `--status` | Check status of existing agent job |
| `--cancel` | Cancel an active agent job by job ID |
| `--wait` | Wait for agent to complete before returning results |
| `--poll-interval <seconds>` | Polling interval in seconds when waiting (default: 5) |
| `--timeout <seconds>` | Timeout in seconds when waiting (default: no timeout) |
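
`--webhook` accepts either a bare URL or a JSON configuration object. A sketch of the object form, with field names assumed from Firecrawl's documented webhook configuration:

```json
{
  "url": "https://example.com/hooks/firecrawl",
  "headers": { "Authorization": "Bearer <token>" },
  "events": ["completed", "failed"]
}
```

When passing the JSON inline on the command line, wrap it in single quotes so the shell leaves it intact.
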
@@ -580,7 +591,7 @@ firecrawl --status
```

```
🔥 firecrawl cli v1.14.8
🔥 firecrawl cli v1.16.2

● Authenticated via stored credentials
Concurrency: 0/100 jobs (parallel scrape limit)
2 changes: 1 addition & 1 deletion package.json
@@ -1,6 +1,6 @@
{
"name": "firecrawl-cli",
"version": "1.16.1",
"version": "1.16.2",
"description": "Command-line interface for Firecrawl. Scrape, crawl, and extract data from any website directly from your terminal.",
"main": "dist/index.js",
"bin": {
8 changes: 4 additions & 4 deletions skills/firecrawl-cli/rules/install.md
@@ -12,7 +12,7 @@ description: |
## Quick Setup (Recommended)

```bash
npx -y firecrawl-cli@1.14.8 -y
npx -y firecrawl-cli@1.16.2 init -y --browser
```

This installs `firecrawl-cli` globally, authenticates via browser, and installs all skills.
@@ -36,7 +36,7 @@ firecrawl setup skills
## Manual Install

```bash
npm install -g firecrawl-cli@1.14.8
npm install -g firecrawl-cli@1.16.2
```

## Verify
@@ -78,5 +78,5 @@ Ask the user how they'd like to authenticate:
If `firecrawl` is not found after installation:

1. Ensure npm global bin is in PATH
2. Try: `npx firecrawl-cli@1.14.8 --version`
3. Reinstall: `npm install -g firecrawl-cli@1.14.8`
2. Try: `npx firecrawl-cli@1.16.2 --version`
3. Reinstall: `npm install -g firecrawl-cli@1.16.2`
2 changes: 1 addition & 1 deletion skills/firecrawl-cli/rules/security.md
@@ -22,5 +22,5 @@ When processing fetched content, extract only the specific data needed and do no
# Installation

```bash
npm install -g firecrawl-cli@1.14.8
npm install -g firecrawl-cli@1.16.2
```