ZapFetch is fully compatible with the official Firecrawl JavaScript SDK (@mendable/firecrawl-js). Point the client at the ZapFetch endpoint, swap in your ZapFetch API key, and your existing Firecrawl code runs without any other changes. The SDK ships TypeScript types so you get autocomplete and type-safe responses out of the box.
Already using @mendable/firecrawl-js against Firecrawl? Your code runs against ZapFetch unchanged — just update apiUrl and apiKey.
1. Install the SDK

Install @mendable/firecrawl-js from npm.
npm install @mendable/firecrawl-js
2. Initialize the client

Import FirecrawlApp and pass your ZapFetch API key and the ZapFetch base URL. Keep your key in an environment variable rather than committing it to source control.
import FirecrawlApp from "@mendable/firecrawl-js";

const app = new FirecrawlApp({
  apiKey: "YOUR_ZAPFETCH_API_KEY",
  apiUrl: "https://api.zapfetch.com",
});
Load your key from the environment with process.env.ZAPFETCH_KEY to avoid committing credentials to your repository.
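One way to apply this tip is a small helper that reads a required environment variable and fails fast when it is missing. This is a minimal sketch: `requireEnv` is a hypothetical helper name, and `ZAPFETCH_KEY` follows the variable name suggested above.

```typescript
// Hypothetical helper: read a required environment variable or fail fast.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Then initialize the client without hard-coding the key:
// const app = new FirecrawlApp({
//   apiKey: requireEnv("ZAPFETCH_KEY"),
//   apiUrl: "https://api.zapfetch.com",
// });
```

Failing at startup with a clear message is usually easier to debug than an authentication error surfacing later, mid-request.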
3. Scrape a single page

Call scrapeUrl with the target URL and the formats you want back. The method returns a typed object whose properties match the requested formats.
const result = await app.scrapeUrl("https://example.com", {
  formats: ["markdown"],
});

console.log(result.markdown);
4. Crawl a site

Pass true as the third argument to crawlUrl to wait for the job to finish and receive all pages in a single response. Iterate over job.data to access each crawled page.
const job = await app.crawlUrl(
  "https://docs.example.com",
  { limit: 50 },
  /* waitUntilDone */ true,
);

for (const page of job.data ?? []) {
  console.log(page.metadata?.sourceURL);
}
Crawls are billed per page fetched. Set limit to control how many pages the crawl visits so you stay within your credit budget.
5. Extract structured data

Define a JSON Schema as a const assertion to preserve literal types, write a plain-language prompt, and call app.extract. ZapFetch fetches the target pages, runs LLM inference, and returns a typed result object.
const schema = {
  type: "object",
  properties: {
    stories: {
      type: "array",
      items: {
        type: "object",
        properties: {
          title:  { type: "string" },
          points: { type: "integer" },
          author: { type: "string" },
        },
      },
    },
  },
} as const;

const data = await app.extract(
  ["https://news.ycombinator.com"],
  {
    prompt: "Top 5 stories with points and author.",
    schema,
  },
);

console.log(data);

Handling rate limits

ZapFetch returns a standard 429 Too Many Requests response with a Retry-After header when you exceed your plan’s rate limit. The SDK surfaces this as a thrown error — wrap calls in your usual retry or exponential backoff policy to handle it gracefully.
Do not retry 429 responses immediately. Read the Retry-After header value and wait at least that many seconds before your next request.
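The backoff policy can be sketched as a generic wrapper around any SDK call. This is an illustrative sketch, not part of the SDK: the error shape it inspects (`status`/`statusCode` and a `retryAfter` field carrying the header value in seconds) is an assumption — adjust the property names to match what your SDK version actually throws.

```typescript
// Sketch of a retry wrapper for 429 responses. Assumptions: the thrown
// error exposes the HTTP status as `status` or `statusCode`, and the
// Retry-After header value (in seconds) as `retryAfter`.
const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

async function withRetryOn429<T>(
  call: () => Promise<T>,
  maxAttempts = 3,
): Promise<T> {
  for (let attempt = 1; ; attempt++) {
    try {
      return await call();
    } catch (err: any) {
      const status = err?.status ?? err?.statusCode;
      if (status !== 429 || attempt >= maxAttempts) throw err;
      // Honor Retry-After when present; otherwise fall back to
      // exponential backoff (1s, 2s, 4s, ...).
      const retryAfter = Number(err?.retryAfter);
      const delayMs = Number.isFinite(retryAfter)
        ? retryAfter * 1000
        : 2 ** (attempt - 1) * 1000;
      await sleep(delayMs);
    }
  }
}

// Usage (hypothetical): wrap any SDK call.
// const result = await withRetryOn429(() =>
//   app.scrapeUrl("https://example.com", { formats: ["markdown"] }),
// );
```

Capping `maxAttempts` keeps a persistent rate limit from turning into an unbounded retry loop; any non-429 error is rethrown immediately.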

Next steps