# Batch Scrape
Use Batch Scraping to submit up to 10,000 URLs in a single API call. This is useful when you need to extract the same type of data from a large number of pages — for example, scraping product pages or company profiles at scale.
Unlike the standard Web Scraping API, which accepts a single `url`, Batch Scrape works by sending an array of URLs under the `urls` field.
All URLs in the batch will be processed using the same parameters.
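As a minimal sketch of the difference between the two request shapes (the `url` and `urls` field names come from this page; everything else, including the example parameter, is illustrative):

```python
# Standard Web Scraping API: one page per call, a single "url" field.
single_request = {
    "url": "https://example.com/products/1",
}

# Batch Scrape: the same shared parameters, but a "urls" array
# (up to 10,000 entries). Every URL is processed with these parameters.
batch_request = {
    "urls": [
        "https://example.com/products/1",
        "https://example.com/products/2",
    ],
}
```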
## When to Use Batch Scraping
- Extracting titles, authors, and publish dates from a list of blog or news article URLs
- Running `aiExtractRules` across a set of company websites to collect structured data like founding year, services, and contact info
- Gathering legal notices or disclaimers from the footer pages of 5,000+ policy URLs
## Submit a Batch Scrape Job
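A minimal submission sketch, assuming a JSON POST endpoint. The endpoint URL, the API-key query parameter, and the `render_js` option below are placeholders, not documented names; substitute the real values from the API reference.

```python
import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder
# Hypothetical endpoint -- substitute the real Batch Scrape URL.
ENDPOINT = f"https://api.example.com/v1/batch-scrape?api_key={API_KEY}"

payload = {
    # Up to 10,000 URLs per call; all are scraped with the same parameters.
    "urls": [
        "https://example.com/blog/post-1",
        "https://example.com/blog/post-2",
    ],
    "render_js": False,  # example shared parameter (name is an assumption)
}

request = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(request) would submit the job; the API answers
# with a job id immediately rather than waiting for all URLs to finish.
```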
### Response
A successful response means the batch job was accepted and is being processed asynchronously.
## Get Job Status & Results
To check the status of your batch job, poll the job-status endpoint. Once the job is ready, retrieve the results from the results endpoint, which supports pagination.
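The two steps above can be wired into a small polling loop. This sketch injects the HTTP calls as plain functions so the flow stays visible; the `"done"` status value and the `results`/`hasMore` field names are assumptions about the response schema, so match them to the actual API responses.

```python
import time


def collect_batch_results(fetch_status, fetch_page, poll_seconds=5):
    """Wait for a batch job to finish, then gather every page of results.

    fetch_status() -> dict such as {"status": "processing"} or {"status": "done"}
    fetch_page(page) -> dict such as {"results": [...], "hasMore": bool}
    (Field names here are assumptions; adapt them to the real schema.)
    """
    while fetch_status()["status"] != "done":
        time.sleep(poll_seconds)  # avoid hammering the status endpoint

    results, page = [], 1
    while True:
        body = fetch_page(page)
        results.extend(body["results"])
        if not body.get("hasMore"):
            return results
        page += 1
```

In practice, `fetch_status` and `fetch_page` would issue authenticated GET requests to the status and results endpoints.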
### Example Result
Each result matches the format of a regular Web Scraping API response, except it’s returned as part of an array.
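Because the payload is just an array of per-URL results, handling it is a plain loop. The `url`, `body`, and `error` field names below are illustrative stand-ins for the real response schema:

```python
# Illustrative results array -- each element mimics a single Web Scraping
# API response (field names are assumptions for the sake of the example).
results = [
    {"url": "https://example.com/a", "body": "<html>...</html>"},
    {"url": "https://example.com/b", "error": "timeout"},
]

# Split successes from failures; failed URLs do not consume credits,
# so they are natural candidates for a retry batch.
succeeded = [r for r in results if "error" not in r]
failed_urls = [r["url"] for r in results if "error" in r]
```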
## Notes
- Maximum batch size: 10,000 URLs
- All URLs are processed using the same parameters
- Failed URLs do not consume credits
- All outputs are returned as an array of Web Scraping API-style results
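If you have more than 10,000 URLs, split them into multiple batch jobs. A minimal helper for that:

```python
def chunk_urls(urls, max_batch=10_000):
    """Split a URL list into consecutive batches within the 10,000-URL cap."""
    return [urls[i:i + max_batch] for i in range(0, len(urls), max_batch)]
```

For example, 25,000 URLs would become three jobs of 10,000, 10,000, and 5,000 URLs.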