If you need to monitor prices, product details, or availability across Amazon, Walmart, eBay, and other e-commerce sites, you may be running into two problems:
- Fragmented data collection — having to run separate scrapers for each site or type of page.
- Incomplete coverage — category pages are great for finding new items, but can miss updates to important products; product detail pages give precise tracking but no discovery.
We'll show you how to solve these problems with E-commerce Scraping Tool. It extracts data from multiple e-commerce websites, from both category and product detail URLs, in the same run, so you can get complete, deduplicated datasets in one export.

How to scrape product data for e-commerce
You can use E-commerce Scraping Tool via the UI using natural language or JSON input, or run it programmatically via API. The UI is the best way to test it out, so let's start there.
Step 1. Open Apify's E-commerce Scraping Tool
E-commerce Scraping Tool lives on Apify Store, the world's largest marketplace of web scrapers. To get started, click Try for free. If you're logged in to your Apify account, you'll be taken to Apify Console, your dashboard for running the scraper. If not, you'll be prompted to sign in or sign up first.

Step 2. Choose your input types
Once you're logged in, you can configure the tool in Apify Console.
The tool supports two main URL types: Category listing URLs and Product detail URLs.
| Input type | What it is | When to use it |
| --- | --- | --- |
| Category listing URLs | Search results or category pages with multiple products | Discover many products, monitor whole categories, find new arrivals |
| Product detail URLs | URLs pointing directly to a single product page | Monitor known SKUs, track specific items for price/stock changes |
- Using the manual tab:

- Same configuration in the JSON tab:

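As a sketch, a combined input in the JSON tab might look like the following. The field names `categoryUrls` and `productUrls` are illustrative; check the Actor's input schema for the exact keys it expects:

```json
{
  "categoryUrls": [
    { "url": "https://www.example.com/category/laptops" }
  ],
  "productUrls": [
    { "url": "https://www.example.com/product/12345" }
  ]
}
```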
When to use both category and detail URLs
Using both input types in the same run is helpful when you want:
- Discovery + refresh → crawl categories for new items while re-scraping your curated set of known products.
- Mixed sourcing → some sites give stable product URLs, others require crawling categories to find items.
- Single dataset output → no need to merge separate runs.
The scraper automatically deduplicates products found in both inputs.
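You can picture the deduplication as keying products by their canonical URL and keeping the first occurrence. This is a minimal sketch of the idea, not the Actor's actual implementation:

```python
def deduplicate(products):
    """Keep the first occurrence of each product, keyed by its URL."""
    seen = {}
    for product in products:
        url = product["url"]
        if url not in seen:
            seen[url] = product
    return list(seen.values())

# A product discovered via a category page and also supplied as a
# detail URL appears only once in the output dataset.
items = [
    {"url": "https://example.com/p/1", "name": "Widget", "source": "category"},
    {"url": "https://example.com/p/1", "name": "Widget", "source": "detail"},
    {"url": "https://example.com/p/2", "name": "Gadget", "source": "category"},
]
print(len(deduplicate(items)))  # → 2
```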
Step 3. Run and export
- Click Start to run the scraper.

- When the run finishes, open the Storage tab to view results: product name, price, SKU, brand, image, description, and URL.

- Export your dataset as JSON, CSV, Excel, or HTML, or integrate it directly with your systems.

How much will your run cost?
E-commerce Scraping Tool uses a Pay Per Event pricing model. You pay for:
- Actor start (per run)
- Listings scraped (per page)
- Product details scraped (per product)
- Optional: Residential proxy use (per product)
- Optional: Browser rendering (per product)
Here are the current Starter plan and Business plan rates:

Example: Scraping 1,000 listing pages (~20,000 products)
Starter plan, no proxies or browser rendering:
- Actor start = $0.00009
- Listings = 1,000 × $0.00042 = $0.42
- Product details = 20,000 × $0.00150 = $30.00
Total ≈ $30.42
Starter plan, with proxies + browser rendering:
- Actor start = $0.00009
- Listings = $0.42
- Product details = $30.00
- Residential proxy = 20,000 × $0.00100 = $20.00
- Browser rendering = 20,000 × $0.00057 = $11.40
Total ≈ $61.82
Business plan, no proxies or browser rendering:
- Actor start = $0.00007
- Listings = 1,000 × $0.00026 = $0.26
- Product details = 20,000 × $0.00100 = $20.00
Total ≈ $20.26
Business plan, with proxies + browser rendering:
- Actor start = $0.00007
- Listings = $0.26
- Product details = $20.00
- Residential proxy = 20,000 × $0.00080 = $16.00
- Browser rendering = 20,000 × $0.00051 = $10.20
Total ≈ $46.46
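The arithmetic above can be verified with a few lines of Python, using the per-event rates from the examples:

```python
def run_cost(start, listing_rate, detail_rate, pages, products,
             proxy_rate=0.0, render_rate=0.0):
    """Total Pay Per Event cost for one run."""
    return (start
            + pages * listing_rate
            + products * detail_rate
            + products * proxy_rate
            + products * render_rate)

PAGES, PRODUCTS = 1_000, 20_000

# Starter plan, with residential proxies and browser rendering
starter = run_cost(0.00009, 0.00042, 0.00150, PAGES, PRODUCTS,
                   proxy_rate=0.00100, render_rate=0.00057)
# Business plan, same add-ons
business = run_cost(0.00007, 0.00026, 0.00100, PAGES, PRODUCTS,
                    proxy_rate=0.00080, render_rate=0.00051)

print(round(starter, 2), round(business, 2))  # 61.82 46.46
print(f"savings: {1 - business / starter:.0%}")  # savings: 25%
```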
Key takeaway
At scale (20,000 products), costs remain very low relative to data volume. The Starter plan run costs about $61.82 with proxies + browser rendering. The Business plan cuts that down to $46.46, a savings of ~25%. Listings are cheap: most costs come from product detail scrapes and optional add-ons. For frequent large-scale runs, the Business plan offers the best efficiency.
Run E-commerce Scraping Tool via API
If you want to automate, scale, or integrate scraping into your existing workflow, you can run E-commerce Scraping Tool programmatically with the Apify API.
Why use the API?
- You already have URLs generated from another system (e.g., ERP, Google Sheet) and want to feed them directly.
- You need to run scraping jobs on a schedule and deliver results automatically.
- You’re building a price monitoring or product tracking app that requires fresh data on demand.
How to use it
You can run the same configuration programmatically with the following:
- E-commerce Scraping Tool API in Python
- E-commerce Scraping Tool API in JavaScript
- E-commerce Scraping Tool API through CLI
- E-commerce Scraping Tool OpenAPI definition
For more details about using the API, go to the Readme or the Apify API documentation.
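With the official apify-client Python package, a run could be triggered roughly like this. The Actor ID placeholder and the input field names are assumptions; take the real values from the Actor's page and input schema:

```python
import os

# Input mirroring the UI configuration; field names are illustrative,
# check the Actor's input schema for the exact keys.
run_input = {
    "categoryUrls": [{"url": "https://www.example.com/category/laptops"}],
    "productUrls": [{"url": "https://www.example.com/product/12345"}],
}

token = os.environ.get("APIFY_TOKEN")
if token:
    from apify_client import ApifyClient

    client = ApifyClient(token)
    # "<username>/<actor-name>" is a placeholder for the real Actor ID
    run = client.actor("<username>/<actor-name>").call(run_input=run_input)
    # Iterate over items in the run's default dataset
    for item in client.dataset(run["defaultDatasetId"]).iterate_items():
        print(item.get("name"), item.get("price"))
```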
Try E-commerce Scraping Tool
For e-commerce product data, you can collect information from both listing and detail URLs across multiple websites and store it in one dataset. This lets you discover many products, track known SKUs, and get maximum coverage in one deduplicated dataset. You can run the scraper via the UI for simplicity, or programmatically via the API for integration into your workflow.