If you need to monitor prices, product details, or availability across Amazon, Walmart, eBay, and other e-commerce sites, you may be running into two problems:
- Fragmented data collection — having to run separate scrapers for each site or type of page.
- Incomplete coverage — category pages are great for finding new items, but can miss updates to important products; product detail pages give precise tracking but no discovery.
We'll show you how to solve both problems with E-commerce Scraping Tool. It extracts data from multiple e-commerce websites using category URLs, product detail URLs, or both, so you get complete, deduplicated datasets in one export. If you're more interested in product discovery or market analysis, it can also search marketplaces by keyword.

How to scrape product data for e-commerce
You can use E-commerce Scraping Tool via the UI using natural language or JSON input, or run it programmatically via API. The UI is the best way to test it out, so let's start there.
Step 1. Open Apify's E-commerce Scraping Tool
E-commerce Scraping Tool lives on Apify Store, the world's largest marketplace of web scrapers. To get started, click Try for free. If you're logged in to your Apify account, you'll be taken to Apify Console, your dashboard for running the scraper. If not, you'll be prompted to sign in or sign up first.

Step 2. Choose your input types
Once you're logged in, you can configure the tool in Apify Console.
The tool supports three input methods: Category listing URLs, Product detail URLs, and Keyword search.
| Input type | What it is | When to use it |
|---|---|---|
| Category listing URLs | Search results or category pages with multiple products | Discover many products, monitor whole categories, find new arrivals |
| Product detail URLs | URLs pointing directly to a single product page | Monitor known SKUs, track specific items for price/stock changes |
| Keyword search | Search marketplaces (e.g., amazon.de, ikea.com, kaufland.at) by keywords | Fast search, no need to gather URLs. Great for market research |
We’ll use product detail URLs in this example and compare wireless headphones available on Amazon, Walmart, and eBay. You can use URLs from any e-commerce platform.
- Using the manual tab:
We’ll add several URLs as our input, using the bulk edit option.

The AI analysis feature allows you to get more out of your data, using natural-language instructions. To use it, define which dataset fields are relevant, and add a custom prompt for the scraper.

- Same configuration in the JSON tab:
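Here's a sketch of what that configuration could look like in the JSON tab. The field names ("productUrls", "aiFields", "aiPrompt") and the placeholder URLs are illustrative assumptions; the Actor's input schema in its Readme is authoritative.

```json
{
  "productUrls": [
    { "url": "https://www.amazon.com/dp/<ASIN>" },
    { "url": "https://www.walmart.com/ip/<ITEM_ID>" },
    { "url": "https://www.ebay.com/itm/<ITEM_ID>" }
  ],
  "aiFields": ["name", "price", "availability"],
  "aiPrompt": "Summarize the key specs and note whether the headphones support noise cancelling"
}
```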

- How to use keywords instead
If product discovery is your main objective, choose one or more keywords as your input instead of the category/product URLs used above. Next, decide which marketplaces you want to source data from - E-commerce Scraping Tool supports global platforms and their local marketplaces, including Amazon, Walmart, Costco, Kaufland, Allegro, IKEA, and more.
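If you go the keyword route, the JSON input might look something like this. Again, the field names ("keywords", "marketplaces") are assumptions; check the Actor's input schema for the exact names.

```json
{
  "keywords": ["wireless headphones"],
  "marketplaces": ["amazon.de", "ikea.com", "kaufland.at"]
}
```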

- Additional feature: Scrape reviews
This scraper can also collect reviews instead of product details. To do this, add your URLs under Review options. The tool will collect the review text, rating, reviewer’s name, and URL.


Step 3. Run and export
- Click Save & Start to run the scraper.

When the run finishes, you can export your results. Scroll down to check the preview of your dataset, with name, URL, image, and other product information, along with the AI summary we configured earlier.

Click the Export button to choose from multiple data export options, and filter results using Select or Omit fields.

How much will your run cost?
E-commerce Scraping Tool uses a pay-per-event pricing model. You pay for:
- Actor start (per run)
- Listings scraped (for each pagination page)
- Details (for product, reviews, or seller)
- Optional: Residential proxy use (per product)
- Optional: Browser rendering (per product)
- Optional: AI summary
Higher subscription plans unlock lower Actor costs, as seen when comparing Free and Business plans:

Example: Scraping 1,000 listing pages (~20,000 products)
No proxies or browser rendering:
- Actor start = $0.0007
- Listings = $0.26 (for 1,000 listings)
- Product details = 20,000 × ($1.00 / 1,000) = $20.00
- Total = $0.0007 + $0.26 + $20.00 = $20.2607 → ≈ $20.26
With proxies + browser rendering:
- Actor start = $0.0007
- Listings = $0.26
- Product details = $20.00
- Residential proxy = 20,000 × ($0.80 / 1,000) = $16.00
- Browser rendering = 20,000 × ($0.51 / 1,000) = $10.20
- Total = $0.0007 + $0.26 + $20.00 + $16.00 + $10.20 = $46.4607 → ≈ $46.46
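The arithmetic above can be reproduced with a short script. A minimal sketch, assuming the example unit prices quoted in this section (actual rates depend on your plan and current Apify Store pricing):

```python
# Pay-per-event cost estimate for a run that scrapes 1,000 listing
# pages (~20,000 product details). Unit prices are the example rates
# quoted in this article, not necessarily current Store pricing.

ACTOR_START = 0.0007          # flat fee per run
LISTING_PRICE = 0.26 / 1000   # per pagination page scraped
DETAIL_PRICE = 1.00 / 1000    # per product detail
PROXY_PRICE = 0.80 / 1000     # optional: residential proxy, per product
RENDER_PRICE = 0.51 / 1000    # optional: browser rendering, per product

def estimate_cost(listings, details, use_proxy=False, use_rendering=False):
    """Return the estimated USD cost of a run, rounded to 4 decimals."""
    total = ACTOR_START + listings * LISTING_PRICE + details * DETAIL_PRICE
    if use_proxy:
        total += details * PROXY_PRICE
    if use_rendering:
        total += details * RENDER_PRICE
    return round(total, 4)

print(estimate_cost(1000, 20000))              # 20.2607
print(estimate_cost(1000, 20000, True, True))  # 46.4607
```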
Key takeaway
Costs stay low relative to data volume: a large run with proxies and browser rendering comes to about $46.46 for ~20,000 products. Listings are cheap; most of the cost comes from product detail scrapes and the optional add-ons. For frequent large-scale runs, the Business plan is the most cost-efficient.
Run E-commerce Scraping Tool via API
If you want to automate, scale, or integrate scraping into your existing workflow, you can run E-commerce Scraping Tool programmatically with the Apify API.
Why use the API?
- You already have URLs generated from another system (e.g., ERP, Google Sheet) and want to feed them directly.
- You need to run scraping jobs on a schedule and deliver results automatically.
- You’re building a price monitoring or product tracking app that requires fresh data on demand.
How to use it
You can run the same configuration programmatically with the following:
- E-commerce Scraping Tool API in Python
- E-commerce Scraping Tool API in JavaScript
- E-commerce Scraping Tool API through CLI
- E-commerce Scraping Tool OpenAPI definition
- E-commerce Scraping Tool for MCP
For more details about using the API, go to the Readme or the Apify API documentation.
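As a sketch of the Python route, here's how a run might look with the official apify-client package (pip install apify-client). The Actor ID placeholder and the "productUrls" input field are illustrative assumptions; the Actor's Readme has the exact ID and input schema.

```python
# Minimal sketch: run E-commerce Scraping Tool via the Apify API
# using the official Python client, then read the results.

def build_input(product_urls):
    """Build the run input from a plain list of product URLs.

    "productUrls" is an assumed field name; check the Actor's input schema.
    """
    return {"productUrls": [{"url": u} for u in product_urls]}

def run_scraper(token, actor_id, product_urls):
    """Start the Actor, wait for the run to finish, and return dataset items."""
    from apify_client import ApifyClient  # pip install apify-client

    client = ApifyClient(token)
    run = client.actor(actor_id).call(run_input=build_input(product_urls))
    return list(client.dataset(run["defaultDatasetId"]).iterate_items())

# Example (requires a valid API token and the Actor's real ID):
# items = run_scraper("<YOUR_APIFY_TOKEN>", "<ACTOR_ID>",
#                     ["https://www.amazon.com/dp/<ASIN>"])
```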
Try E-commerce Scraping Tool
For e-commerce product data, you can collect information from both listing and detail URLs across multiple websites and store it in one dataset. This lets you discover new products, track known SKUs, and get maximum coverage in one deduplicated dataset. You can run the scraper via the UI for simplicity, or programmatically via the API for integration into your workflow.