Apartments.com Scraper is a web scraping tool for extracting rental listing data from Apartments.com. It supports the following features:
- Scrape any address: search for a specific location and scrape the results.
- Apply any filters: apply any filter provided by Apartments.com.
- Scrape property details: target any of the property detail links.
- Limit results by page count or number of properties: if you don’t want to get all the results, you can set limits.
- Start from a specific page: use the startPage field to begin scraping from any results page.
- Scrape better contact, vendor, and agent names.
- Scrape additional information from detail pages. If there are fields you’d like added, please ask for them here.
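As a rough sketch, an input combining several of these features might look like the following. The field names shown here (search, startPage, endPage, maxItems) are illustrative assumptions; check the Actor’s input schema tab for the authoritative names:

```json
{
  "search": "Los Angeles, CA",
  "startPage": 1,
  "endPage": 5,
  "maxItems": 200
}
```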
Setup & usage
During the run, the Actor will output messages to let you know what's going on. Each message always contains a short label specifying which page from the list is currently being scraped.
When items are loaded from the page, you should see a message about this event with a loaded item count and total item count for each page.
If you provide incorrect input to the Actor, it will immediately stop with a failure state and output an explanation of the problem.
When you want to scrape a specific listing URL, just copy and paste the link as one of the startUrls.
If you would like to scrape only the first page of a listing, add the link for the page and set endPage to 1.
Using the same approach, you can also apply any of the available filters to the search results: go to Apartments.com, search for a location, apply your filters, and copy/paste the resulting link as a startUrl.
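For example, a run that scrapes only the first page of a filtered search could use an input like this. The URL is just an illustration, and the object-with-url shape for startUrls is a common Apify convention rather than a confirmed part of this Actor’s schema:

```json
{
  "startUrls": [
    { "url": "https://www.apartments.com/los-angeles-ca/2-bedrooms/" }
  ],
  "endPage": 1
}
```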
Compute unit consumption
The Actor is optimized to run blazing fast and scrape as many listings as possible. Therefore, it prioritizes all listing detail requests. If the Actor doesn’t get blocked very often, it will scrape 100 listings in about 1 minute and consume only approx. 0.03–0.04 compute units.
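Based on those figures, you can roughly estimate run time and cost before starting a large scrape. A minimal sketch, using the unblocked throughput and unit numbers quoted above as assumptions rather than guarantees:

```python
def estimate_run(listings, listings_per_minute=100, cu_per_100_listings=0.04):
    """Rough time/cost estimate from the throughput figures above.

    Assumes the Actor is rarely blocked; real runs may be slower
    and consume more compute units.
    """
    minutes = listings / listings_per_minute
    compute_units = (listings / 100) * cu_per_100_listings
    return minutes, compute_units

minutes, cu = estimate_run(10_000)
print(f"{minutes:.0f} minutes, ~{cu:.1f} compute units")
```

So a 10,000-listing run would take on the order of 100 minutes and roughly 4 compute units under those assumptions.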
There are lots of new features on the roadmap and I am always open to new ideas. Please don’t hesitate to contact me if you have any feedback, feature requests, or totally new ideas that might be interesting to implement.
P.S. You should always use a proxy to get the best results.
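On the Apify platform, a proxy is typically enabled through the proxyConfiguration input field. A minimal sketch, assuming this Actor exposes the standard field name:

```json
{
  "proxyConfiguration": {
    "useApifyProxy": true
  }
}
```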