Get data from Google Places not provided by the official API

Use our Google Places crawler to scrape data from Google Maps. Our Google Places crawler lets you get much more information from Google Maps than the official API provides.


Every now and then, someone wants to get data from Google Places using an Apify web scraping tool. In the past, when we started working on such a project, we always suggested that the user try the Google Places API instead. But most users replied that they needed information that the API does not provide. So we decided to come up with a generic solution.

The result is Google Maps Scraper, an Actor that crawls Google Maps. The Actor fills out a search query in Google Maps search and goes through each item in the result. The output item for a single place contains:

  • Basic information
  • Popular times histogram
  • Photos
  • Reviews
3 screenshots of a "pubs near Prague 2" search result on Google Maps with photos and reviews of the restaurant

The benefit of using an Actor is that you can get all user reviews, all place photos, and the popular times histogram. With the Google Places API, you can only get 5 user reviews and 10 place photos, and the popular times histogram is not included in the data at all.
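To make the comparison concrete, here is a minimal sketch of what a single output item might look like. The field names and values below are illustrative assumptions, not the Actor's exact output schema:

```python
# Sketch of one place item from the Actor's dataset.
# Field names here are assumptions for illustration only.
sample_place = {
    "title": "U Fleku",
    "address": "Kremencova 11, Prague 2",
    "rating": 4.4,
    # Popular times histogram: not available via the Places API at all.
    "popularTimesHistogram": {"Mon": [{"hour": 18, "occupancyPercent": 65}]},
    # All photos and all reviews, not capped at 10 and 5 like the API.
    "imageUrls": ["https://example.com/photo1.jpg"],
    "reviews": [{"text": "Great beer!", "stars": 5}],
}

print(sorted(sample_place.keys()))
```

The nested arrays (`imageUrls`, `reviews`) can grow to any length, which is exactly what the official API caps.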

How to use the Actor

The best thing about Apify public Actors is that you can use them for free! Start off on the Actor library page, where you can click on the Try Actor button (if you’re not an Apify user, create an account first). You will be redirected to the Actor run console page, where you can specify which places you want to get data about.

screenshot of a filled out input schema for Google Maps Scraper

The Actor provides the following configuration options. Search specifies the search string used to find places, e.g. pubs in our example. In the Proxy configuration, you can set up any proxy IPs you want the crawler to use. As this option is for advanced Apify users (read more about Apify Proxy here), let's just choose Apify proxy (automatic) for our example.

If you want to search for places in a specific location, you can use Viewport point latitude, Viewport point longitude, and Viewport zoom level. For example, say we want to search for pubs in Berlin. We first have to find the latitude and longitude for Berlin (you can use this handy lat and long converter for that). The converter gives us 52.520008 for latitude and 13.404954 for longitude, so let's enter these values into the configuration. Viewport zoom level sets the zoom level of the map viewport. It is a number from 1 to 20, where 20 means the viewport is fully zoomed in. In our example, we'll use 14, which, combined with the selected latitude and longitude, gives us a viewport covering the whole Berlin city center. You can check the selected viewport on Google Maps.

The last configuration option is Max crawled places, which limits the number of results. There are hundreds of pubs in Berlin, so we'll limit our results to 50 pubs at most.
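Putting the walkthrough together, the Actor input for our Berlin pubs example could be sketched as the JSON below. The key names (`searchString`, `proxyConfig`, `lat`, `lng`, `zoom`, `maxCrawledPlaces`) are assumptions based on the options described above; check the Actor's input schema on its library page for the exact names:

```python
import json

# Assumed input for the "pubs in Berlin" example from the text.
# Key names are illustrative; consult the Actor's input schema.
actor_input = {
    "searchString": "pubs",                    # the Search option
    "proxyConfig": {"useApifyProxy": True},    # Apify proxy (automatic)
    "lat": "52.520008",                        # Viewport point latitude
    "lng": "13.404954",                        # Viewport point longitude
    "zoom": 14,                                # 1-20; 14 covers the city center
    "maxCrawledPlaces": 50,                    # Max crawled places
}

print(json.dumps(actor_input, indent=2))
```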

Then run the Actor using the Run button under the configuration. After a few minutes, you should see the results in an Apify dataset, which you can find in the tabs menu on the Actor run page.
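Instead of the Run button, you can also start a run programmatically through the Apify API's `/v2/acts/{actorId}/runs` endpoint. The sketch below only builds the request URL; the Actor ID is an assumption (check the Actor's library page), and an actual call requires your API token:

```python
# Sketch of starting the Actor run via the Apify API instead of the console.
# ACTOR_ID is an assumption; look it up on the Actor's library page.
API_TOKEN = ""  # fill in your Apify API token
ACTOR_ID = "drobnikj~crawler-google-places"

run_url = f"https://api.apify.com/v2/acts/{ACTOR_ID}/runs?token={API_TOKEN}"
print(run_url)

# With a real token, you could then start the run, e.g. with requests:
# import requests
# response = requests.post(run_url, json={"searchString": "pubs"})
```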

screenshot of a successful run of the actor with multiple export options

The best export format for the results is JSON, because each result contains a few nested arrays, and these can make table formats (HTML, CSV, Excel) unreadable.
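If you do need a flat table, a common workaround is to expand one nested array into its own set of rows. A minimal sketch, using illustrative field names rather than the Actor's exact schema:

```python
# Two hypothetical place items with nested review arrays (field names
# are assumptions for illustration).
places = [
    {"title": "Pub A", "reviews": [{"stars": 5}, {"stars": 3}]},
    {"title": "Pub B", "reviews": [{"stars": 4}]},
]

# Flatten: one row per review, tagged with its place title,
# which fits CSV/Excel without losing the nested data.
review_rows = [
    {"place": p["title"], "stars": r["stars"]}
    for p in places
    for r in p["reviews"]
]

print(review_rows)  # three rows: two for Pub A, one for Pub B
```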

And that’s it! We’d love to hear feedback from you. If you’ve found any additional data you want to get from Google Places or any issues, you can submit an issue on the Actor GitHub repo.


Jakub Drobník
Full-stack developer from the early days at Apify, involved in almost all Apify platform projects since then. Currently switching between work and travelling around the globe on a biweekly basis.
