Google Places API limits (and how to overcome them using geolocation)

Your extended go-to guide on how to overcome Google Places API limits by scraping Google Maps with geolocation.


Need to scrape more than 120 places on Google Maps in a large city? What about hundreds of thousands or even millions of places across a whole country or even a continent? Is there a tool able to do that? It's all possible with the right Google Maps API – and it's certainly not the official one.

This guide to using Google Maps Scraper 🔗 and Google Maps Data Extractor 🔗 will show you how to overcome the limits of extracting data from Google Maps.

No time for reading today? Video guide to the rescue:

How to overcome the Google Places API limit of 120 places (video guide)

⛔ What are Google Maps limitations for scraping?

🔖
The Google Maps website has a limit of displaying at most 120 results per map for users (even though there might be more). The website does this for UX reasons – seeing an entire map full of overlapping pins might be overwhelming.

Despite its benefits for users, this UX decision has a direct impact on scraping Google Maps since, in effect, it means hiding the actual number of places until you zoom in to a certain level.

Let's take New York City's restaurants as an example to showcase how the Google Maps website applies this limit. When browsing the Google Maps website as a user, you might notice a peculiar thing – no matter how many places there are in a city, at some point, you will run out of places that are displayed. The helpful sidebar on the left will say You've reached the end of the list.

The scrollbar on the left always stops at 120 places 🤔

If you count how many items are actually on this list, you will always end up with 120 at most. The same thing will happen if you try to search an even bigger area (a state, for example) – 120 places displayed max.

But there are definitely more than 120 restaurants in New York City or New York State. This display limitation of Google Maps is there for a reason (better UX), but it's not a limit you want when web scraping that data. So how do we overcome it?

To overcome the Google Maps limit of 120 places, we created Google Maps Scraper 🔗. This tool can double as both a scraper and an unofficial Google Maps API or geolocation API. It can find or scrape hundreds of thousands of places at a time, leaving no stone unturned. Let's explore how it works.

🌍 How to overcome Google Places API limits

As it has become increasingly popular, Apify's Google Maps API has evolved to answer the demand for quality Google Maps scraping in many ways. We now have at least four different ways to scrape the same location: by creating a Custom geolocation or by using Location parameters. But that comes at a price: there's a bit of a learning curve to each of these methods. So let's learn them.

🗺

1. Create a custom area by using pairs of coordinates 📡

This is a more complex but also the most autonomous way to use Google Maps Scraper. Defining the area by geolocation will give you full control over the map area, no matter how big or small. Here are a few interesting shapes you can create using the Custom geolocation section of Google Maps Scraper:

⭕️ Circle

If you don't care about scraping places on the outskirts, a circle is your best option.

This is where we dive into a bit of the geometry-meets-JavaScript part. A circle feature is very useful for scraping places in specific, typically circular and dense areas such as the city center.

First, define the center of your circle on Google Maps (longitude and latitude):

… then add your circle coordinates into the Custom search area field. Don't forget to pick the radius in kilometers for how far your circle should reach.

Now add those circle parameters in the Custom search area field
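For reference, here is a minimal sketch of what such a circle definition could look like when pasted into the Custom search area field. It uses GeoJSON-style notation (longitude first, then latitude); the coordinates are illustrative values for Midtown Manhattan, and the name of the radius property is an assumption – check the scraper's readme for the exact format it expects.

```json
{
  "type": "Point",
  "coordinates": [-73.9857, 40.7484],
  "radiusKm": 3
}
```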

💠 Polygon

A polygon is perfect for scraping places from custom or unusual shapes (a district, a peninsula, or an island, for example). It can have as many points as you need (triangle, square, pentagon, hexagon, etc.).

Don't be intimidated – a polygon uses the same principle for interacting with Google Maps geolocation as the circle does. It just means you will have to add more than one coordinate pair.

To form a freeform polygon, just pick a few different points on the map (at least three) and take note of their coordinates. You can find any pair of coordinates by clicking anywhere on the map:

Click anywhere on the map to reveal the coordinates

You can add as many points as you want to define your area in the most accurate shape possible. We're going to go with a pentagon shape. Now copy and paste each coordinate pair in sequence into the Custom search area field. This is what creating a pentagon for part of Manhattan would look like:

🔖
There are only 3 rules on how to paste coordinates to scrape Google Maps:

1. They have to come in pairs (one longitude–latitude pair per point).
2. The first pair and the last pair of coordinates have to be the same (thus completing the shape of the area).
3. They have to be reversed compared to the coordinates on the Google Maps website: the first field must be longitude ↕️, the second field latitude ↔️.

If in doubt, use the readme 🔗 as your formatting guide.
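To make these rules concrete, here's a hedged sketch of what a five-point (pentagon) polygon could look like in the Custom search area field, using GeoJSON-style notation. The coordinate values are rough, illustrative points around Lower Manhattan – note that longitude comes first in each pair and that the first and last pairs are identical to close the shape.

```json
{
  "type": "Polygon",
  "coordinates": [
    [
      [-74.020, 40.700],
      [-73.970, 40.710],
      [-73.965, 40.755],
      [-74.000, 40.770],
      [-74.020, 40.740],
      [-74.020, 40.700]
    ]
  ]
}
```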

Now click Start ▷ to begin extracting place data. Of course, we could add even more points to this shape to make it more accurate for the island, but you get the gist. You can choose to scrape Google Maps using polygons if your priority is speed, consuming fewer platform credits, or scraping a specific area with unusual geometry.

💠💠 Multipolygon

The most complicated and most flexible option in custom geolocation is the multipolygon. It's not a scraping option for everyone, but it can definitely solve a few headaches.

It's best used for areas that are difficult to fit into one polygon, or areas that are geographically far apart but still have to be scraped together (for instance, an island together with the mainland).

For instance, this multipolygon includes two polygons, since that seems to be the only way to encompass all of Florida. Don't worry: even though you, as a user, see no Google Maps pins displayed on this map, Google Maps Scraper will find them all.

Two pentagons to scrape all places in Florida

So you do the same thing as with polygons, but two or more times. Here's how you paste the coordinates of a multipolygon (combining two or more polygon shapes) into Google Maps Scraper. The three rules above on how to insert coordinates apply to multipolygons as well.

Creating a multipolygon to scrape areas with complex shapes
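As a rough sketch, a multipolygon simply nests two or more polygon coordinate lists inside one GeoJSON-style object. The values below are illustrative placeholders for two separate areas (for example, a mainland and an island group); each ring still closes by repeating its first coordinate pair.

```json
{
  "type": "MultiPolygon",
  "coordinates": [
    [
      [
        [-87.6, 31.0],
        [-80.0, 31.0],
        [-80.0, 25.0],
        [-87.6, 25.0],
        [-87.6, 31.0]
      ]
    ],
    [
      [
        [-82.0, 24.4],
        [-80.2, 24.4],
        [-80.2, 25.2],
        [-82.0, 25.2],
        [-82.0, 24.4]
      ]
    ]
  ]
}
```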

2. Choose the location using regular toponymy 📍

This method of Google Places scraping is a better choice if you don't need to customize the area you're scraping. The area is already defined by its name – a city, a region, a ZIP code area, or even an entire country.

🌆 Where + What to scrape (search term + city)

Perfect for scraping trials and fast results.

This method is much easier than the geolocation settings. Just indicate what you are searching for (restaurants) and where (New York). Then hit Start ▷ and the scraper will do all the work and extract every place that matches your search.

Where + What to scrape (city + search term) example. You can also add how many places you want to scrape
🔖
There are only 3 rules for scraping Google Maps with the 📍 location + 🔍 search term combo:

1. There must be only one location per search. You can have multiple search terms, but there has to be only one location for all of them.
2. Don't put the location and search term into the same field. Keep them separate. The scraper will still work, but it won't overcome the limit of 120 places per map.
3. Don't use only a location (without a search term). Otherwise, you will scrape every possible place in that area instead of a specific category of places.
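If you run the scraper programmatically rather than through the form, the same setup can be expressed as JSON input. The sketch below follows the rules above (multiple search terms, a single location, and an optional cap on results); the field names are based on the scraper's input form and should be treated as illustrative – the Actor's readme and input schema are the authoritative reference.

```json
{
  "searchStringsArray": ["restaurant", "cafe"],
  "locationQuery": "New York, USA",
  "maxCrawledPlacesPerSearch": 500
}
```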

And regarding the search term, the 🔍 Search term can be anything. Types of places: restaurants, cafes, vegan museums, gas stations, pubs, bakeries, ATMs – whatever category of place you're searching for. Names of places: Starbucks, Zara, Pizza Hut. Here are just a few examples of what you can put in the Search term:

Examples of search terms for Google Maps scraping

Remember that you can also use search terms in languages other than English 😉

📮 Where + What to scrape (search term + extended address)

This method is good for scraping areas by zip code or smaller cities and towns.

For this method, you define the location with the Country, State, US County (if applicable), City, and Postal code of the area that interests you. In our case, we're scraping all restaurants in Syracuse by filling in the city's details. The more details, the better the results. And don't forget to fill out the 🔍 Search term as well.

Just a more detailed version of setting up a known location

For the most accurate search, fill out as many fields as you can in this section. The scraper will create map grids and adjust the zoom level in the background; no need to set that up. Just hit the Start ▷ button.
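For comparison, here's a hedged sketch of the extended-address version of the same input, with keys mirroring the form labels (Country, State, County, City, Postal code). These key names and example values are illustrative assumptions – again, check the Actor's readme and input schema for the exact names.

```json
{
  "searchStringsArray": ["restaurant"],
  "country": "United States",
  "state": "New York",
  "county": "Onondaga",
  "city": "Syracuse",
  "postalCode": "13202"
}
```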

You can also use Categories instead of Search terms. In English, there are 2,500+ official Google Maps place categories to choose from.
🔖
Bonus: if you're scraping in English, you can also replace the 🔍 Search term with 🏛 Categories. Chances are, Google Maps already has your search term listed in its collection of 2,500+ categories, either one or two words long.

🕵️‍♂️ Understanding how Google Maps Scraper works

If you really want to know the backstory of how this scraper works its magic, prepare for a bit of a long read.

Creating map grids 🗺

There's no magic here, of course. What Google Maps Scraper 🔗 will do is split the large map into many smaller maps. It will then search or scrape each small map separately and combine the end results.

How the scraper splits a bigger map into a map grid

Why would the scraper do this, you ask? And since each small map seems empty from this distance (no pins visible 👀), would there be any results at all? To find the answer, we need to do some more detective work on the Google Maps website.

💠
Can you scrape Google Places using Google Maps URLs? Yes, even with this scraper. But the URL method won't let you overcome the limit of 120 results per map. See this tutorial to learn more ➡️

Automatic zooming on Google Maps 🔎

If you zoom out far enough, you won't see many places on the map, only city and state names. The further you zoom out, the fewer places (represented by Google Maps pins 📍) you can see, and the lower the zoom level is.

This is what Zoom 5 on Google Maps looks like

Zoom level 1 on Google Maps = the whole planet; 21 = a tiny street. You can always check the zoom level in the URL (it appears as a number followed by z, e.g. 5z). For instance, to display the whole East Coast, Google Maps uses zoom level 5. The more you zoom in, the higher the level, the more detailed the map, and the more places it displays.

For our New York City example, compare the number of pins at zoom level 16 (closer) vs. zoom level 15 (further). Can you see how many places are missing in the latter?

This zoom difference influences your API results as well because – you've guessed it – the 120-place limit applies no matter the zoom level. So how does the scraper deal with this issue?

The scraper will split the NYC map into a number of smaller, more detailed mini-maps. For each mini-map, the zoom level will be adjusted automatically to the optimal, most results-rich level (usually 16), so the mini-maps won't be empty. This way the scraper will be able to find and scrape every single place on a small map. It's important to note that the scraper does allow you to override the standard zoom level and get even more results (at the cost of higher compute unit usage).

Finally, the scraper adds up and stitches together the results of all the mini-maps. That way we can get accurate map data for the whole city of New York, which makes this method the best for large-scale scraping.

However, this method's advantage comes at a cost: while thorough, map grids can be quite slow and consume a lot of credits. There might be very few restaurants on a mini-map, far fewer than 120, so why does it take so long to scrape them? The reason is that every small map is a separate page the scraper has to open. That's one extra request for the scraper to make, hence extra work, hence extra consumed credits.

🪄
To recap, here's how the scraper is able to find all places on Google Maps:

1. You provide the scraper with a search area: either a set of coordinates or a location such as a country, city, state, county, or postal code.
2. The scraper splits this map into smaller maps.
3. It also automatically adjusts the zoom level on each mini-map to reach the maximum density of pins.
4. The scraper extracts the results from each mini-map.
5. The scraper combines results from all mini-maps.

🛠 Extra feature for scraping large areas efficiently

There used to be one core issue when trying to scrape large areas: empty or sparsely populated territories. Be it a natural park, a body of water, or even a desert with obviously no restaurants, hospitals, or other public places, the scraper would try to find them there anyway – wasting its resources. The same happened on the outskirts of cities and towns, where large parks, forests, or industrial areas with no pins are more likely.

That's why we've added a special feature to our scraper – 🏙 Deeper city scrape. Enabling it tells the scraper to skip those territories and focus on the denser parts of the map instead. It will scrape only pins, skipping deserted or natural areas once it reaches them, while focusing on cities and densely populated areas rather than the outskirts.

Enable for efficient, large-scale scraping that will focus only on scraping pins from cities.

So give it a try next time you need to scrape a truly large area with more than 120 places, and let us know how it worked for you.

Google Maps Scraper is a very versatile tool. This tutorial is just the tip of the iceberg ↓
🔖
Discover even more tips and tricks on how to get data from Google Maps: reviews, images, contact details, place IDs, and more!
Natasha Lekh
Crafting content that charms both readers and Google's algorithms: readmes, blogs, and SEO secrets.