5 API throttling best practices worth following

Effectively managing an API provider’s throttling policies (the limits on how many requests you’re allowed to make over a given period) helps you avoid 429 “Too Many Requests” errors, additional costs, and even a temporary or permanent ban from accessing the API.

We’ll help you manage any API provider’s approach to throttling by breaking down several best practices.

Review how the API provider implements throttling 

Every API provider can implement throttling differently, and often in more than one way. 

API providers typically impose rate limits, which cap the number of requests a consumer can make over a short window. They can also limit the number of requests that can be made concurrently, or use quota-based throttling, where the provider sets a request limit over a longer timeframe (e.g., per day or per month).

All this to say, it’s worth evaluating the specific throttling policies each API provider puts in place by reviewing their API documentation.
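Many providers also expose their limits at runtime through response headers. Here’s a minimal sketch of inspecting them; the <code class="blog_inline-code">X-RateLimit-*</code> header names below are a common convention rather than a universal standard, and the URL is a placeholder, so confirm the exact names in your provider’s documentation.

```python
# A minimal sketch of inspecting common rate-limit response headers.
# Header names vary by provider, so check the provider's docs.
import requests

response = requests.get("https://api.example.com/v1/resources")  # placeholder URL

limit = response.headers.get("X-RateLimit-Limit")          # max requests per window
remaining = response.headers.get("X-RateLimit-Remaining")  # requests left in the window
reset = response.headers.get("X-RateLimit-Reset")          # when the window resets

if response.status_code == 429:
    # Many providers tell you how long to back off via Retry-After (in seconds).
    retry_after = response.headers.get("Retry-After")
    print(f"Throttled — retry after {retry_after} seconds")
else:
    print(f"{remaining}/{limit} requests remaining until {reset}")
```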

Make efficient API calls

To avoid making unnecessary API calls and to retrieve as much data as possible from each one, favor endpoints that return more data per request. 

For instance, let’s say that you want to retrieve employees from BambooHR, a widely-used HRIS solution. 

Instead of calling a specific endpoint that returns individual employees, like <code class="blog_inline-code">https://api.bamboohr.com/api/gateway.php/{companyDomain}/v1/employees/{id}/</code>, you can call the endpoint <code class="blog_inline-code">https://api.bamboohr.com/api/gateway.php/{companyDomain}/v1/reports/custom</code> to retrieve a custom report of all the employees.

BambooHR’s API documentation notes that calling the custom report endpoint can be a more efficient way to fetch employees.
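Here’s a minimal sketch of that approach. The field names, the <code class="blog_inline-code">format=JSON</code> query parameter, and the placeholder credentials are illustrative, so verify them against BambooHR’s API documentation before relying on them.

```python
# A minimal sketch of fetching all employees in one call via BambooHR's
# custom report endpoint, instead of one request per employee.
import requests

API_KEY = "your-bamboohr-api-key"       # placeholder
COMPANY_DOMAIN = "your-company-domain"  # placeholder

url = f"https://api.bamboohr.com/api/gateway.php/{COMPANY_DOMAIN}/v1/reports/custom"

payload = {
    "title": "All employees",
    # Example fields; the exact field names depend on your BambooHR configuration.
    "fields": ["firstName", "lastName", "jobTitle", "department"],
}

# BambooHR uses HTTP Basic auth with the API key as the username.
response = requests.post(
    url,
    params={"format": "JSON"},
    json=payload,
    auth=(API_KEY, "x"),
    headers={"Accept": "application/json"},
)
response.raise_for_status()
employees = response.json().get("employees", [])
print(f"Retrieved {len(employees)} employees in a single request")
```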

Related: How to manage rate limits effectively

Establish your own throttling mechanism

Once you determine an API provider’s approach to throttling, you can write custom code that automatically paces your requests so they stay within the provider’s rate limit.

As an example, you can use what’s referred to as the “Token Bucket Algorithm”, where tokens get added to a bucket at a fixed rate. A token gets consumed with every request and once the bucket is empty, no requests can occur until a token gets added. You can learn more about implementing this method by reading this tutorial.
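Here’s a minimal sketch of a token bucket in Python. The rate and capacity values are arbitrary examples; set them to match your provider’s actual limits.

```python
# A minimal sketch of a token bucket rate limiter, allowing `rate` requests
# per second with bursts of up to `capacity` requests.
import threading
import time


class TokenBucket:
    def __init__(self, rate: float, capacity: int):
        self.rate = rate            # tokens added per second
        self.capacity = capacity    # maximum bucket size
        self.tokens = float(capacity)
        self.last_refill = time.monotonic()
        self.lock = threading.Lock()

    def acquire(self) -> None:
        """Block until a token is available, then consume it."""
        while True:
            with self.lock:
                now = time.monotonic()
                # Refill the bucket based on the time elapsed since the last refill.
                elapsed = now - self.last_refill
                self.tokens = min(self.capacity, self.tokens + elapsed * self.rate)
                self.last_refill = now
                if self.tokens >= 1:
                    self.tokens -= 1
                    return
                # Not enough tokens: compute how long until one becomes available.
                wait = (1 - self.tokens) / self.rate
            time.sleep(wait)


# Example usage: allow at most 5 requests per second, bursting up to 10.
bucket = TokenBucket(rate=5, capacity=10)

def call_api(endpoint: str) -> None:
    bucket.acquire()  # blocks if we're ahead of the allowed rate
    # ... make the actual HTTP request here ...
    print(f"Calling {endpoint}")
```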

Use caching to retrieve previous responses

Many types of data change infrequently (e.g., employee fields like first and last names, job titles, departments, and addresses). To avoid making an API request every time you need data in this category, you can cache the API responses and retrieve them from the cache when they're needed. 

You can also call these endpoints periodically to check for changes, and if there are any, refresh the cached data with what’s changed. 
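Here’s a minimal sketch of a time-based (TTL) cache, assuming the data is acceptable to serve for up to a set number of seconds after it was fetched. The <code class="blog_inline-code">fetch_employees</code> helper is a hypothetical stand-in for your actual API call.

```python
# A minimal sketch of a TTL cache that only hits the API when the cached
# copy is missing or stale.
import time

_cache = {}  # key -> (timestamp, value)

def get_cached(key, fetch, ttl_seconds=3600):
    """Return a cached value if it is still fresh; otherwise fetch and cache it."""
    now = time.monotonic()
    entry = _cache.get(key)
    if entry and now - entry[0] < ttl_seconds:
        return entry[1]              # cache hit: no API request made
    value = fetch()                  # cache miss: make one API request
    _cache[key] = (now, value)
    return value


def fetch_employees():
    # Placeholder for the real API call (e.g., the BambooHR report above).
    return [{"firstName": "Jane", "lastName": "Doe"}]


# Subsequent calls within the TTL are served from the cache.
employees = get_cached("employees", fetch_employees, ttl_seconds=6 * 3600)
```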

Related: Best practices for managing API pagination 

Leverage webhooks when possible

Webhooks notify you when a certain event occurs in an application (via a POST request to an endpoint you host), letting you avoid making API requests while still getting the data you need, when you need it.

That said, not all API providers offer webhooks, and some only offer them for certain events, so you’ll need to review their API documentation to confirm what’s supported.
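Here’s a minimal sketch of a webhook receiver using Flask, assuming the provider sends JSON payloads to a URL you register with them. The route, the <code class="blog_inline-code">employee.updated</code> event name, and the payload shape are hypothetical; real providers document their own event types and signature-verification schemes.

```python
# A minimal sketch of a webhook receiver. The event name and payload fields
# are illustrative — consult your provider's webhook documentation.
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/webhooks/hris", methods=["POST"])
def handle_webhook():
    event = request.get_json(silent=True) or {}
    # In production, verify the provider's signature header here before
    # trusting the payload.
    if event.get("type") == "employee.updated":
        # Update your cached copy instead of re-fetching everything.
        print("Employee changed:", event.get("data"))
    # Respond quickly so the provider doesn't retry or time out.
    return jsonify({"received": True}), 200

if __name__ == "__main__":
    app.run(port=8000)
```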

A screenshot of GitHub's webhooks documentation
GitHub, a developer platform, provides extensive documentation on its webhooks, allowing API consumers to confirm which events are supported and learn how to create and troubleshoot webhooks effectively.

Avoid dealing with API throttling for each provider by using Merge

Merge, the leading unified API solution, lets you add hundreds of integrations to your product through a single API. This means you only have to account for Merge’s approach to API throttling to access all the third-party integrations you want and need.

You can learn more about Merge and its throttling policies by scheduling a demo with one of our integration experts.