In modern web applications, controlling the rate of incoming requests to your APIs is essential for maintaining performance and preventing abuse. While fixed window rate limiting is a common approach, this guide focuses on implementing a concurrency rate limiter in ASP.NET Web API, which limits the number of requests the server processes at the same time.
What Is a Concurrency Rate Limiter?
A concurrency rate limiter restricts the number of requests that an API can process at the same time. Unlike time-based rate limiters, it ensures that the server doesn’t exceed a specified number of simultaneous requests, protecting resources from being overwhelmed.
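To make the behavior concrete, here is a minimal standalone sketch using the ConcurrencyLimiter type from the System.Threading.RateLimiting package, the same primitive the ASP.NET Core middleware builds on. The permit and queue values here are illustrative, not a recommendation:

```csharp
using System.Threading.RateLimiting;

var limiter = new ConcurrencyLimiter(new ConcurrencyLimiterOptions
{
    PermitLimit = 1,   // at most one request in flight
    QueueLimit = 0,    // reject rather than queue
    QueueProcessingOrder = QueueProcessingOrder.OldestFirst
});

using RateLimitLease first = await limiter.AcquireAsync();
Console.WriteLine(first.IsAcquired);  // True: the permit was free

using RateLimitLease second = await limiter.AcquireAsync();
Console.WriteLine(second.IsAcquired); // False: limit reached and no queue
```

Once the first lease is disposed, the permit is returned and a new request can acquire it.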
Setting Up the Concurrency Rate Limiter
Here’s how to configure a concurrency rate limiter in ASP.NET Web API:
Step 1: Add the Rate Limiter Middleware
First, ensure you have the required packages installed and update your Program.cs to include the rate limiter:
using System.Threading.RateLimiting;

var builder = WebApplication.CreateBuilder(args);

// Add services to the container
builder.Services.AddControllers();

builder.Services.AddRateLimiter(options =>
{
    // Status code returned when a request is rejected
    options.RejectionStatusCode = StatusCodes.Status429TooManyRequests;

    options.GlobalLimiter = PartitionedRateLimiter.Create<HttpContext, string>(httpContext =>
    {
        // Partition key is based on the host header
        var partitionKey = httpContext.Request.Headers.Host.ToString();

        return RateLimitPartition.GetConcurrencyLimiter(partitionKey, _ =>
            new ConcurrencyLimiterOptions
            {
                PermitLimit = 1, // Maximum concurrent requests allowed
                QueueProcessingOrder = QueueProcessingOrder.OldestFirst,
                QueueLimit = 0 // No queued requests
            });
    });
});

var app = builder.Build();

// Add middleware to the pipeline
app.UseRateLimiter();
app.UseAuthorization();

app.MapControllers();

app.Run();
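A global limiter applies to every endpoint. As an alternative, the same options can be registered as a named policy via AddConcurrencyLimiter and applied only where you enable it; a sketch, where the policy name "strict" is just an example:

```csharp
using System.Threading.RateLimiting;
using Microsoft.AspNetCore.RateLimiting;

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddControllers();

builder.Services.AddRateLimiter(options =>
{
    options.RejectionStatusCode = StatusCodes.Status429TooManyRequests;

    // Named policy; only applies to endpoints that opt in
    options.AddConcurrencyLimiter("strict", o =>
    {
        o.PermitLimit = 1;
        o.QueueLimit = 0;
        o.QueueProcessingOrder = QueueProcessingOrder.OldestFirst;
    });
});

var app = builder.Build();
app.UseRateLimiter();
app.MapControllers();
app.Run();
```

You would then opt an endpoint in with the [EnableRateLimiting("strict")] attribute on a controller or action, or RequireRateLimiting("strict") on a minimal API route.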
Step 2: Test the Implementation
To validate the concurrency rate limiter, create a console application that sends multiple simultaneous requests to your API:
class Program
{
    static async Task Main(string[] args)
    {
        using var client = new HttpClient();

        while (true)
        {
            // Parallel.For does not await async lambdas, so use Task.WhenAll
            // to send five requests concurrently and wait for every response
            var tasks = Enumerable.Range(0, 5).Select(async _ =>
            {
                var response = await client.GetAsync("https://localhost:5001/weatherforecast");
                Console.WriteLine(response.StatusCode);
            });
            await Task.WhenAll(tasks);

            await Task.Delay(1000); // Pause before the next batch of requests
        }
    }
}
When you run this test, you’ll observe both 200 OK and 429 Too Many Requests responses, demonstrating that the concurrency rate limiter is working as intended.
Key Configuration Options
- PermitLimit: Specifies the maximum number of concurrent requests allowed.
- QueueLimit: Defines how many requests may wait in a queue once the concurrency limit is reached; requests beyond the queue limit are rejected.
- QueueProcessingOrder: Determines the order in which to process the requests from the queue (“OldestFirst” is common).
- RejectionStatusCode: The HTTP status code returned when the limit is exceeded (typically 429 Too Many Requests).
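The interplay between PermitLimit and QueueLimit can also be seen with the standalone limiter type. In this sketch (values illustrative), a second request waits in the queue and proceeds only when the first releases its permit:

```csharp
using System.Threading.RateLimiting;

var limiter = new ConcurrencyLimiter(new ConcurrencyLimiterOptions
{
    PermitLimit = 1,                 // one request at a time
    QueueLimit = 1,                  // one request may wait
    QueueProcessingOrder = QueueProcessingOrder.OldestFirst
});

RateLimitLease first = await limiter.AcquireAsync();
var pending = limiter.AcquireAsync();          // queued: the single permit is taken
Console.WriteLine(pending.IsCompleted);        // False while waiting in the queue

first.Dispose();                               // releasing the permit unblocks the queued request
Console.WriteLine((await pending).IsAcquired); // True
```

With QueueLimit = 0, as in the setup above, that second request would instead be rejected immediately with the configured status code.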
Benefits of a Concurrency Rate Limiter
- Resource Protection: Ensures that critical system resources are not overwhelmed by excessive concurrent requests.
- Fair Usage: Provides fair access to API clients by controlling simultaneous requests.
- Improved Performance: Prevents bottlenecks caused by overloading server threads.
Conclusion
Implementing concurrency rate limiting in ASP.NET Web API is a straightforward yet powerful way to ensure your application handles concurrent requests efficiently. By configuring the limiter, you can safeguard your API and maintain a smooth user experience.
A full walkthrough of this exercise is available on my YouTube channel here: https://youtu.be/p30BRMVj6KE?si=YiOKcTnJyTPMagSB
If you found this guide helpful, consider subscribing to our blog for more tutorials on ASP.NET and .NET development!