Rate Limiting Middleware In .NET 7

Introduction

Rate limiting middleware is a software component used to control the rate of requests made to a web application or API. It is designed to prevent resource overuse and abuse of the API, and to ensure fair usage for all users. This type of middleware is added to the application's request pipeline, where it monitors the rate of requests made by a user or system and blocks or throttles requests that exceed a certain threshold.

What is Rate Limiting?

Rate limiting is a technique used to control the rate at which a user or system can access a resource. This is typically done to prevent resource overuse or abuse of an API, or to ensure fair usage for all users. Rate limiting is often implemented by monitoring the number of requests made by a user or system within a specific time interval, and then taking action when the number of requests exceeds a specified threshold. This action can include blocking or throttling the requests.

For example, an API might allow only a certain number of requests per minute from a single user, in order to prevent that user from overwhelming the API with too many requests. Or a website might limit the number of login attempts per hour to prevent brute force attacks. Rate limiting can be implemented at various layers of a system, such as at the server or network level, or at the application level using middleware.
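To make the counting idea concrete, here is a minimal illustrative sketch of a fixed-window counter in C#. The class and member names are made up for this example, and it is not thread-safe or production-ready:

// Illustrative fixed-window counter: track requests per client and
// reject any request beyond the threshold for the current window.
public class SimpleRateLimiter
{
    private readonly int _limit;
    private readonly TimeSpan _window;
    private readonly Dictionary<string, (DateTime WindowStart, int Count)> _counters = new();

    public SimpleRateLimiter(int limit, TimeSpan window)
    {
        _limit = limit;
        _window = window;
    }

    public bool IsAllowed(string clientKey)
    {
        var now = DateTime.UtcNow;

        // Start a new window if this client has no entry or the old window has expired.
        if (!_counters.TryGetValue(clientKey, out var entry) || now - entry.WindowStart >= _window)
        {
            _counters[clientKey] = (now, 1);
            return true;
        }

        // Over the threshold: the caller should block or throttle this request.
        if (entry.Count >= _limit)
            return false;

        _counters[clientKey] = (entry.WindowStart, entry.Count + 1);
        return true;
    }
}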

Rate limiting in ASP.NET Core

In ASP.NET Core, rate limiting can be implemented using middleware. Middleware is a component in the application's request pipeline that handles requests and responses. To implement rate limiting in ASP.NET Core, we can create a custom middleware component that checks the rate of requests for each user and blocks or throttles requests that exceed a certain threshold.
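A custom middleware along those lines could look roughly like the following sketch, which reuses the illustrative SimpleRateLimiter from above (the names and the 20-requests-per-minute figure are assumptions for illustration; this is not the built-in .NET 7 middleware):

public class SimpleRateLimitingMiddleware
{
    // One shared counter for the whole application: 20 requests per minute per client.
    private static readonly SimpleRateLimiter Limiter = new(limit: 20, window: TimeSpan.FromMinutes(1));
    private readonly RequestDelegate _next;

    public SimpleRateLimitingMiddleware(RequestDelegate next) => _next = next;

    public async Task InvokeAsync(HttpContext context)
    {
        // Identify the caller: authenticated username, otherwise the client IP.
        var key = context.User.Identity?.Name
                  ?? context.Connection.RemoteIpAddress?.ToString()
                  ?? "anonymous";

        if (!Limiter.IsAllowed(key))
        {
            context.Response.StatusCode = StatusCodes.Status429TooManyRequests;
            await context.Response.WriteAsync("Too many requests.");
            return;
        }

        await _next(context);
    }
}

// Registered in Program.cs with: app.UseMiddleware<SimpleRateLimitingMiddleware>();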

If your application is using .NET 7 (or higher), a rate limiting middleware is available out of the box. It provides a way to apply rate limiting to your web application and API endpoints.

Here's an example of how to use the built-in rate limiting middleware in ASP.NET Core.

STEP 1

Create an ASP.NET Core Web API project.

STEP 2

Now open the Program.cs file and add a simple rate limiter that limits all requests to 20 per minute, per authenticated username (or hostname if not authenticated):

using System.Threading.RateLimiting;

var builder = WebApplication.CreateBuilder(args);

// 20 requests per minute, keyed by authenticated username (or Host header if anonymous).
builder.Services.AddRateLimiter(options =>
{
    options.GlobalLimiter = PartitionedRateLimiter.Create<HttpContext, string>(httpContext =>
        RateLimitPartition.GetFixedWindowLimiter(
            partitionKey: httpContext.User.Identity?.Name ?? httpContext.Request.Headers.Host.ToString(),
            factory: partition => new FixedWindowRateLimiterOptions
            {
                AutoReplenishment = true,
                PermitLimit = 20,
                QueueLimit = 0,
                Window = TimeSpan.FromMinutes(1)
            }));
});
var app = builder.Build();
app.UseRateLimiter();

Let's walk through the above code.

builder.Services.AddRateLimiter

This method configures and registers the rate limiter service with the application's service container. Once registered, the rate limiter can be used to control access to routes or endpoints, ensuring that they are not overwhelmed by too many requests.

If we want to apply a global rate limiter to all requests, the GlobalLimiter option is set to a PartitionedRateLimiter. In the example above, we created a fixed window limiter and configured it to partition requests per authenticated username (or hostname if not authenticated). The limiter automatically replenishes its permits and allows 20 requests per minute per partition.
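The partition key decides which requests share a limit. As a variation (a sketch, not part of the steps above), you could partition by client IP address instead:

options.GlobalLimiter = PartitionedRateLimiter.Create<HttpContext, string>(httpContext =>
    RateLimitPartition.GetFixedWindowLimiter(
        // Assumption for illustration: one counter per client IP, "unknown" when no IP is available.
        partitionKey: httpContext.Connection.RemoteIpAddress?.ToString() ?? "unknown",
        factory: partition => new FixedWindowRateLimiterOptions
        {
            AutoReplenishment = true,
            PermitLimit = 20,
            QueueLimit = 0,
            Window = TimeSpan.FromMinutes(1)
        }));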

Now run the above code and test it. 

If we hit the API, the response comes back with status code 200. But if we hit the same API more than 20 times within one minute, the 21st request is rejected, and by default the middleware returns status code 503.
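One quick way to observe this is a small console snippet that calls the endpoint in a loop. The URL and route below are assumptions; adjust them to match your project:

// Hypothetical smoke test: fire 25 requests and print each status code.
using var client = new HttpClient();
for (var i = 1; i <= 25; i++)
{
    var response = await client.GetAsync("https://localhost:7001/WeatherForecast/GetWeatherForecast");
    Console.WriteLine($"Request {i}: {(int)response.StatusCode}");
}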

If we want to return a more meaningful status code, we can change the rejection status code in the rate limiter options:

builder.Services.AddRateLimiter(options =>
{
    options.RejectionStatusCode = 429;
    // ... limiter configuration as shown above ...
});

After setting the rejection status code to 429, the rejected request shows the more meaningful response:

Error: Too Many Requests
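If we also want to tell clients how long to wait before retrying, the rate limiter options expose an OnRejected callback. A minimal sketch (the response body text is just an example):

builder.Services.AddRateLimiter(options =>
{
    options.RejectionStatusCode = 429;
    options.OnRejected = async (context, cancellationToken) =>
    {
        // Add a Retry-After header when the limiter can tell us how long to wait.
        if (context.Lease.TryGetMetadata(MetadataName.RetryAfter, out var retryAfter))
        {
            context.HttpContext.Response.Headers.RetryAfter =
                ((int)retryAfter.TotalSeconds).ToString();
        }

        await context.HttpContext.Response.WriteAsync("Error: Too Many Requests", cancellationToken);
    };
    // ... limiter configuration as shown above ...
});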

Types of Rate Limiters

There are several types of rate limiters that can be used to control access to a service or application:

  1. Fixed window rate limiter: This type of rate limiter enforces a fixed number of requests over a fixed time window. For example, a fixed window rate limiter might allow 100 requests per minute.
  2. Sliding window rate limiter: This type of rate limiter also enforces a fixed number of requests, but over a window that slides with time. The window is divided into segments, and as time passes the oldest segment expires and a new one begins, so usage is smoothed out rather than resetting all at once at a fixed boundary. For example, it might allow 100 requests in any rolling one-minute period.
  3. Token bucket rate limiter: This type of rate limiter adds tokens to a bucket at a fixed rate, and each request consumes a token. Because tokens can accumulate up to the bucket's capacity, short bursts above the steady rate are allowed. For example, it might replenish 100 tokens per minute but permit up to 10 requests above that rate in a short burst. (See the sketch after this list for the sliding window and token bucket limiters built into .NET 7.)
  4. Leaky bucket rate limiter: This type of rate limiter queues incoming requests in a bucket and "leaks" (processes) them at a constant rate; requests that arrive when the bucket is full are rejected.
  5. IP rate limiter: This type of rate limiter enforces a rate limit based on the IP address of the client making the request.
  6. User rate limiter: This type of rate limiter enforces a rate limit based on the user or client making the request.
  7. Combination of the above: You can also combine these rate limiters to control access to your service or application.
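.NET 7 ships fixed window, sliding window, token bucket, and concurrency limiters out of the box. Here is a short sketch of registering a sliding window policy and a token bucket policy; the policy names and numbers are illustrative, not part of the project above:

builder.Services.AddRateLimiter(options =>
{
    // Sliding window: 100 permits per minute, tracked across 6 ten-second segments.
    options.AddSlidingWindowLimiter("sliding", limiterOptions =>
    {
        limiterOptions.PermitLimit = 100;
        limiterOptions.Window = TimeSpan.FromMinutes(1);
        limiterOptions.SegmentsPerWindow = 6;
    });

    // Token bucket: at most 100 tokens, refilled with 20 tokens every 10 seconds.
    options.AddTokenBucketLimiter("token", limiterOptions =>
    {
        limiterOptions.TokenLimit = 100;
        limiterOptions.TokensPerPeriod = 20;
        limiterOptions.ReplenishmentPeriod = TimeSpan.FromSeconds(10);
    });
});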

Rate Limit groups of endpoints 

So far we have applied a global limit to all requests. But we may want to set a different rate limit for different endpoints.

Below is an example configuration adding two fixed window limiters with different settings and different policy names ("Api" and "Web").

// Requires: using Microsoft.AspNetCore.RateLimiting; (for AddFixedWindowLimiter)
builder.Services.AddRateLimiter(options =>
{
    options.RejectionStatusCode = 429;

    // "Api" policy: 10 requests per minute.
    options.AddFixedWindowLimiter("Api", limiterOptions =>
    {
        limiterOptions.AutoReplenishment = true;
        limiterOptions.PermitLimit = 10;
        limiterOptions.Window = TimeSpan.FromMinutes(1);
    });

    // "Web" policy: 5 requests per minute.
    options.AddFixedWindowLimiter("Web", limiterOptions =>
    {
        limiterOptions.AutoReplenishment = true;
        limiterOptions.PermitLimit = 5;
        limiterOptions.Window = TimeSpan.FromMinutes(1);
    });
});

In the API controller code, we apply these policies with the [EnableRateLimiting] attribute (from the Microsoft.AspNetCore.RateLimiting namespace):

[EnableRateLimiting("Api")]
[HttpGet("GetWeatherForecast")]
public IEnumerable < WeatherForecast > Get() {
        return Enumerable.Range(1, 5).Select(index => new WeatherForecast {
            Date = DateOnly.FromDateTime(DateTime.Now.AddDays(index)),
                TemperatureC = Random.Shared.Next(-20, 55),
                Summary = Summaries[Random.Shared.Next(Summaries.Length)]
        }).ToArray();
    }
    [EnableRateLimiting("Web")]
    [HttpGet("GetWeatherForecast1")]
public IEnumerable < WeatherForecast > Get1() {
    return Enumerable.Range(1, 5).Select(index => new WeatherForecast {
        Date = DateOnly.FromDateTime(DateTime.Now.AddDays(index)),
            TemperatureC = Random.Shared.Next(-20, 55),
            Summary = Summaries[Random.Shared.Next(Summaries.Length)]
    }).ToArray();
}
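Named policies can also be attached to whole groups of endpoints rather than to individual actions. As a sketch with minimal APIs (the /api group and /ping endpoint are made up for illustration):

// Every endpoint mapped under this group shares the "Api" policy.
var apiGroup = app.MapGroup("/api").RequireRateLimiting("Api");
apiGroup.MapGet("/ping", () => Results.Ok("pong"));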

If you want to disable rate limiting for a particular action, just add the [DisableRateLimiting] attribute on top of that action.
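For example, a hypothetical action (not part of the controller above) that should never be throttled:

// This endpoint opts out of any rate limiting policy applied elsewhere.
[DisableRateLimiting]
[HttpGet("GetServerTime")]
public IActionResult GetServerTime() => Ok(DateTime.UtcNow);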

Conclusion 

In conclusion, rate limiting is a technique used to control the rate of requests made to a server or service. It is used to prevent abuse, protect resources, and ensure that the service remains available and responsive for legitimate users. There are several different algorithms and techniques for implementing rate limiting, and the right choice depends on the use case and the goals of the service. Overall, rate limiting is an important method for maintaining the performance and stability of a service in the face of high traffic or malicious activity.

