Implementing Rate Limiting in .NET Core Web API using .NET 7.0

Introduction

Rate limiting is a crucial aspect of building scalable and secure web APIs. It helps prevent abuse, protects server resources, and ensures fair usage for all consumers. In this article, we will explore how to implement rate limiting in a .NET Core Web API application. We'll cover the fundamentals of rate limiting, discuss various strategies, and provide practical examples to demonstrate its implementation.

What is Rate Limiting?

Rate limiting is a technique that restricts the number of API requests a client can make within a specific time period. It establishes boundaries on how frequently clients can access certain endpoints or resources. By enforcing these limits, developers can prevent abuse, distribute server load evenly, and maintain a high-quality user experience.
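
To make the idea concrete, here is a minimal, hypothetical sketch of a fixed-window counter, one of the simplest rate-limiting strategies. This is an illustration only, not production code; the AspNetCoreRateLimit library used later in this article handles storage, window expiry, and concurrency for you.

```csharp
using System;
using System.Collections.Generic;

// Illustration only: a naive fixed-window rate limiter keyed by client identity
// (for example, the client's IP address). Not thread-safe or production-ready.
public class FixedWindowLimiter
{
    private readonly int _limit;        // max requests allowed per window
    private readonly TimeSpan _window;  // window length, e.g. 1 minute
    private readonly Dictionary<string, (DateTime WindowStart, int Count)> _counters = new();

    public FixedWindowLimiter(int limit, TimeSpan window)
    {
        _limit = limit;
        _window = window;
    }

    public bool IsAllowed(string clientKey)
    {
        var now = DateTime.UtcNow;

        // No entry yet, or the previous window has expired: start a fresh window.
        if (!_counters.TryGetValue(clientKey, out var entry) || now - entry.WindowStart >= _window)
        {
            _counters[clientKey] = (now, 1);
            return true;
        }

        // Over the limit within the current window: caller should return HTTP 429.
        if (entry.Count >= _limit)
            return false;

        _counters[clientKey] = (entry.WindowStart, entry.Count + 1);
        return true;
    }
}
```

With a limit of 10 per minute, the first 10 calls to IsAllowed for a given key return true and the 11th returns false until the window rolls over.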

Implementing Rate Limiting in .NET Core Web API

To implement rate limiting in a .NET Core Web API application, we can leverage the AspNetCoreRateLimit library, which offers a robust and flexible solution. Let's dive into the step-by-step process.

Step 1. Install the AspNetCoreRateLimit NuGet Package: Start by installing the AspNetCoreRateLimit package from NuGet. You can use the NuGet Package Manager in Visual Studio or run the following command in the Package Manager Console: Install-Package AspNetCoreRateLimit.

Step 2. Configure Rate Limiting Middleware: In the Program.cs file, add the following code to register the rate-limiting services.

builder.Services.AddMemoryCache();
builder.Services.Configure<IpRateLimitOptions>(builder.Configuration.GetSection("IpRateLimiting"));
builder.Services.AddSingleton<IIpPolicyStore, MemoryCacheIpPolicyStore>();
builder.Services.AddSingleton<IRateLimitCounterStore, MemoryCacheRateLimitCounterStore>();
builder.Services.AddSingleton<IRateLimitConfiguration, RateLimitConfiguration>();
builder.Services.AddInMemoryRateLimiting();

Step 3. Define Rate Limit Policies: In the appsettings.json file, add the following configuration to define rate limit policies.

"IpRateLimiting": {
  "EnableEndpointRateLimiting": true,
  "StackBlockedRequests": false,
  "RealIpHeader": "X-Real-IP",
  "ClientIdHeader": "X-ClientId",
  "HttpStatusCode": 429,
  "IpWhitelist": [],
  "EndpointWhitelist": [],
  "GeneralRules": [{
    "Endpoint": "GET:/WeatherForecast",
    "Period": "1m",
    "Limit": 10
  }]
}

By setting a general rule, we establish a default rate limit policy for the API's endpoints. In this example, each client IP may make at most 10 GET requests per minute to the /WeatherForecast endpoint. This helps prevent excessive requests from any client or IP address and ensures fair usage and resource allocation. You can define additional rules and customize them according to your application's requirements.
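For instance, you can add further entries to the GeneralRules array, such as a catch-all rule that covers every endpoint and verb. The values below are purely illustrative; pick limits that match your own traffic patterns (note that appsettings.json in ASP.NET Core permits comments).

```json
"GeneralRules": [
  {
    "Endpoint": "GET:/WeatherForecast",
    "Period": "1m",
    "Limit": 10
  },
  {
    "Endpoint": "*",
    "Period": "1h",
    "Limit": 1000
  }
]
```

The "*" pattern applies to all endpoints and HTTP methods, so the two rules together cap /WeatherForecast at 10 GETs per minute and everything else at 1000 requests per hour per IP.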

Step 4. Enable Rate Limiting Middleware: In the Program.cs file, add the following code to enable the rate-limiting middleware. Register it early in the request pipeline so that blocked requests are rejected before they reach your controllers: app.UseIpRateLimiting();

Full Program.cs file code

using AspNetCoreRateLimit;

var builder = WebApplication.CreateBuilder(args);

// Add services to the container.
builder.Services.AddControllers();
builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen();

// Register rate-limiting services
builder.Services.AddMemoryCache();
builder.Services.Configure<IpRateLimitOptions>(builder.Configuration.GetSection("IpRateLimiting"));
builder.Services.AddSingleton<IIpPolicyStore, MemoryCacheIpPolicyStore>();
builder.Services.AddSingleton<IRateLimitCounterStore, MemoryCacheRateLimitCounterStore>();
builder.Services.AddSingleton<IRateLimitConfiguration, RateLimitConfiguration>();
builder.Services.AddInMemoryRateLimiting();

var app = builder.Build();
// Configure the HTTP request pipeline.
if (app.Environment.IsDevelopment()) {
  app.UseSwagger();
  app.UseSwaggerUI();
}
app.UseIpRateLimiting();
app.UseRouting();
app.UseHttpsRedirection();
app.UseAuthorization();
app.MapControllers();
app.Run();

Step 5. Test Rate Limiting

Now, we can test the rate-limiting implementation by making API requests. If a client exceeds the defined limit, it receives an HTTP 429 (Too Many Requests) response with the message "API calls quota exceeded! maximum admitted 10 per 1m."
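
As a quick sanity check, a small console snippet like the following can fire off more requests than the limit allows and print each status code. The base URL and port are assumptions; use whatever address your application actually runs on.

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

// Hypothetical smoke test: send 12 requests to an endpoint limited to 10 per minute.
// With the configuration above, the first 10 should return 200 OK and the rest 429.
class RateLimitSmokeTest
{
    static async Task Main()
    {
        // Adjust the base address to match your launch profile.
        using var client = new HttpClient { BaseAddress = new Uri("https://localhost:7001") };

        for (int i = 1; i <= 12; i++)
        {
            var response = await client.GetAsync("/WeatherForecast");
            Console.WriteLine($"Request {i}: {(int)response.StatusCode} {response.StatusCode}");
        }
    }
}
```

Successful responses should also carry rate-limit headers such as X-Rate-Limit-Remaining, which show how many calls are left in the current window.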

Pros of Rate Limiting

  • Rate limiting helps protect your API from abusive behavior, such as excessive requests or denial-of-service attacks.
  • By limiting the number of requests per client or IP address, rate limiting helps prevent resource exhaustion and safeguards server performance.
  • By restricting the number of login attempts or requests to sensitive endpoints, rate limiting helps mitigate the risk of unauthorized access or data breaches.
  • Rate limiting allows API providers to implement different tiers or pricing plans based on usage. By setting different rate limits for various subscription levels or pricing tiers, you can offer differentiated services and monetize your API effectively.
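AspNetCoreRateLimit supports this tiering through client-based rules: alongside IpRateLimiting, you can add a ClientRateLimiting section keyed by the ClientIdHeader and then override the defaults per client ID in ClientRateLimitPolicies. The client IDs and numbers below are purely illustrative.

```json
"ClientRateLimiting": {
  "EnableEndpointRateLimiting": true,
  "ClientIdHeader": "X-ClientId",
  "HttpStatusCode": 429,
  "GeneralRules": [
    { "Endpoint": "*", "Period": "1m", "Limit": 10 }
  ]
},
"ClientRateLimitPolicies": {
  "ClientRules": [
    {
      "ClientId": "premium-client",
      "Rules": [
        { "Endpoint": "*", "Period": "1m", "Limit": 100 }
      ]
    }
  ]
}
```

Using client rules also requires registering the corresponding services (for example, Configure&lt;ClientRateLimitOptions&gt; and a client policy store) and calling app.UseClientRateLimiting() in the pipeline.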

Cons of Rate Limiting

  • Implementing rate limiting requires careful configuration and management. Determining the optimal rate limits, differentiating between client types, and handling edge cases can introduce complexity and additional overhead to the development and maintenance process.
  • In some cases, rate limiting can inadvertently affect users who require higher-than-average request rates. Stricter rate limits or misconfigurations may lead to users hitting rate limits and experiencing disruptions or reduced functionality.
  • Rate limiting introduces an additional layer of processing and checks for each incoming request, which can impact the overall performance of your API.

Conclusion

Rate limiting is a vital mechanism for protecting your web API and ensuring its stability and availability. In this article, we explored how to implement rate limiting in a .NET Core Web API application using the AspNetCoreRateLimit library. By following the step-by-step process outlined above, you can enforce rate limits on API endpoints and prevent abuse or excessive usage.
