Rate Limiting Algorithms in .NET Core

Introduction

Rate limiting is a fundamental strategy for maintaining the stability and security of applications. It plays a crucial role in preventing abuse, protecting resources, and ensuring fair usage of APIs or services. In this comprehensive guide, we’ll explore various rate limiting algorithms in the context of .NET Core, providing code snippets for implementing

  1. Token Bucket,
  2. Sliding Window,
  3. Fixed Window,
  4. and Concurrency limiters.

1. Token Bucket Algorithm

The Token Bucket algorithm is a versatile approach to rate limiting. A bucket holds up to a fixed number of tokens, and tokens are replenished at a steady rate. Each request consumes one token; when the bucket is empty, requests are rejected until enough tokens have been refilled. This allows short bursts up to the bucket's capacity while still enforcing a steady average rate.

Let’s implement a Token Bucket rate limiter in .NET Core:

public class TokenBucketRateLimiter
{
    private readonly int capacity;
    private readonly double refillRatePerSecond;
    private double tokens;
    private DateTime lastRefill;
    private readonly object gate = new object();

    public TokenBucketRateLimiter(int capacity, double refillRatePerSecond)
    {
        this.capacity = capacity;
        this.refillRatePerSecond = refillRatePerSecond;
        this.tokens = capacity;
        this.lastRefill = DateTime.UtcNow;
    }

    public bool TryConsume()
    {
        lock (gate)
        {
            Refill();

            if (tokens >= 1)
            {
                tokens -= 1;
                return true;
            }
            return false;
        }
    }

    private void Refill()
    {
        var now = DateTime.UtcNow;

        // Add tokens in proportion to the elapsed time, never exceeding the bucket capacity.
        tokens = Math.Min(capacity, tokens + (now - lastRefill).TotalSeconds * refillRatePerSecond);
        lastRefill = now;
    }
}

Usage example:

var rateLimiter = new TokenBucketRateLimiter(capacity: 5, refillRatePerSecond: 1);

if (rateLimiter.TryConsume())
{
    Console.WriteLine("Request processed successfully");
}
else
{
    Console.WriteLine("Rate limit exceeded");
}
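
To make the refill behaviour visible, here is a small console sketch (assuming the illustrative parameters used above: a capacity of 5 tokens refilled at 1 token per second). It drains the bucket, waits, and then tries again:

var limiter = new TokenBucketRateLimiter(capacity: 5, refillRatePerSecond: 1);

// Drain the bucket: the first 5 calls succeed, the next 2 are rejected.
for (int i = 1; i <= 7; i++)
{
    Console.WriteLine($"Request {i}: {(limiter.TryConsume() ? "allowed" : "rejected")}");
}

// After roughly 2 seconds about 2 tokens have been refilled, so this succeeds again.
await Task.Delay(TimeSpan.FromSeconds(2));
Console.WriteLine($"After waiting: {(limiter.TryConsume() ? "allowed" : "rejected")}");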

2. Sliding Window Algorithm

The Sliding Window algorithm counts requests over a window that moves continuously with the current time (in the example below, the last second). A request is allowed only while the number of requests recorded in that trailing window is below the limit; as older requests age out of the window, capacity becomes available again.

Here’s a Sliding Window rate limiter in .NET Core:

public class SlidingWindowRateLimiter
{
    private readonly int capacity;
    private readonly Queue<DateTime> window;   // timestamps of recent requests

    public SlidingWindowRateLimiter(int capacity)
    {
        this.capacity = capacity;
        this.window = new Queue<DateTime>();
    }

    public bool TryConsume()
    {
        lock (window)
        {
            CleanExpiredTokens();

            if (window.Count < capacity)
            {
                window.Enqueue(DateTime.UtcNow);
                return true;
            }
            return false;
        }
    }

    private void CleanExpiredTokens()
    {
        var now = DateTime.UtcNow;

        // Drop timestamps that have fallen out of the trailing one-second window.
        while (window.Count > 0 && (now - window.Peek()).TotalSeconds >= 1)
        {
            window.Dequeue();
        }
    }
}

Usage example:

var rateLimiter = new SlidingWindowRateLimiter(10);

if (rateLimiter.TryConsume())
{
    Console.WriteLine("Request processed successfully");
}
else
{
    Console.WriteLine("Rate limit exceeded");
}
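
In practice you usually track a limiter per client rather than one global instance. A minimal sketch, assuming a hypothetical client key such as an API key or IP address (the dictionary and key names are illustrative, not part of the limiter itself), keeps one SlidingWindowRateLimiter per client in a ConcurrentDictionary:

using System.Collections.Concurrent;

var limitersByClient = new ConcurrentDictionary<string, SlidingWindowRateLimiter>();

bool TryConsumeFor(string clientKey)
{
    // Each client key gets its own independent 10-requests-per-second budget.
    var limiter = limitersByClient.GetOrAdd(clientKey, _ => new SlidingWindowRateLimiter(10));
    return limiter.TryConsume();
}

Console.WriteLine(TryConsumeFor("client-a"));   // client-a's own budget
Console.WriteLine(TryConsumeFor("client-b"));   // client-b is tracked separately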

3. Fixed Window Algorithm

The Fixed Window algorithm divides time into non-overlapping intervals, allowing a specific number of requests in each interval. Requests exceeding the limit within the current window are rejected, and the counter resets when a new window begins.

Here’s a Fixed Window rate limiter in .NET Core:

public class FixedWindowRateLimiter
{
    private readonly int capacity;
    private readonly TimeSpan windowSize;
    private DateTime windowStart;
    private int requestCount;
    private readonly object gate = new object();

    public FixedWindowRateLimiter(int capacity, TimeSpan windowSize)
    {
        this.capacity = capacity;
        this.windowSize = windowSize;
        this.windowStart = DateTime.UtcNow;
    }

    public bool TryConsume()
    {
        lock (gate)
        {
            var now = DateTime.UtcNow;

            // Start a fresh window (and reset the counter) once the current one has elapsed.
            if (now - windowStart >= windowSize)
            {
                windowStart = now;
                requestCount = 0;
            }

            if (requestCount < capacity)
            {
                requestCount++;
                return true;
            }
            return false;
        }
    }
}

Usage example:

var rateLimiter = new FixedWindowRateLimiter(15, TimeSpan.FromSeconds(1));

if (rateLimiter.TryConsume())
{
    Console.WriteLine("Request processed successfully");
}
else
{
    Console.WriteLine("Rate limit exceeded");
}
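
One trade-off to keep in mind: because the counter resets at a hard boundary, a client can spend its full quota at the very end of one window and again at the very start of the next, briefly exceeding the intended rate. The short sketch below (reusing the illustrative 15-requests-per-second limiter) simply shows the counter resetting once the window elapses:

var limiter = new FixedWindowRateLimiter(15, TimeSpan.FromSeconds(1));

// Use up the entire quota for the current window.
for (int i = 0; i < 15; i++)
{
    limiter.TryConsume();
}

Console.WriteLine(limiter.TryConsume());   // False: the current window is full

// Once the one-second window has elapsed, the counter resets.
await Task.Delay(TimeSpan.FromSeconds(1));
Console.WriteLine(limiter.TryConsume());   // True: a fresh window has started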

4. Concurrency Limiter

Concurrency limiting restricts how many operations may run at the same time, rather than how many may start per unit of time. It is especially useful when a downstream resource, such as a database or an external service, can only handle a limited number of simultaneous requests.

Here’s a simple Concurrency limiter in .NET Core:

public class ConcurrencyLimiter
{
    private readonly SemaphoreSlim semaphore;

    public ConcurrencyLimiter(int maxConcurrentRequests)
    {
        this.semaphore = new SemaphoreSlim(maxConcurrentRequests);
    }

    public async Task<bool> TryEnterAsync()
    {
        // TimeSpan.Zero means: do not wait for a slot, report failure immediately.
        return await semaphore.WaitAsync(TimeSpan.Zero);
    }

    public void Release()
    {
        semaphore.Release();
    }
}

Usage example:

var concurrencyLimiter = new ConcurrencyLimiter(5);

if (await concurrencyLimiter.TryEnterAsync())
{
    try
    {
        Console.WriteLine("Request processed successfully");
    }
    finally
    {
        concurrencyLimiter.Release();
    }
}
else
{
    Console.WriteLine("Concurrency limit exceeded");
}
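
In an ASP.NET Core application you would typically apply such a limiter in the request pipeline. The following is a minimal, hypothetical middleware sketch (the app variable, the wiring, and the 429 response are assumptions rather than part of the limiter itself):

var concurrencyLimiter = new ConcurrencyLimiter(5);

app.Use(async (context, next) =>
{
    if (await concurrencyLimiter.TryEnterAsync())
    {
        try
        {
            await next();
        }
        finally
        {
            // Always free the slot, even if the downstream handler throws.
            concurrencyLimiter.Release();
        }
    }
    else
    {
        context.Response.StatusCode = StatusCodes.Status429TooManyRequests;
        await context.Response.WriteAsync("Concurrency limit exceeded");
    }
});

Returning 429 (Too Many Requests) gives well-behaved clients a clear signal to back off and retry later.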

Conclusion

Effectively implementing rate limiting algorithms is essential for creating robust and scalable applications. By incorporating Token Bucket, Sliding Window, Fixed Window, and Concurrency limiters in your .NET Core applications, you can ensure fair resource usage, protect against abuse, and maintain system integrity. Choose the algorithm that aligns with your specific requirements and integrate it seamlessly to enhance the performance and reliability of your application.
