Implement Rate Limiting in REST API in .NET 8

Hi everyone! Today, we will look at the concept of rate limiting. To follow the REST standard guidelines for API development, check out this blog: REST API Standards for Effective API Development in .NET.

Introduction

Rate limiting is a crucial technique in web development to control the number of requests a client can make to an API within a specified time frame. This helps prevent abuse, ensure fair usage, and protect server resources. In this article, we'll explore how to implement rate limiting in a REST API using .NET 8 Web API.

Rate limiting helps manage traffic and ensures that no single client can overwhelm the server. It can be implemented in several ways, differing mainly in how requests are grouped (see the sketch after this list).

  • Per-IP Rate Limiting: Limits the number of requests from a single IP address.
  • Per-User Rate Limiting: Limits the number of requests per authenticated user.
  • Global Rate Limiting: Limits the total number of requests to the API, regardless of the client.
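
As a quick sketch of the difference, what really changes between these strategies is the key you group requests under. The RateLimitScope enum and RateLimitKeys helper below are purely illustrative and are not part of the middleware we build in this article:

using Microsoft.AspNetCore.Http;

namespace RateLimiting
{
    public enum RateLimitScope { PerIp, PerUser, Global }

    public static class RateLimitKeys
    {
        // The partition key decides which flavor of limiting you get: one bucket
        // per client IP, one per authenticated user, or a single shared bucket.
        public static string GetPartitionKey(HttpContext context, RateLimitScope scope) => scope switch
        {
            RateLimitScope.PerIp => context.Connection.RemoteIpAddress?.ToString() ?? "unknown",
            RateLimitScope.PerUser => context.User.Identity?.Name ?? "anonymous",
            _ => "global"
        };
    }
}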

Note: In this article, we will focus on the per-IP rate limiting implementation.

Setting up a .NET 8 Web API Project

First, create a new .NET 8 Web API project.
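
If you prefer the command line, a project like this can be created with dotnet new webapi -n RateLimiting (the project name is only an example, chosen to match the RateLimiting namespace used in the code below).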


Adding Rate Limiting Middleware

To implement rate limiting, we'll create a custom middleware. Middleware in ASP.NET Core allows us to inspect and modify requests and responses.

Creating the Middleware: Create a new class, RateLimitingMiddleware, in a file named RateLimitingMiddleware.cs.

using Microsoft.AspNetCore.Http;
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;
namespace RateLimiting
{
    public class RateLimitingMiddleware
    {
        private readonly RequestDelegate _next;
        private static readonly ConcurrentDictionary<string, RequestLog> _requestLogs = new();
        public RateLimitingMiddleware(RequestDelegate next)
        {
            _next = next;
        }
        public async Task InvokeAsync(HttpContext context)
        {
            // Identify the client by IP address; RemoteIpAddress can be null (e.g. in some test hosts).
            var ipAddress = context.Connection.RemoteIpAddress?.ToString() ?? "unknown";
            var now = DateTime.UtcNow;
            var requestLog = _requestLogs.GetOrAdd(ipAddress, _ => new RequestLog());
            bool limitExceeded;
            lock (requestLog)
            {
                // Record this request and drop timestamps older than the 60-second window.
                requestLog.Requests.Enqueue(now);
                while (requestLog.Requests.Count > 0 && (now - requestLog.Requests.Peek()).TotalSeconds > 60)
                {
                    requestLog.Requests.Dequeue();
                }
                limitExceeded = requestLog.Requests.Count > 10;
            }
            if (limitExceeded)
            {
                // Reject the request; the response is written outside the lock so it can be awaited.
                context.Response.StatusCode = StatusCodes.Status429TooManyRequests;
                await context.Response.WriteAsync("Rate limit exceeded");
                return;
            }
            await _next(context);
        }
    }
    public class RequestLog
    {
        public Queue<DateTime> Requests { get; } = new();
    }

}

In this middleware, we keep track of each IP address’s request history using a queue. We enforce a limit of 10 requests per minute per IP address.
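
For example, suppose a client sends ten requests between 12:00:05 and 12:00:40 and then an eleventh at 12:00:45. All eleven timestamps still fall inside the last 60 seconds, so the queue holds more than 10 entries and the eleventh request is rejected with a 429. Once the oldest timestamps age past the 60-second window, they are dequeued and the client can make requests again.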

Registering the Middleware: In the Program.cs file, add the middleware to the pipeline.

using RateLimiting;
var builder = WebApplication.CreateBuilder(args);

// Add services to the container.
builder.Services.AddControllers();
// Learn more about configuring Swagger/OpenAPI at https://aka.ms/aspnetcore/swashbuckle
builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen();
var app = builder.Build();
// Configure the HTTP request pipeline.
if (app.Environment.IsDevelopment())
{
    app.UseSwagger();
    app.UseSwaggerUI();
}
app.UseMiddleware<RateLimitingMiddleware>();
app.UseHttpsRedirection();
app.UseAuthorization();
app.MapControllers();
app.Run();
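
Note that UseMiddleware<RateLimitingMiddleware>() is registered near the top of the pipeline, so requests are counted and, when the limit is exceeded, rejected before they ever reach authorization or the controllers.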

Testing the Rate Limiting

To test the rate-limiting functionality, you can use tools like Postman or Curl to make requests to your API. Make more than 10 requests within a minute from the same IP address to see if you receive a 429 Too Many Requests response.
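
If you would rather script the check, a small console program along these lines will do it. This is only a sketch: the port and the /weatherforecast endpoint come from the default Web API template and will differ if you changed them.

using System;
using System.Net.Http;
using System.Threading.Tasks;

// Fire 12 requests in a row; with the 10-per-minute limit above, the first 10
// should return 200 (assuming no other recent requests from your IP) and the
// remaining 2 should return 429.
using var client = new HttpClient { BaseAddress = new Uri("https://localhost:7001") };
for (var i = 1; i <= 12; i++)
{
    var response = await client.GetAsync("/weatherforecast");
    Console.WriteLine($"Request {i}: {(int)response.StatusCode} {response.StatusCode}");
}

You should see 200 OK for the first ten calls and 429 Too Many Requests from the eleventh onward.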


Additional Rate Limiting Configurations

For a more robust rate-limiting solution, consider the following enhancements.

  • Distributed Caching: Use distributed caches like Redis to store rate-limit data, especially if you have multiple servers.
  • Rate Limit Headers: Include headers in responses to inform clients about the rate limits and their usage (see the sketch after this list).
  • Dynamic Limits: Allow configuration of rate limits through app settings or a database.
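
As an example of the rate limit headers idea, a small helper along these lines could be called from the middleware. This is a sketch: the X-RateLimit-* names follow a widespread convention rather than a formal standard, the AddRateLimitHeaders method is something we are defining here (not a built-in API), and the values assume the same 10-requests-per-60-seconds window used above.

using System;
using Microsoft.AspNetCore.Http;

namespace RateLimiting
{
    public static class RateLimitHeaderExtensions
    {
        // Adds informational headers so clients can see the limit, how much of it
        // is left, and how long to wait once they have been throttled.
        public static void AddRateLimitHeaders(this HttpResponse response, int limit, int used, int windowSeconds)
        {
            var remaining = Math.Max(0, limit - used);
            response.Headers["X-RateLimit-Limit"] = limit.ToString();
            response.Headers["X-RateLimit-Remaining"] = remaining.ToString();
            if (remaining == 0)
            {
                // Suggest how long the client should wait before retrying.
                response.Headers["Retry-After"] = windowSeconds.ToString();
            }
        }
    }
}

In InvokeAsync, you could then call context.Response.AddRateLimitHeaders(10, requestLog.Requests.Count, 60) after trimming the queue, capturing the count inside the lock and setting the headers outside it, just as the 429 response is written outside the lock.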

Advantages and Disadvantages of Rate Limiting

Advantages

  • Prevents Abuse: Guards against misuse and attacks.
  • Protects Resources: Reduces risk of server resource exhaustion.
  • Enhances Security: Mitigates denial-of-service and brute-force attacks.
  • Improves Performance: Helps maintain server performance by controlling request load.
  • Ensures Fairness: Ensures equal access to resources among users.
  • Provides Better User Experience: Helps avoid service degradation and downtime.

Disadvantages

  • Complex Implementation: Adding rate limiting can be complex to implement and configure.
  • Possible Overhead: It may introduce some overhead in terms of processing and storage.
  • Can Impact Legitimate Users: It might restrict legitimate high-frequency use cases, affecting user experience.
  • Requires Maintenance: Needs regular updates and monitoring to adjust limits and rules.
  • Potential for Misconfiguration: Misconfigured limits can either be too restrictive or too lenient.
  • Might Require Scaling: High-traffic environments might need more sophisticated rate-limiting strategies.


Conclusion

Rate limiting is essential for maintaining the performance and reliability of your APIs. You can easily implement rate limiting using custom middleware. This approach allows you to manage traffic and protect your API from abuse, ensuring a better experience for all users.

Feel free to extend and customize the rate-limiting logic based on your specific needs and use cases. Happy coding!

