How to Implement API Rate Limiting in a NodeJS Express Application

What is Rate Limiting?

Rate limiting is a "protection shield" that sits between incoming requests and your controller logic. It helps prevent abuse, brute-force attacks, and traffic spikes by controlling how frequently clients can hit your API.

Rate limiting is a technique used to control the number of requests a client can make to a server within a specific time frame. It prevents:

  • DDoS attacks

  • Brute-force login attempts

  • API overuse by a single user

  • Unintentional client bugs flooding your server

In short, it's like a bouncer for your server.

fig1. Rate limiting

🤔 Why Use Rate Limiting?

Here’s why it matters:

  • 🔒 Security: Blocks malicious actors trying to overload your endpoints.

  • ⚖️ Fair Usage: Ensures no single user hogs the API.

  • 💸 Cost Control: Reduces unnecessary server usage (especially important with cloud billing).

  • 🧘 Stability: Maintains smooth performance under high traffic.


⚙️ Rate Limiting Algorithms

There are two widely used algorithms behind most API rate limiters:

  • Token Bucket Algorithm

  • Leaky Bucket Algorithm

1. Token Bucket Algorithm

The Token Bucket algorithm is a simple and effective method used in networking for traffic shaping and rate limiting.
fig2. Token Bucket Algorithm


How it works:
  • A bucket holds tokens.

  • Each incoming request uses one token.

  • The bucket refills over time (e.g., 1 token per minute).

  • If the bucket is empty → no more requests are allowed until it refills.

 Advantages:

  • Good for burst traffic (e.g., 5 quick requests then pause)

  • Flexible – not strictly one request per second
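The steps above translate almost directly into code. Here is a minimal, illustrative token bucket in plain JavaScript — the class and parameter names (`capacity`, `refillRate`) are my own for this sketch, not part of any library:

```javascript
// Minimal token-bucket sketch (illustrative only).
class TokenBucket {
  constructor(capacity, refillRate) {
    this.capacity = capacity;     // max tokens the bucket can hold
    this.refillRate = refillRate; // tokens added per second
    this.tokens = capacity;       // start full, allowing an initial burst
    this.lastRefill = Date.now();
  }

  // Add tokens for the time elapsed since the last refill, capped at capacity.
  refill() {
    const now = Date.now();
    const elapsedSec = (now - this.lastRefill) / 1000;
    this.tokens = Math.min(this.capacity, this.tokens + elapsedSec * this.refillRate);
    this.lastRefill = now;
  }

  // Consume one token if available; returns whether the request is allowed.
  tryRemoveToken() {
    this.refill();
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}

// A bucket of 5 tokens permits a burst of 5 quick requests, then rejects
// further requests until tokens refill (here: 1 token per second).
const tokenBucket = new TokenBucket(5, 1);
const results = [];
for (let i = 0; i < 6; i++) results.push(tokenBucket.tryRemoveToken());
console.log(results); // first 5 allowed, 6th rejected (assuming the loop runs well under a second)
```

Notice how the burst-friendliness comes for free: a full bucket lets several requests through at once, while the refill rate caps the long-term average.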


2. Leaky Bucket Algorithm

Imagine a bucket with a small hole leaking water at a constant rate.

fig3. Leaky Bucket Algorithm

How it works:

  • New requests are like water entering the bucket.

  • Water exits the bucket at a fixed rate.

  • If water (requests) comes too fast → bucket overflows, and extra requests are rejected.


 Advantages:

  • Good for consistent traffic flow

  • Ensures strict, smooth rate control
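The same idea can be sketched in a few lines of JavaScript. Here the "water level" is the number of queued requests, draining at a fixed rate — again, the names (`capacity`, `leakRate`) are illustrative, not from any library:

```javascript
// Minimal leaky-bucket sketch (illustrative only).
class LeakyBucket {
  constructor(capacity, leakRate) {
    this.capacity = capacity; // max queued requests before overflow
    this.leakRate = leakRate; // requests drained per second (the "hole")
    this.water = 0;           // current level
    this.lastLeak = Date.now();
  }

  // Drain water for the time elapsed since the last check.
  leak() {
    const now = Date.now();
    const elapsedSec = (now - this.lastLeak) / 1000;
    this.water = Math.max(0, this.water - elapsedSec * this.leakRate);
    this.lastLeak = now;
  }

  // Accept a request if there is room; otherwise the bucket "overflows".
  tryAdd() {
    this.leak();
    if (this.water + 1 <= this.capacity) {
      this.water += 1;
      return true;
    }
    return false;
  }
}

// With capacity 3, a rapid burst fills the bucket and the 4th request overflows.
const leakyBucket = new LeakyBucket(3, 1);
console.log([leakyBucket.tryAdd(), leakyBucket.tryAdd(), leakyBucket.tryAdd(), leakyBucket.tryAdd()]);
```

Unlike the token bucket, the leaky bucket does not reward idle periods with a burst allowance — output drains at a constant rate, which is what gives the smooth, strict flow.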



🚀 Implementing Rate Limiting in Express.js

🔧 Step 1: Install the Package

    npm install express-rate-limit

🔧 Step 2: Set Up in Your App

    const express = require('express');
    const rateLimit = require('express-rate-limit');

    const app = express();

    // Limit: 100 requests per 15 minutes per IP
    const limiter = rateLimit({
      windowMs: 15 * 60 * 1000, // 15 minutes
      max: 100, // max requests per IP per window
      message: 'Too many requests from this IP, please try again later.',
    });

    // Apply to all routes
    app.use(limiter);

    app.get('/', (req, res) => {
      res.send('Welcome to the rate-limited API!');
    });

    app.listen(3000, () => {
      console.log('Server running on http://localhost:3000');
    });

📌 Apply to Specific Routes

    // Apply the limiter only to the login route
    app.use('/login', limiter);


📊 Real-World Use Cases

    Scenario          Suggested Limit
    Public APIs       60 requests/min
    Login endpoint    5 requests/min
    Admin routes      20 requests/min
    Internal APIs     No limit or custom


🏁 Conclusion

Rate limiting is a simple yet powerful way to protect your APIs and servers. Whether you're building a small app or managing a production-scale API, implementing rate limits should be one of the first things you do to ensure fairness, security, and stability.





