Created by Stan on 10-04-2023
Rate limiting is a technique used to limit the number of requests made to a server by a particular client. It is a commonly used method to protect servers from excessive traffic and ensure that they continue to function properly. In this article, we will explore how to implement rate limiting in a Ruby application using Redis.
We will be using the following gems:
- Sinatra
- Redis
- Rack
- RSpec
```ruby
# Gemfile
source 'https://rubygems.org'

gem 'sinatra'
gem 'rack', '~> 2.2', '>= 2.2.4'
gem 'redis'

group :development, :test do
  gem 'rspec'
end
```
We can then run `bundle install` to install the required gems.
To implement rate limiting in our Ruby application, we will define a new middleware class that limits the rate at which clients can send requests. We will use Redis to store the request counts for each client.
Here is the implementation for the `RateLimiter` middleware:
```ruby
# rate_limiter.rb
require 'redis'

class RateLimiter
  def initialize(app, options = {})
    @app = app
    @limit = options[:limit] || 100
    @period = options[:period] || 60
    @redis = Redis.new
  end

  def call(env)
    # One counter per client IP per time window
    key = "#{env['REMOTE_ADDR']}:#{Time.now.to_i / @period}"
    count = @redis.incr(key)
    @redis.expire(key, @period) if count == 1

    if count > @limit
      [429, { 'Content-Type' => 'text/plain' }, ['Rate limit exceeded']]
    else
      @app.call(env)
    end
  end
end
```
In this code, we define a new `RateLimiter` class that takes an application object and an options hash as arguments. The `@limit` and `@period` instance variables specify the maximum number of requests and the time period (in seconds), respectively. We use the `redis` gem to create a new Redis client and increment the request count for each client.
The `call` method is where the rate limiting logic is implemented. We first generate a unique key for each client based on their IP address and the current time period. We then increment the request count for that key and, if this is the first request in the window, set the key to expire after the time period has elapsed. Finally, we check whether the request count exceeds the maximum limit and return a `429 Too Many Requests` response if it does.
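To make the fixed-window keying concrete, here is a small standalone sketch (no Redis required; the IP address and timestamp are arbitrary example values) showing that requests in the same period map to the same key, while the next period yields a fresh one:

```ruby
# Illustration of the fixed-window key scheme used in RateLimiter#call.
period = 60
ip = '203.0.113.7' # example client address

key_at = ->(time) { "#{ip}:#{time.to_i / period}" }

t0 = Time.at(1_700_000_040) # an arbitrary fixed instant
same_window = key_at.call(t0) == key_at.call(t0 + 10)
next_window = key_at.call(t0 + period)

puts same_window  # => true: requests 10s apart share a counter
puts next_window  # => "203.0.113.7:28333335": a new counter next window
```

Because the window boundary is fixed rather than sliding, a client can in principle burst up to twice the limit across a boundary; for this article's purposes the fixed window keeps the implementation simple.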
Next, we will create a simple Sinatra application that uses our `RateLimiter` middleware to limit the rate of requests:
```ruby
# app.rb
require 'sinatra/base'
require_relative 'rate_limiter'

class MyApp < Sinatra::Base
  use RateLimiter, limit: 100, period: 60

  get '/' do
    'Hello, world!'
  end
end
```
In this code, we define a new Sinatra application that uses our `RateLimiter` middleware by calling `use RateLimiter, limit: 100, period: 60`. We also define a simple route that returns a "Hello, world!" message.
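To actually boot the app you will also need a Rack entry point. A minimal `config.ru` sketch (the filename and contents follow Rack convention and are not part of the files above):

```ruby
# config.ru -- conventional Rack entry point (assumed, not shown above)
require_relative 'app'

run MyApp
```

With a local Redis server running, start the server with `rackup` and visit http://localhost:9292/.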
To test our rate limiter middleware, we will create a new file called `spec/app_spec.rb` and write some RSpec tests:
```ruby
# spec/app_spec.rb
require 'rack/test'
require_relative '../app'

RSpec.describe MyApp do
  include Rack::Test::Methods

  def app
    MyApp
  end

  context 'when the rate limit is not exceeded' do
    it 'returns a 200 OK status' do
      get '/'
      expect(last_response.status).to eq(200)
    end
  end

  context 'when the rate limit is exceeded' do
    it 'returns a 429 Too Many Requests error' do
      101.times { get '/' }
      expect(last_response.status).to eq(429)
    end
  end
end
```
In this code, we use the `Rack::Test` library to simulate HTTP requests to our Sinatra application. We define a new RSpec example group that tests the behavior of our rate limiter middleware both when the rate limit is not exceeded and when it is exceeded.
Then run the specs:

```shell
bundle exec rspec
```

All our specs should pass.
In this article, we explored how to implement rate limiting in a Ruby application using Redis. We defined a new middleware class that limits the rate at which clients can send requests and uses Redis to store the request counts for each client. We then created a simple Sinatra application that uses our rate limiter middleware and wrote some RSpec tests to ensure that it works as expected.
Rate limiting is a powerful technique that can help protect your servers from excessive traffic and ensure that they continue to function properly. By using Redis to store request counts, we can easily implement rate limiting in our Ruby applications and ensure that our servers remain responsive and available to our users.