Rate Limiting
Protect your API with intelligent rate limiting using Unkey Ratelimit
Overview
Tusflow uses Unkey Ratelimit for distributed rate limiting. The rate limiting middleware protects your API from abuse by limiting request frequency based on IP address and HTTP method.
Different rate limits are applied for various operations (uploads, chunks, metadata) to optimize for real-world usage patterns.
Implementation
The rate limiting middleware uses the Unkey Ratelimit client with a sliding window algorithm:
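The original middleware source is not reproduced here. Below is a minimal sketch of the core check, with the limiter injected so the logic can run without network access; the `checkRateLimit` helper, its types, and the `method:ip` identifier format are illustrative assumptions, not Tusflow's exact code. In production the limiter would be backed by `@unkey/ratelimit`'s `Ratelimit` client.

```typescript
// Sketch of the rate-limit check, with the limiter injected for testability.
// The result shape mirrors what a limit() call typically returns.
type LimitResult = {
  success: boolean;  // false once the client has exhausted its tokens
  limit: number;     // total tokens allowed in the window
  remaining: number; // tokens left in the current window
  reset: number;     // epoch time (ms) when the window resets
};
type Limiter = (identifier: string) => Promise<LimitResult>;

// In production the limiter would wrap the Unkey client, e.g. (assumption):
//   import { Ratelimit } from "@unkey/ratelimit";
//   const unkey = new Ratelimit({ rootKey, namespace: "tusflow", limit: 100, duration: "60s" });
//   const limiter: Limiter = (id) => unkey.limit(id);

async function checkRateLimit(limiter: Limiter, ip: string, method: string) {
  // Keying on method + IP lets uploads (POST) and chunk writes (PATCH)
  // carry different budgets for the same client.
  const result = await limiter(`${method}:${ip}`);
  const headers = {
    "X-RateLimit-Limit": String(result.limit),
    "X-RateLimit-Remaining": String(result.remaining),
    "X-RateLimit-Reset": String(result.reset),
  };
  return { status: result.success ? 200 : 429, headers };
}
```

Injecting the limiter keeps the middleware logic unit-testable with an in-memory stub while the deployed version talks to Unkey's distributed service.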
Configuration
Rate limits are configured in `ratelimit-config.ts`:
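The configuration file itself is not shown here; a plausible shape is sketched below, with field names and numbers as assumptions rather than Tusflow's actual defaults.

```typescript
// Hypothetical ratelimit-config.ts; names and values are illustrative.
export const RATELIMIT_CONFIG = {
  ENABLE: true,
  LIMITS: {
    POST: { tokens: 10, interval: "60s" },    // upload creation is infrequent
    PATCH: { tokens: 100, interval: "60s" },  // chunk writes are frequent
    HEAD: { tokens: 60, interval: "60s" },    // offset checks during resume
    DELETE: { tokens: 10, interval: "60s" },
    DEFAULT: { tokens: 30, interval: "60s" }, // fallback for other methods
  },
  BLOCK_DURATION: 300, // seconds a client stays blocked after exceeding a limit
} as const;
```

The per-method split reflects real TUS traffic: a single upload issues one POST but many PATCH requests, so chunk writes need a much larger budget.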
Alternative: Using Upstash Ratelimit
Tusflow also supports Upstash Ratelimit as an alternative to Unkey Ratelimit. This option provides Redis-based distributed rate limiting that integrates well with Cloudflare Workers.
Install Dependencies
First, install the required packages:
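Assuming npm as the package manager (substitute pnpm, yarn, or bun as appropriate):

```shell
npm install @upstash/ratelimit @upstash/redis
```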
Implementation
Replace the Unkey Ratelimit implementation with the following code:
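A sketch of what the swapped-in middleware could look like in a Hono app on Cloudflare Workers. The identifier format, limits, and error body are assumptions carried over from the Unkey description above, not verified Tusflow code.

```typescript
import { Ratelimit } from "@upstash/ratelimit";
import { Redis } from "@upstash/redis/cloudflare";
import type { MiddlewareHandler } from "hono";

export const rateLimit: MiddlewareHandler = async (c, next) => {
  const ratelimit = new Ratelimit({
    // Reads UPSTASH_REDIS_REST_URL / UPSTASH_REDIS_REST_TOKEN from bindings
    redis: Redis.fromEnv(c.env),
    // Same sliding-window behavior as the Unkey setup
    limiter: Ratelimit.slidingWindow(100, "60 s"),
    prefix: "tusflow",
  });

  const ip = c.req.header("CF-Connecting-IP") ?? "anonymous";
  const { success, limit, remaining, reset } = await ratelimit.limit(
    `${c.req.method}:${ip}`
  );

  c.header("X-RateLimit-Limit", String(limit));
  c.header("X-RateLimit-Remaining", String(remaining));
  c.header("X-RateLimit-Reset", String(reset));

  if (!success) {
    return c.json({ error: "Rate limit exceeded" }, 429);
  }
  await next();
};
```

`Ratelimit.slidingWindow` is Upstash's built-in sliding window algorithm, so behavior stays close to the Unkey configuration it replaces.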
Configuration
Update your configuration to use the Upstash-specific settings:
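The Upstash-specific settings might be added to the config file along the following lines; the names and values are illustrative assumptions.

```typescript
// Hypothetical Upstash-specific settings; adjust to your traffic profile.
export const UPSTASH_RATELIMIT = {
  WINDOW: "60 s",    // sliding-window size (Upstash duration string)
  TOKENS: 100,       // requests allowed per window
  PREFIX: "tusflow", // Redis key prefix, avoids collisions with other apps
  ANALYTICS: true,   // record hits for the Upstash dashboard
} as const;
```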
Environment Variables
Add the following environment variables to your `.env` file or deployment configuration:
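The variable names below are the ones `Redis.fromEnv` expects; the values are placeholders to fill in from your Upstash console.

```shell
UPSTASH_REDIS_REST_URL=https://<your-instance>.upstash.io
UPSTASH_REDIS_REST_TOKEN=<your-rest-token>
```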
Update Bindings Type
Make sure to update your `honoTypes.ts` file to include the new environment variables:
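A sketch of the additions; merge these fields into your existing `Bindings` type rather than replacing it.

```typescript
// Hypothetical honoTypes.ts additions for the Upstash credentials.
export type Bindings = {
  UPSTASH_REDIS_REST_URL: string;
  UPSTASH_REDIS_REST_TOKEN: string;
};

// App instances are then typed as Hono<{ Bindings: Bindings }> (assumption),
// which makes c.env.UPSTASH_REDIS_REST_URL type-safe in middleware.
```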
Upstash provides a serverless Redis instance with a REST API, making it well-suited for edge environments like Cloudflare Workers.
Rate Limit Headers
The middleware sets standard rate limit headers:
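For example, a response might carry headers like the following (values illustrative):

```
X-RateLimit-Limit: 100
X-RateLimit-Remaining: 42
X-RateLimit-Reset: 1717430400000
```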
Integration
Rate limiting is part of the security middleware stack:
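The exact stack is not listed here; an illustrative ordering in Hono follows, where the middleware names besides Hono's built-ins are assumptions.

```typescript
import { Hono } from "hono";
import { cors } from "hono/cors";
import { secureHeaders } from "hono/secure-headers";
import type { MiddlewareHandler } from "hono";

// The rate limiting middleware described above (assumption).
declare const rateLimit: MiddlewareHandler;

const app = new Hono();
app.use("*", secureHeaders());  // security headers on every response
app.use("*", cors());           // CORS before the upload routes
app.use("/files/*", rateLimit); // reject abusive clients cheaply, early
```

Placing rate limiting before the upload handlers means blocked requests never touch storage.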
Features
- Method-Based Limits
  - Different limits for uploads vs chunks
  - Configurable per HTTP method
  - Default fallback limits
  - Flexible token allocation
- Distributed Rate Limiting
  - Sliding window algorithm
  - Global rate limit state
  - High availability
- Smart Rate Limiting
  - IP-based identification
  - Cloudflare integration
  - Automatic header management
  - Graceful error handling
Error Handling
When the rate limit is exceeded:
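A typical 429 response might look like the following; the exact body and header set are assumptions, not Tusflow's verbatim output.

```
HTTP/1.1 429 Too Many Requests
X-RateLimit-Limit: 100
X-RateLimit-Remaining: 0
X-RateLimit-Reset: 1717430400000
Retry-After: 60

{ "error": "Rate limit exceeded" }
```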
Best Practices
Configure Limits
- Set appropriate token counts
- Adjust time windows
- Configure block duration
- Enable in production
Monitor Usage
- Track rate limit headers
- Monitor blocked requests
- Set up alerts
- Analyze patterns
Handle Failures
- Implement retry logic
- Respect retry-after
- Cache rate limit state
- Handle errors gracefully
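The retry guidance above can be sketched on the client side. `fetchWithRetry` is a hypothetical helper, not part of Tusflow; fetch and sleep are injected so the backoff logic is testable without a live server.

```typescript
// Hypothetical client helper that retries 429 responses, honoring Retry-After.
async function fetchWithRetry(
  doFetch: () => Promise<Response>,
  maxRetries = 3,
  sleep: (ms: number) => Promise<void> = (ms) => new Promise((r) => setTimeout(r, ms))
): Promise<Response> {
  for (let attempt = 0; ; attempt++) {
    const res = await doFetch();
    // Give up on non-429 responses, or once retries are exhausted.
    if (res.status !== 429 || attempt >= maxRetries) return res;
    // Wait as long as the server asked; default to 1 second if unspecified.
    const retryAfter = Number(res.headers.get("Retry-After") ?? "1");
    await sleep(retryAfter * 1000);
  }
}
```

Respecting `Retry-After` instead of retrying immediately keeps clients from amplifying the very load the limiter is trying to shed.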
Always test your rate limiting configuration in staging before deploying to production, to ensure it doesn't impact legitimate users.