Redis-backed deduplication middleware for Taskiq that prevents duplicate tasks from being queued or executed concurrently
Documentation: https://taskiq-deduplication.d3vyce.fr
Source Code: https://github.com/d3vyce/taskiq-deduplication
Installation #

```shell
uv add taskiq-deduplication
```

Quick Start #
```python
from taskiq_redis import ListQueueBroker

from taskiq_deduplication import DuplicateTaskError, RedisDeduplicationMiddleware

broker = ListQueueBroker("redis://localhost:6379").with_middlewares(
    RedisDeduplicationMiddleware(redis_url="redis://localhost:6379"),
)


@broker.task
async def send_report(user_id: int) -> None:
    ...


# First dispatch acquires the lock — succeeds.
await send_report.kiq(user_id=42)

# Second dispatch while the first is queued or running — raises.
try:
    await send_report.kiq(user_id=42)
except DuplicateTaskError:
    pass  # already queued or running
```

Features #
- Sender-side deduplication — rejects duplicate tasks at dispatch time via a Redis queue lock, before they reach the broker.
- Worker-side detection — logs concurrent duplicate executions without raising, keeping `SmartRetryMiddleware` safe from retry storms.
- Configurable TTL — set a global default or override per task with the `deduplication_ttl` label.
- Explicit lock key — pin any task to a fixed Redis key with `deduplication_key`, bypassing fingerprint computation entirely.
- Partial fingerprint — deduplicate on a subset of kwargs with `deduplication_key_fields`, ignoring irrelevant arguments.
- Per-task opt-out — disable deduplication for individual tasks with the `deduplication` label.
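To build intuition for the partial-fingerprint feature, here is a small standalone sketch of how a lock key could be derived from a subset of kwargs. The `fingerprint` helper is hypothetical and not the library's actual implementation; it only illustrates why two dispatches that differ in an ignored argument collide on the same key.

```python
import hashlib
import json


def fingerprint(task_name: str, kwargs: dict, key_fields: list[str]) -> str:
    """Hypothetical sketch: hash only the declared key fields."""
    # Kwargs outside key_fields are dropped before hashing, so they
    # cannot distinguish two dispatches.
    subset = {field: kwargs[field] for field in sorted(key_fields)}
    payload = json.dumps([task_name, subset], sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()


# Same user_id but a different `force` flag: both dispatches map to the
# same lock key, so the second would be treated as a duplicate.
a = fingerprint("send_report", {"user_id": 42, "force": True}, ["user_id"])
b = fingerprint("send_report", {"user_id": 42, "force": False}, ["user_id"])
print(a == b)  # True
```

With an explicit `deduplication_key`, this computation is skipped entirely and the fixed key is used as the Redis lock key instead.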
License #
MIT License - see LICENSE for details.
Contributing #
Contributions are welcome! Please feel free to submit issues and pull requests.