High-Performance Job Queue for Bun
No Redis. No infrastructure. Just Bun.
```ts
import { Queue, Worker } from 'bunqueue/client';

const queue = new Queue('emails', { embedded: true });

await queue.add('welcome', {
  to: 'user@example.com',
  subject: 'Welcome aboard!',
  template: 'onboarding',
});

const worker = new Worker('emails', async (job) => {
  await sendEmail(job.data);
  await job.updateProgress(100);
  return { sent: true };
}, { embedded: true, concurrency: 5 });

worker.on('completed', (job, result) => {
  console.log(`Done: ${job.id}`, result);
});

worker.on('failed', (job, error) => {
  console.error(`Failed: ${job.id}`, error.message);
});
```

Built different
Everything you need for production workloads. Nothing you don’t.
Zero External Dependencies
No Redis, no RabbitMQ, no Kafka. Just Bun’s native SQLite with WAL mode. Deploy anywhere in seconds — your entire queue infrastructure is a single file.
Blazing Fast
150K+ ops/sec with sub-millisecond latency. Sharded priority queues scale with your CPU cores.
Production Ready
Stall detection, dead letter queues, retries with backoff, rate limiting, S3 backups. All built-in.
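To make "retries with backoff" concrete, here is a small self-contained sketch of how exponential backoff spaces out retry attempts. The `delay * 2^(attempt - 1)` formula is the conventional scheme and is an assumption for illustration, not bunqueue's documented internals.

```ts
// Sketch of exponential retry backoff. The formula below is the
// conventional delay * 2^(attempt - 1) scheme, assumed here for
// illustration -- not taken from bunqueue's source.
function backoffDelay(attempt: number, baseDelayMs: number): number {
  return baseDelayMs * 2 ** (attempt - 1);
}

// Delays for three attempts with a 1s base: 1000, 2000, 4000 ms.
const delays = [1, 2, 3].map((attempt) => backoffDelay(attempt, 1000));
console.log(delays);
```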
Drop-in BullMQ Replacement
Familiar Queue and Worker API. Migrate from BullMQ in minutes, not days. Same patterns, same mental model, 32x better performance.
Full TypeScript
End-to-end type safety with comprehensive generics. Your IDE autocompletes everything.
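A minimal sketch of what end-to-end generics buy you. The `Job`, `Processor`, and `EmailJobData` types below are hand-rolled stand-ins, not bunqueue's actual type definitions.

```ts
// Illustrative only: hand-rolled generic types showing the kind of
// end-to-end inference a typed queue API provides. Not bunqueue's real types.
interface Job<T> {
  id: string;
  data: T;
}

interface EmailJobData {
  to: string;
  subject: string;
}

// A processor typed against the job's payload: job.data.to autocompletes,
// and a typo like job.data.too is a compile-time error.
type Processor<T, R> = (job: Job<T>) => Promise<R>;

const processEmail: Processor<EmailJobData, { sent: boolean }> = async (job) => {
  console.log(`sending to ${job.data.to}`);
  return { sent: true };
};
```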
Cron & Scheduling
Cron expressions, delayed jobs, repeatable tasks. Schedule anything with timezone support.
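Since bunqueue advertises a BullMQ-compatible API, scheduling options would plausibly look like the BullMQ-style shapes below. The exact option names (`repeat`, `pattern`, `tz`, `delay`) are an assumption, shown as plain objects for illustration.

```ts
// BullMQ-style job options, shown as plain objects. bunqueue advertises a
// BullMQ-compatible API, but these exact option names are an assumption.

// Repeat at 09:00 every weekday in a given timezone (cron expression).
const repeatOpts = {
  repeat: { pattern: '0 9 * * 1-5', tz: 'America/New_York' },
};

// Delay a one-off job by 60 seconds.
const delayOpts = { delay: 60_000 };

// Either would be passed as the options argument to queue.add, e.g.:
//   await queue.add('digest', payload, repeatOpts);
```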
Two modes, one API
Run embedded in your process or as a standalone TCP server. Same code, different scale.
In-process queue
Zero network overhead. Perfect for single-process apps, serverless, and prototyping.
- Zero config setup
- No network latency
- Microsecond operations
- Single process only
Standalone server
Multi-client architecture over TCP. Built for microservices and distributed workers.
- Multiple client connections
- HTTP + TCP APIs
- CLI management
- Horizontal scaling
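The two modes above differ only in construction options, sketched below. `embedded: true` and `concurrency` come from the example at the top of this page; the `host`/`port` options for addressing a standalone TCP server are hypothetical placeholders, not bunqueue's documented API.

```ts
// Same Worker code in both modes; only the options change.
// `embedded: true` and `concurrency` appear in the example above; the
// host/port options for the standalone server are hypothetical placeholders.

// In-process: the queue lives inside this process, no network hop.
const embeddedOpts = { embedded: true, concurrency: 5 };

// Standalone: connect over TCP to a shared bunqueue server (option names assumed).
const serverOpts = { host: '127.0.0.1', port: 6790, concurrency: 5 };

// e.g. new Worker('emails', handler, embeddedOpts)
//      new Worker('emails', handler, serverOpts)
```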