Async multi-transport logging — console, file, database, analytics — with a first-class NestJS module, request tracing, built-in search, and zero-dependency OpenTelemetry. Replace winston and pino in one install.
```bash
npm install logixia
```

```typescript
import { createLogger, ConsoleTransport, FileTransport } from 'logixia';

const logger = createLogger({
  level: 'debug',
  transports: [
    new ConsoleTransport({ colorize: true, pretty: true }),
    new FileTransport({ filename: 'logs/app.log', rotate: true }),
  ],
});

// Structured logging — context travels with every log
const reqLogger = logger.child({ requestId: 'req-abc', userId: 42 });
reqLogger.info('User login', { method: 'oauth', provider: 'google' });
reqLogger.warn('Rate limit approaching', { remaining: 5, limit: 100 });
reqLogger.error('Payment failed', { orderId: 'ord-123', code: 'DECLINED' });
```
winston demands a wall of configuration before it's useful. pino is fast but rigid. console.log doesn't scale past one process. Every team ends up building a custom wrapper anyway.
```typescript
// npm install winston winston-daily-rotate-file
// + configure, + wrapper, + NestJS adapter...
import * as winston from 'winston';

const logger = winston.createLogger({
  level: 'info',
  format: winston.format.combine(
    winston.format.timestamp(),
    winston.format.errors({ stack: true }),
    winston.format.splat(),
    winston.format.json()
  ),
  transports: [
    new winston.transports.Console({
      format: winston.format.combine(
        winston.format.colorize(),
        winston.format.simple()
      ),
    }),
    new (require('winston-daily-rotate-file'))({
      filename: 'logs/%DATE%.log',
      datePattern: 'YYYY-MM-DD',
      // ... 20 more options
    }),
  ],
}); // and you still need a NestJS adapter...
```
```typescript
// npm install logixia — that's it
import { createLogger, ConsoleTransport, FileTransport } from 'logixia';

const logger = createLogger({
  level: 'debug',
  transports: [
    new ConsoleTransport({ colorize: true }),
    new FileTransport({
      filename: 'logs/app.log',
      rotate: true,
      maxFiles: 7,
    }),
  ],
});

// NestJS? One import.
// import { LogixiaModule } from 'logixia/nest';
logger.info('Ready', { version: process.env.npm_package_version });
```
From local development to multi-service production, logixia grows with you without changing your API.
Log to console, rotating files, databases, analytics platforms, and custom webhooks — all asynchronously, in parallel, without blocking your event loop.
Drop-in LogixiaModule with async configuration, DI support, per-module log levels, and request context propagation via AsyncLocalStorage.
Query your stored logs with full-text search, level filters, time ranges, and field conditions — without spinning up Elasticsearch. Works on any database transport.
Automatic trace/span/parent correlation. Every log line carries traceId, spanId, and traceFlags — no extra packages needed.
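Under the hood this kind of correlation rides on Node's `AsyncLocalStorage`. A minimal sketch of the pattern — the `traceId`/`spanId` plumbing here is illustrative, not logixia's internal API:

```typescript
import { AsyncLocalStorage } from 'node:async_hooks';
import { randomUUID } from 'node:crypto';

// Holds the active trace context for the current async call chain.
const traceStore = new AsyncLocalStorage<{ traceId: string; spanId: string }>();

// Stamp a log entry with whatever trace context is active.
function logWithTrace(message: string): Record<string, string> {
  const ctx = traceStore.getStore();
  return { message, traceId: ctx?.traceId ?? 'none', spanId: ctx?.spanId ?? 'none' };
}

// Run a unit of work inside a fresh trace context; every log
// emitted inside fn picks up the same traceId automatically.
function traced<T>(fn: () => T): T {
  return traceStore.run({ traceId: randomUUID(), spanId: randomUUID().slice(0, 8) }, fn);
}
```

Because the store is scoped to the async call chain, no trace arguments need to be threaded through function signatures.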
logger.child() creates request-scoped loggers. Context fields (requestId, userId, tenantId) auto-attach to every log in that scope via AsyncLocalStorage.
JSON in production, human-readable in development. Automatic log level colorization, timestamps, diff highlighting. One flag switches the output format.
Send error logs to Slack, PagerDuty, or any webhook. Ship analytics events to Segment, Mixpanel, or PostHog. Batched delivery with retry and backpressure.
Six levels: trace, debug, info, warn, error, fatal. Runtime level changes without restart. Sampling support — log only X% of debug events in high-throughput paths.
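Level filtering plus sampling reduces to two cheap checks per log call. An illustrative sketch of that logic (not the library source — the injectable `rand` parameter is just for testability):

```typescript
const LEVELS = ['trace', 'debug', 'info', 'warn', 'error', 'fatal'] as const;
type Level = (typeof LEVELS)[number];

// A log passes when its level is at or above the active threshold
// and it survives the sampling coin flip.
function shouldLog(entry: Level, threshold: Level, sampleRate = 1, rand = Math.random): boolean {
  if (LEVELS.indexOf(entry) < LEVELS.indexOf(threshold)) return false;
  return rand() < sampleRate;
}
```

Runtime level changes then amount to swapping the `threshold` value held by the logger — no restart required.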
Extend with any transport by implementing the Transport interface. Two methods: write() and close(). Full TypeScript types.
From basic setup to production-grade configuration.
```typescript
import { createLogger, ConsoleTransport, FileTransport } from 'logixia';

const logger = createLogger({
  level: process.env.LOG_LEVEL ?? 'info',
  transports: [
    new ConsoleTransport({
      colorize: process.env.NODE_ENV !== 'production',
      pretty: process.env.NODE_ENV !== 'production',
    }),
    new FileTransport({
      filename: 'logs/app.log',
      rotate: true,
      maxSize: '20m',
      maxFiles: 14,
    }),
  ],
});

logger.info('Server started', { port: 3000, env: process.env.NODE_ENV });
logger.warn('Config missing', { key: 'REDIS_URL', fallback: 'localhost' });
// `err` here comes from a surrounding catch block
logger.error('Unhandled error', { stack: err.stack, context: 'app' });
```
```typescript
import { Injectable, Module } from '@nestjs/common';
import { LogixiaModule, LogixiaLogger } from 'logixia/nest';
import { ConsoleTransport, FileTransport } from 'logixia';
import { ConfigService } from '@nestjs/config';

@Module({
  imports: [
    LogixiaModule.forRootAsync({
      inject: [ConfigService],
      useFactory: (config: ConfigService) => ({
        level: config.get('LOG_LEVEL', 'info'),
        transports: [
          new ConsoleTransport({ colorize: config.get('NODE_ENV') !== 'production' }),
          new FileTransport({ filename: 'logs/nest.log', rotate: true }),
        ],
      }),
    }),
  ],
})
export class AppModule {}

// In your service:
@Injectable()
export class UserService {
  constructor(private readonly logger: LogixiaLogger) {}

  async findUser(id: string) {
    this.logger.debug('Finding user', { id });
    // logger automatically carries the request context
  }
}
```
```typescript
import {
  createLogger,
  ConsoleTransport,
  FileTransport,
  DatabaseTransport,
  WebhookTransport,
  AnalyticsTransport,
} from 'logixia';

const logger = createLogger({
  level: 'debug',
  transports: [
    // Dev console with colors
    new ConsoleTransport({ colorize: true, pretty: true }),
    // Rotating file logs
    new FileTransport({ filename: 'logs/app.log', rotate: true, maxFiles: 7 }),
    // Persist errors to DB for search
    new DatabaseTransport({
      adapter: 'postgres',
      connectionString: process.env.DATABASE_URL!,
      minLevel: 'warn',
      tableName: 'application_logs',
    }),
    // Slack alerts for errors
    new WebhookTransport({
      url: process.env.SLACK_WEBHOOK_URL!,
      minLevel: 'error',
      batch: { size: 5, intervalMs: 1000 },
    }),
    // Analytics events
    new AnalyticsTransport({
      provider: 'segment',
      writeKey: process.env.SEGMENT_WRITE_KEY!,
      minLevel: 'info',
    }),
  ],
});
```
```typescript
import { createLogger, createRequestContext } from 'logixia';
import { randomUUID } from 'node:crypto';
import type { Request, Response, NextFunction } from 'express';

const logger = createLogger({ level: 'info', transports: [/* ... */] });

// Express middleware — attaches context to every log in this request
export function requestLogger(req: Request, res: Response, next: NextFunction) {
  const start = Date.now();
  const ctx = createRequestContext({
    requestId: (req.headers['x-request-id'] as string) ?? randomUUID(),
    userId: req.user?.id, // requires your auth middleware's Request augmentation
    tenantId: req.headers['x-tenant-id'] as string,
    ip: req.ip,
    method: req.method,
    path: req.path,
  });

  // reqLogger carries all context fields automatically
  const reqLogger = logger.withContext(ctx);
  req.logger = reqLogger; // requires declaring `logger` on Express's Request type

  reqLogger.info('Request started');
  res.on('finish', () => {
    reqLogger.info('Request completed', {
      status: res.statusCode,
      duration: Date.now() - start,
    });
  });
  next();
}
```
```typescript
import { LogSearcher } from 'logixia';

const searcher = new LogSearcher({
  adapter: 'postgres',
  connectionString: process.env.DATABASE_URL!,
  tableName: 'application_logs',
});

// Full-text search with filters
const results = await searcher.search({
  query: 'payment failed',
  levels: ['error', 'fatal'],
  from: new Date('2025-01-01'),
  to: new Date('2025-01-31'),
  fields: { userId: '42', tenantId: 'acme' },
  limit: 50,
  orderBy: 'timestamp',
  order: 'desc',
});

console.log(results.total); // total matching logs
console.log(results.logs);  // LogEntry[] — fully typed
```
```typescript
import { createLogger, Transport, LogEntry } from 'logixia';

// Implement 2 methods. That's all.
class DatadogTransport implements Transport {
  private readonly apiKey: string;

  constructor({ apiKey }: { apiKey: string }) {
    this.apiKey = apiKey;
  }

  async write(entry: LogEntry): Promise<void> {
    await fetch('https://http-intake.logs.datadoghq.com/v1/input/' + this.apiKey, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        level: entry.level,
        message: entry.message,
        timestamp: entry.timestamp,
        ...entry.fields,
      }),
    });
  }

  async close(): Promise<void> {
    // flush pending batches
  }
}

// Use like any built-in transport
const logger = createLogger({
  transports: [new DatadogTransport({ apiKey: process.env.DD_API_KEY! })],
});
```
logixia routes each log to every transport in parallel. Failures in one transport never affect the others.
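This fan-out-with-isolation behavior maps naturally onto `Promise.allSettled`. A minimal sketch of the pattern, not logixia's internals:

```typescript
interface MiniTransport {
  write(entry: string): Promise<void>;
}

// Deliver one entry to every transport concurrently; a rejection in one
// transport never prevents the others from completing. Returns the
// number of transports that succeeded.
async function fanOut(transports: MiniTransport[], entry: string): Promise<number> {
  const results = await Promise.allSettled(transports.map((t) => t.write(entry)));
  return results.filter((r) => r.status === 'fulfilled').length;
}
```

Unlike `Promise.all`, `allSettled` never short-circuits, which is exactly the isolation guarantee described above.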
Colorized, pretty-printed output for local dev. JSON mode for production log aggregators (Datadog, Grafana Loki).
Rotating log files with size and date-based rotation, compression, and configurable retention. Buffered writes for performance.
PostgreSQL, MySQL, MongoDB, SQLite adapters. Enables built-in log search. Schema auto-migration included.
POST logs to Slack, PagerDuty, Discord, or any HTTP endpoint. Batched delivery with retry and per-level filtering.
Ship log events to Segment, Mixpanel, PostHog, or Amplitude. Maps log fields to analytics event properties automatically.
Implement the Transport interface. Two methods: write() and close(). Send logs anywhere.
Every missing feature in the table below represents something your team builds and maintains manually.
| Feature | logixia | winston | pino | bunyan |
|---|---|---|---|---|
| TypeScript-first | ✓ | ~ | ~ | ✗ |
| NestJS module | ✓ | ~ | ~ | ✗ |
| Async transports | ✓ | ~ | ✓ | ✗ |
| Database transport | ✓ | ~ | ✗ | ~ |
| Built-in log search | ✓ | ✗ | ✗ | ✗ |
| OpenTelemetry tracing | ✓ | ✗ | ~ | ✗ |
| Request context | ✓ | ✗ | ✓ | ✗ |
| Webhook / analytics transport | ✓ | ✗ | ✗ | ✗ |
| File rotation built-in | ✓ | ~ | ~ | ~ |
| Zero prod dependencies | ✓ | ✗ | ~ | ✗ |
~ partial/via plugin · ✓ built-in · ✗ unavailable
```bash
npm install logixia
pnpm add logixia
yarn add logixia
bun add logixia
```
```typescript
import { createLogger, ConsoleTransport } from 'logixia';

// 1. Create a logger — export and share it
const logger = createLogger({
  level: 'debug',
  transports: [new ConsoleTransport({ colorize: true })],
});

// 2. Log with structured context
logger.info('App started', { version: '1.0.0', port: 3000 });

// 3. Create request-scoped child loggers
const reqLogger = logger.child({ requestId: 'abc-123', userId: 42 });
reqLogger.debug('Processing request');
reqLogger.warn('Slow query detected', { duration: 1240, query: 'SELECT ...' });
// All logs from reqLogger automatically include requestId and userId
```
logixia gives you async multi-transport, NestJS integration, request tracing, and search out of the box. One package. Done.