# @saas-platform/logging

v1.0.0 • Published 5 months ago
Environment-aware logging library that automatically selects the appropriate transport based on the deployment environment.
## 🚀 Features
- **Environment Detection** - Auto-detects Lambda, Kubernetes, or Local
- **Multiple Transports** - SQS (Lambda), RabbitMQ (Kubernetes), Console (Local)
- **Structured Logging** - JSON format with metadata and trace correlation
- **Batching Support** - Efficient message batching for high-throughput scenarios
- **Error Resilience** - Graceful fallbacks when transports fail
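The auto-detection above can be sketched roughly as follows. This is an illustrative re-implementation of how environment detection might work, not the library's actual internals; `detectEnvironment` is a hypothetical name, though `AWS_LAMBDA_FUNCTION_NAME` and `KUBERNETES_SERVICE_HOST` are real variables set by the Lambda runtime and Kubernetes pods respectively:

```typescript
type Environment = 'lambda' | 'kubernetes' | 'local';

// Illustrative sketch: prefer an explicit DEPLOYMENT_ENV, then fall back
// to platform-specific environment variables.
function detectEnvironment(env: Record<string, string | undefined>): Environment {
  const explicit = env.DEPLOYMENT_ENV as Environment | undefined;
  if (explicit === 'lambda' || explicit === 'kubernetes' || explicit === 'local') {
    return explicit;
  }
  if (env.AWS_LAMBDA_FUNCTION_NAME) return 'lambda';    // set by the Lambda runtime
  if (env.KUBERNETES_SERVICE_HOST) return 'kubernetes'; // set inside Kubernetes pods
  return 'local';
}

// e.g. call detectEnvironment(process.env) once at startup
```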
## 📦 Installation

```bash
npm install @saas-platform/logging
```

## 🔧 Quick Start

```typescript
import { logger, LoggerFactory } from '@saas-platform/logging';

// Use the default logger (auto-detects the environment)
await logger.info('Application started');
await logger.error('Something went wrong', { userId: '123' });

// Create service-specific loggers
const dbLogger = LoggerFactory.database('user-service');
await dbLogger.info('Database connected');

// Create component-specific loggers
const authLogger = LoggerFactory.createForComponent('api', 'authentication');
await authLogger.warn('Invalid login attempt', { email: 'user@example.com' });
```

## 🌍 Environment Configuration
### Lambda (AWS SQS)

```bash
DEPLOYMENT_ENV=lambda
LOG_QUEUE_URL=https://sqs.us-east-1.amazonaws.com/123456789/app-logs
AWS_REGION=us-east-1
```

### Kubernetes (RabbitMQ)

```bash
DEPLOYMENT_ENV=kubernetes
RABBITMQ_URL=amqp://user:password@rabbitmq:5672
LOG_EXCHANGE=logs
LOG_ROUTING_KEY=application
```

### Local Development (Console)

```bash
DEPLOYMENT_ENV=local
NODE_ENV=development
ENABLE_PRETTY_LOGS=true
```

## 🔍 Trace Management
```typescript
import { TraceManager } from '@saas-platform/logging';

// Set trace context (usually from request headers)
TraceManager.setTraceId('trace-abc-123');
TraceManager.setCorrelationId('correlation-def-456');

// All subsequent logs include trace information
await logger.info('Processing request'); // Includes traceId and correlationId
```

## 👶 Child Loggers
```typescript
const baseLogger = LoggerFactory.create({ service: 'order-service' });

// Create a child logger with additional context
const orderLogger = baseLogger.child({
  orderId: 'order-123',
  customerId: 'customer-456'
});

await orderLogger.info('Order processing started');
// Every log from orderLogger includes orderId and customerId
```

## 🛠️ Utility Methods
```typescript
// Database operations
await logger.logDatabaseOperation('SELECT', 'users', 150, { rowCount: 10 });

// Cache operations
await logger.logCacheOperation('GET', 'user:123', true, { ttl: 300 });

// API requests
await logger.logAPIRequest('POST', '/api/users', 201, 250, { userId: 'user-456' });
```

## 📊 Log Format
All logs follow a structured JSON format:
```json
{
  "level": "info",
  "message": "User authentication successful",
  "timestamp": "2024-01-15T10:30:00.000Z",
  "service": "auth-service",
  "component": "authentication",
  "traceId": "trace-abc-123",
  "correlationId": "correlation-def-456",
  "data": {
    "userId": "user-123",
    "tenantId": "tenant-456"
  },
  "metadata": {
    "environment": "kubernetes",
    "hostname": "auth-pod-abc123",
    "pid": 1,
    "version": "1.0.0"
  }
}
```

## 🏭 Custom Transport Configuration
```typescript
import { LoggerFactory, SQSTransport } from '@saas-platform/logging';

// Custom SQS configuration
const customLogger = LoggerFactory.create({
  service: 'custom-service',
  transport: {
    environment: 'lambda',
    queueUrl: 'https://sqs.us-west-2.amazonaws.com/123456789/custom-logs',
    enableBatching: true,
    batchSize: 25,
    flushInterval: 3000
  }
});
```

## 🚨 Error Handling
```typescript
try {
  // Some operation
} catch (error) {
  // Error objects are automatically formatted
  await logger.error('Operation failed', error);

  // Or with additional context
  await logger.error('Database save failed', {
    error,
    userId: 'user-123',
    operation: 'create'
  });
}
```

## 🔄 Graceful Shutdown
```typescript
// Flush any pending logs before shutdown
await logger.flush();

// Close transport connections
await logger.close();
```

## 📈 Production Best Practices
- Set appropriate log levels based on environment
- Use child loggers for request-scoped context
- Always flush logs before Lambda/container shutdown
- Monitor transport health and have fallback strategies
- Use structured data instead of string interpolation
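The flush-before-shutdown advice is usually wired into a process signal handler. Below is a minimal, self-contained sketch: `FlushableLogger` and `shutdownLogging` are illustrative stand-ins rather than library exports, shaped to match the `flush()`/`close()` methods shown above, with a timeout so a hung transport cannot block container shutdown indefinitely:

```typescript
interface FlushableLogger {
  flush(): Promise<void>;
  close(): Promise<void>;
}

// Flush pending logs and close transports, bounded by a timeout.
async function shutdownLogging(logger: FlushableLogger, timeoutMs = 5000): Promise<void> {
  const work = logger.flush().then(() => logger.close());
  const timeout = new Promise<void>((resolve) => {
    const t = setTimeout(resolve, timeoutMs);
    // Don't let the timer keep the event loop alive (Node-specific, no-op elsewhere)
    (t as { unref?: () => void }).unref?.();
  });
  await Promise.race([work, timeout]);
}

// Typical wiring in a Kubernetes pod or Lambda extension:
// process.once('SIGTERM', async () => {
//   await shutdownLogging(logger);
//   process.exit(0);
// });
```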
## 🏗️ Integration with Monitoring
The structured JSON format integrates seamlessly with:
- AWS CloudWatch (from SQS messages)
- ELK Stack (Elasticsearch, Logstash, Kibana)
- Grafana + Loki (log aggregation)
- Datadog, New Relic (APM tools)
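These pipelines generally ingest one JSON object per line (newline-delimited JSON) and index entries by field. The sketch below shows that shape using field names from the log format above; `toLogLine` is an illustrative helper, not part of the library:

```typescript
interface LogEntry {
  level: string;
  message: string;
  timestamp: string;
  service: string;
  data?: Record<string, unknown>;
}

// One entry per line: aggregation tools split on newlines, then index fields.
function toLogLine(entry: LogEntry): string {
  return JSON.stringify(entry);
}

const line = toLogLine({
  level: 'info',
  message: 'User authentication successful',
  timestamp: new Date().toISOString(),
  service: 'auth-service',
  data: { userId: 'user-123' },
});
```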
## 📄 License
MIT License