@asaidimu/indexed v3.0.1
A simple and efficient TypeScript library providing a document-oriented database interface for IndexedDB, complete with robust schema management, powerful querying, flexible pagination, an event-driven architecture, and built-in telemetry.
🔗 Quick Links
- Overview & Features
- Installation & Setup
- Usage Documentation
- Project Architecture
- Development & Contributing
- Additional Information
🚀 Overview & Features
@asaidimu/indexed is a modern TypeScript library designed to simplify interactions with the browser's native IndexedDB. It abstracts away the complexities of low-level IndexedDB APIs, offering a high-level, document-oriented interface that feels similar to popular NoSQL databases. This library empowers developers to manage structured data in the browser with ease, providing robust schema enforcement, flexible querying capabilities, and advanced features like migrations and performance monitoring.
It's ideal for single-page applications, progressive web apps, or any client-side project requiring persistent data storage that goes beyond simple key-value pairs, ensuring data integrity and a streamlined development experience. By providing a familiar API pattern, @asaidimu/indexed significantly reduces the learning curve and boilerplate typically associated with IndexedDB, allowing developers to focus on application logic rather than database mechanics.
✨ Key Features
- Document-Oriented Interface: Interact with your data using familiar document patterns (create, read, update, delete) and an API that mirrors common NoSQL paradigms.
- Comprehensive Schema Management: Define detailed data schemas using @asaidimu/anansi, including field types, validation constraints (e.g., maxLength, min, unique), and indexes for optimized queries. Supports complex nested schemas.
- Flexible Data Access: Retrieve documents using find (single document), filter (array of matching documents), and list (paginated documents) methods.
- Advanced Querying: Leverage the powerful Query DSL (from @asaidimu/query) for expressive and efficient data retrieval, supporting various operators and logical combinations.
- Pagination Support: Seamlessly paginate through large datasets using both offset-based and cursor-based strategies, returning asynchronous iterators for efficient batch processing.
- Event-Driven Architecture: Subscribe to database, collection, and document-level events (document:create, document:write, document:update, document:delete, document:read, collection:create, collection:delete, collection:update, collection:read, migrate, telemetry) for real-time monitoring and reactive programming patterns.
- Built-in Telemetry: Gain insights into database operation performance, arguments, and outcomes with an optional, pluggable telemetry system, crucial for debugging and optimization.
- Schema Migrations: Define and apply schema changes over time using SchemaChange objects within SchemaDefinition, ensuring data compatibility and evolution across application versions.
- Automatic ID Generation: New documents automatically receive a unique $id (UUID v4) if not explicitly provided, along with $created, $updated, and $version metadata.
- TypeScript Support: Full type definitions ensure type safety and an excellent developer experience, with strong interfaces for all API components.
📦 Installation & Setup
Prerequisites
- Node.js: v18 or higher recommended.
- Package Manager: npm, yarn, or Bun.
- Environment: A browser environment supporting IndexedDB. For Node.js testing, a compatible shim such as fake-indexeddb together with jsdom is used internally.
Installation Steps
To add @asaidimu/indexed to your project, use your preferred package manager:
# Using npm
npm install @asaidimu/indexed
# Using yarn
yarn add @asaidimu/indexed
# Using Bun
bun add @asaidimu/indexed
Configuration
The library requires minimal configuration during DatabaseConnection initialization. You provide a database name, and can optionally enable telemetry.
import { DatabaseConnection } from '@asaidimu/indexed';
// Basic initialization: Connects to or creates 'myCoolAppDB'
const db = await DatabaseConnection({
name: 'myCoolAppDB'
});
// Initialization with telemetry enabled: Useful for performance monitoring
const dbWithTelemetry = await DatabaseConnection({
name: 'myCoolAppDB',
enableTelemetry: true
});
The database connection is internally cached, so subsequent calls to DatabaseConnection with the same name will return the existing instance.
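For example, the following sketch illustrates the caching behaviour described above (the console output is what we'd expect, not a formal guarantee):
import { DatabaseConnection } from '@asaidimu/indexed';
const first = await DatabaseConnection({ name: 'myCoolAppDB' });
const second = await DatabaseConnection({ name: 'myCoolAppDB' });
// Both calls resolve to the same cached Database instance,
// so they share one underlying IndexedDB connection.
console.log(first === second); // expected: true
first.close(); // closes the shared connection for both references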
Verification
After installation, you can quickly verify the setup by importing and initializing the database in your environment:
import { DatabaseConnection } from '@asaidimu/indexed';
async function verifyInstallation() {
let db;
try {
db = await DatabaseConnection({ name: 'test_db_verification' });
console.log('IndexedDB Document Store initialized successfully!');
} catch (error) {
console.error('Failed to initialize IndexedDB Document Store:', error);
} finally {
if (db) {
db.close(); // Don't forget to close the connection to free resources
console.log('Database connection closed.');
}
}
}
verifyInstallation();
📖 Usage Documentation
Basic Usage
Let's start with a simple example: defining a schema, creating a collection, and performing basic CRUD operations.
import { DatabaseConnection } from '@asaidimu/indexed';
import type { SchemaDefinition } from '@asaidimu/anansi'; // Crucial for robust schema definition
// 1. Define your document interface
// The '$id' property and other metadata ($created, $updated, $version)
// will be automatically added by the library to your Document instances.
interface Product {
name: string;
price: number;
inStock: boolean;
category?: string;
tags?: string[];
}
// 2. Define your schema using SchemaDefinition from @asaidimu/anansi
// This schema will dictate the structure and validation rules for documents
// stored in the 'products' collection.
const productSchema: SchemaDefinition = {
name: 'products',
version: '1.0.0',
description: 'Schema for product documents',
fields: {
// '$id' is the internal key path for IndexedDB.
// It's implicitly handled by the library if not defined in your fields.
// If your data naturally has an 'id' property, it will coexist.
name: { type: 'string', required: true, constraints: [{ name: 'minLength', parameters: 3 }] },
price: { type: 'number', required: true, constraints: [{ name: 'min', parameters: 0 }] },
inStock: { type: 'boolean', required: true },
category: { type: 'string', required: false },
tags: { type: 'array', required: false, itemsType: 'string' }
},
indexes: [
{ fields: ['name'], type: 'normal' },
{ fields: ['category'], type: 'normal' },
{ fields: ['price'], type: 'btree' },
{ fields: ['name', 'category'], type: 'composite', unique: true }
],
constraints: [
// Example: A schema-level constraint (requires '@asaidimu/anansi' to support this)
// {
// name: 'uniqueProductNameInCategory',
// fields: ['name', 'category'],
// rules: { operator: 'and', rules: [] }, // This would be more complex
// errorMessage: 'Product name must be unique within its category.'
// }
],
// Migrations can be defined here for schema evolution
migrations: []
};
async function runExample() {
let db; // Declare db here to ensure it's accessible in finally block
try {
// 3. Connect to the database
db = await DatabaseConnection({ name: 'myCommerceDB', enableTelemetry: true });
console.log('Database connected.');
// 4. Create or get your collection (equivalent to an IndexedDB Object Store)
let productsCollection;
try {
productsCollection = await db.collection<Product>('products');
console.log('Collection "products" already exists. Accessing it.');
} catch (e: any) {
// If the collection doesn't exist, create it using the defined schema.
if (e.type === 'SCHEMA_NOT_FOUND') {
productsCollection = await db.createCollection<Product>(productSchema);
console.log('Collection "products" created successfully!');
} else {
throw e; // Re-throw other unexpected errors
}
}
// 5. Create a new document in the collection
const newProduct = await productsCollection.create({
name: 'Laptop Pro X',
price: 1200.00,
inStock: true,
category: 'Electronics',
tags: ['tech', 'gadget']
});
console.log('Created Product:', newProduct);
// Documents have special meta-properties and methods.
// '$id' is automatically generated (UUID v4 in this version)
console.log('New Product ID:', newProduct.$id);
console.log('Product created at:', newProduct.$created);
console.log('Product version:', newProduct.$version);
// 6. Find a document by query
const foundProduct = await productsCollection.find({
field: '$id', // Querying by the internal '$id'
operator: 'eq',
value: newProduct.$id
});
if (foundProduct) {
console.log(`Found Product: ${foundProduct.name} (ID: ${foundProduct.$id})`);
console.log('Current price:', foundProduct.price);
// 7. Update a document
const updated = await foundProduct.update({ price: 1150.00, inStock: false });
if (updated) {
console.log('Product price updated to:', foundProduct.price);
// The in-memory document instance is updated immediately,
// but 'read()' ensures we have the absolute latest from the DB,
// useful if other parts of the app might have modified it.
await foundProduct.read();
console.log('Product in stock status after read:', foundProduct.inStock);
}
// 8. Filter documents based on criteria
await productsCollection.create({ name: 'Mechanical Keyboard', price: 75, inStock: true, category: 'Electronics' });
await productsCollection.create({ name: 'Ergonomic Mouse', price: 40, inStock: true, category: 'Accessories' });
await productsCollection.create({ name: 'Webcam 1080p', price: 60, inStock: false, category: 'Accessories' });
const electronics = await productsCollection.filter({
field: 'category',
operator: 'eq',
value: 'Electronics'
});
console.log('Electronics products:', electronics.map(p => p.name));
// 9. List documents with pagination (offset-based example)
console.log('Listing all products (offset pagination, 2 items per page):');
const productIterator = await productsCollection.list({ type: 'offset', offset: 0, limit: 2 });
let pageNum = 1;
// Use for await...of to iterate over the async iterator
for await (const batch of productIterator) {
if (batch.length === 0) {
console.log('No more data.');
break;
}
console.log(`--- Page ${pageNum++} ---`);
batch.forEach(p => console.log(`- ${p.name} ($${p.price})`));
}
// 10. Subscribe to document-level events (e.g., 'document:update' for 'foundProduct')
const unsubscribeProductUpdate = await foundProduct.subscribe('document:update', (event) => {
console.log(`[Document Event] Product updated: ID ${event.data?.$id}, new data: ${JSON.stringify(event.data)}`);
});
await foundProduct.update({ price: 1100.00 }); // This will trigger the event
unsubscribeProductUpdate(); // Clean up subscription
// 11. Delete a document
const deleted = await foundProduct.delete();
if (deleted) {
console.log(`Product "${foundProduct.name}" (ID: ${foundProduct.$id}) deleted successfully.`);
}
} else {
console.log('Product not found after creation, which is unexpected.');
}
// 12. Delete the entire collection
await db.deleteCollection('products');
console.log('Collection "products" deleted.');
} catch (error) {
console.error('An error occurred during the example run:', error);
} finally {
// 13. Ensure the database connection is closed
if (db) {
db.close();
console.log('Database connection closed.');
}
}
}
runExample();
Database API
The Database interface provides methods for managing collections (equivalent to IndexedDB object stores) and subscribing to global database events.
import { DatabaseConnection } from '@asaidimu/indexed';
import type { Database, Collection, DatabaseEvent, DatabaseEventType, TelemetryEvent } from '@asaidimu/indexed';
import type { SchemaDefinition } from '@asaidimu/anansi';
interface DatabaseConfig {
name: string; // The name of your IndexedDB database
indexSchema?: string; // Optional: name for the internal schema index store (default: "$schema")
keyPath?: string; // Optional: key path for the internal $schema store (default: "$id")
enableTelemetry?: boolean; // Optional: enables performance telemetry (default: false)
}
/**
* Creates a new database connection or retrieves an existing one from an in-memory cache.
* This is the primary entry point for interacting with the IndexedDB.
* Subsequent calls with the same database name will return the cached instance.
*/
function DatabaseConnection(config: DatabaseConfig): Promise<Database>;
interface Database {
/**
* Accesses an existing collection (schema model) by name.
* @param schemaName - The name of the schema/collection to access.
* @returns A promise resolving to the schema's Collection instance.
* @throws DatabaseError if the schema does not exist.
*/
collection: <T>(schemaName: string) => Promise<Collection<T>>;
/**
* Creates a new collection (schema model) in the database.
* This operation increments the database version to allow for object store creation.
* @param schema - The schema definition for the new collection.
* @returns A promise resolving to the created schema's Collection instance.
* @throws DatabaseError if the schema already exists or is invalid.
*/
createCollection: <T>(schema: SchemaDefinition) => Promise<Collection<T>>;
/**
* Deletes an existing collection (schema model) by name from the database.
* This operation increments the database version to allow for object store deletion.
* @param schemaName - The name of the schema/collection to delete.
* @returns A promise resolving to `true` if successful.
* @throws DatabaseError if the schema is not found or an internal error occurs.
*/
deleteCollection: (schemaName: string) => Promise<boolean>;
/**
* Updates an existing collection's schema definition.
* This operation updates the stored metadata of the schema.
* For actual structural changes to the IndexedDB object store, schema migration
* logic within your application (potentially driven by `@asaidimu/anansi`'s
* migration definitions) should manage the database version increment and store modifications.
* @param schema - The updated schema definition.
* @returns A promise resolving to `true` if successful.
* @throws DatabaseError if the schema is not found or an internal error occurs.
*/
updateCollection: (schema: SchemaDefinition) => Promise<boolean>;
/**
* Subscribes to database-level events.
* @param event - The event type to subscribe to (e.g., "collection:create", "telemetry").
* @param callback - The function to call when the event occurs.
* @returns An unsubscribe function.
*/
subscribe: (
event: DatabaseEventType | "telemetry",
callback: (event: DatabaseEvent | TelemetryEvent) => void
) => () => void;
/**
* Closes the connection to the underlying IndexedDB database.
* It's good practice to close connections when no longer needed to free up resources.
*/
close: () => void;
}
Collection API
A Collection<T> provides methods for managing documents within a specific schema (object store). The generic type T represents the shape of your application data within this collection.
import type { Document, CollectionEvent, CollectionEventType, TelemetryEvent, TelemetryEventType } from '@asaidimu/indexed';
import type { PaginationOptions, QueryFilter } from '@asaidimu/query';
interface Collection<T> {
/**
* Finds a single document matching the specified query.
* @param query - The query filter to apply.
* @returns A promise resolving to the matching document (as a Document<T> instance) or `null` if not found.
*/
find: (query: QueryFilter<T>) => Promise<Document<T> | null>;
/**
* Lists documents based on the provided pagination options.
* Supports both offset-based and cursor-based pagination.
* @param query - The pagination options (e.g., limit, offset, cursor, direction).
* @returns A promise resolving to an AsyncIterator, which yields arrays of Document<T>.
*/
list: (query: PaginationOptions) => Promise<AsyncIterator<Document<T>[]>>;
/**
* Filters documents based on the provided query and returns all matching documents.
* @param query - The query filter to apply.
* @returns A promise resolving to an array of matching Document<T> instances.
*/
filter: (query: QueryFilter<T>) => Promise<Document<T>[]>;
/**
* Creates a new document in this collection.
* The document is automatically assigned internal metadata like `$id`, `$created`, and `$version`.
* @param initial - The initial data for the document.
* @returns A promise resolving to the newly created Document<T> instance.
*/
create: (initial: T) => Promise<Document<T>>;
/**
* Subscribes to collection-level events.
* @param event - The event type to subscribe to (e.g., "collection:read", "telemetry").
* @param callback - The function to call when the event occurs.
* @returns An unsubscribe function.
*/
subscribe: (
event: CollectionEventType | TelemetryEventType,
callback: (event: CollectionEvent<T> | TelemetryEvent) => void
) => () => void;
}
Document API
A Document<T> represents a single record in a collection and provides methods for interacting with that specific document. The generic type T represents your custom data shape, and the library automatically adds internal properties like $id, $created, $updated, and $version.
import type { DocumentEvent, DocumentEventType, TelemetryEvent, TelemetryEventType } from '@asaidimu/indexed';
type Document<T> =
{
readonly [K in keyof T]: T[K]; // Your defined document properties, made read-only
} &
{
/**
* A unique identifier for the document. Automatically generated as a UUID v4
* if not provided during creation. This is the IndexedDB key.
*/
$id?: string;
/**
* A timestamp indicating when the document was created (ISO 8601 format).
* Automatically set on creation.
*/
$created?: string | Date;
/**
* A timestamp indicating when the document was last updated (ISO 8601 format).
* Automatically updated on calls to `update()`.
*/
$updated?: string | Date;
/**
* A number representing how many times the document has changed.
* Incremented on calls to `update()`.
*/
$version?: number;
/**
* Fetches the latest data for this document from the database.
* Updates the in-memory document instance to reflect any changes.
* @returns A promise resolving to `true` if successful and found, or `false` if an error occurs or not found.
*/
read: () => Promise<boolean>;
/**
* Updates the document in the database with the provided partial properties.
* Also updates the in-memory document instance and increments `$version` and `$updated`.
* @param props - Partial object containing the fields to update.
* @returns A promise resolving to `true` if successful, or `false` if an error occurs.
*/
update: (props: Partial<T>) => Promise<boolean>;
/**
* Deletes the document from its collection in the database.
* @returns A promise resolving to `true` if successful, or `false` if an error occurs.
*/
delete: () => Promise<boolean>;
/**
* Subscribes to document-level events.
* @param event - The event type to subscribe to (e.g., "document:update", "document:delete", "telemetry").
* @param callback - The function to call when the event occurs.
* @returns A promise resolving to an unsubscribe function.
*/
subscribe: (
event: DocumentEventType | TelemetryEventType,
callback: (event: DocumentEvent<T> | TelemetryEvent) => void
) => Promise<() => void>;
}
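Because $version increments on every update() and read() refreshes the in-memory instance, the two can be combined into a lightweight change detector. The helper below is a sketch; the function name is ours, not part of the library:
import type { Document } from '@asaidimu/indexed';
// Re-reads a document and reports whether it changed since we last saw it.
async function hasChangedSince<T>(doc: Document<T>, knownVersion: number): Promise<boolean> {
  const stillExists = await doc.read(); // refresh the in-memory instance from the database
  if (!stillExists) {
    return true; // the document was deleted (or could not be read)
  }
  return (doc.$version ?? 0) !== knownVersion;
}
// Usage, assuming 'product' is a Document obtained from a collection:
// const seenVersion = product.$version ?? 0;
// ... later, possibly after other code has called product.update() ...
// if (await hasChangedSince(product, seenVersion)) { /* refresh UI, re-validate, etc. */ }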
Schema Definition
@asaidimu/indexed utilizes the SchemaDefinition from the external library @asaidimu/anansi to enforce data integrity and structure. This allows for rich schema definitions, including explicit field types, built-in and custom constraints, indexes for optimized queries, and a mechanism for defining migration plans to evolve your data over time.
For a detailed understanding of SchemaDefinition and its capabilities, please refer to the documentation for @asaidimu/anansi.
Key aspects of SchemaDefinition as used by @asaidimu/indexed include the following (a compact sketch follows the list):
- name: Unique identifier for the collection/schema (corresponds to an IndexedDB object store name).
- version: Version string for the schema, important for tracking changes.
- fields: A record defining each field's type (string, number, boolean, array, object, dynamic), required status, constraints, default values, and more.
- indexes: Definitions for IndexedDB indexes, used for optimized queries on specific fields.
- constraints: Schema-wide validation rules applied when documents are created or updated.
- migrations: An array of Migration objects, each detailing atomic SchemaChange operations (e.g., addField, removeField, modifyField, addIndex, removeIndex, addConstraint). While the library handles IndexedDB object store creation/deletion based on schema name, the detailed SchemaChange and Migration objects are primarily used by @asaidimu/anansi for complex schema evolution and data transformation.
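The compact sketch below touches each of these aspects for a hypothetical 'users' collection. The field and index shapes mirror the Basic Usage example above; the empty constraints and migrations arrays are placeholders, since their exact entry shapes are defined by @asaidimu/anansi:
import type { SchemaDefinition } from '@asaidimu/anansi';
const userSchema: SchemaDefinition = {
  name: 'users',                  // IndexedDB object store name
  version: '1.0.0',               // schema version, used to track changes
  description: 'Schema for user documents',
  fields: {
    email: { type: 'string', required: true, constraints: [{ name: 'minLength', parameters: 5 }] },
    age: { type: 'number', required: false, constraints: [{ name: 'min', parameters: 0 }] },
    active: { type: 'boolean', required: true }
  },
  indexes: [
    { fields: ['email'], type: 'normal' },
    { fields: ['age'], type: 'btree' },
    { fields: ['email', 'active'], type: 'composite', unique: true }
  ],
  constraints: [],                // schema-wide validation rules
  migrations: []                  // Migration/SchemaChange entries, interpreted by @asaidimu/anansi
};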
Querying
The find and filter methods of a Collection utilize the QueryFilter DSL from @asaidimu/query for expressive and flexible data retrieval. This powerful query language allows you to specify conditions, apply logical operators, and target specific fields.
import type { QueryFilter } from '@asaidimu/query';
// QueryFilter structure:
type QueryFilter<T> = {
field: keyof T | string; // The field to query on (can be '$id', '$created', etc.)
operator: "eq" | "ne" | "gt" | "gte" | "lt" | "lte" | "in" | "nin" | "contains" | "startsWith" | "endsWith" | "exists" | "notExists";
value?: any; // The value to compare against
} | {
operator: "and" | "or" | "not" | "nor" | "xor"; // Logical operators for combining conditions
conditions: QueryFilter<T>[]; // Array of nested query filters
};
// Example usage:
// Find a user by email address
const userByEmail = await usersCollection.find({
field: 'email',
operator: 'eq',
value: 'john@example.com'
});
// Filter products that are in stock AND cost less than 100
const affordableInStock = await productsCollection.filter({
operator: 'and',
conditions: [
{ field: 'inStock', operator: 'eq', value: true },
{ field: 'price', operator: 'lt', value: 100 }
]
});
// Find products with 'laptop' in their name OR are in the 'Electronics' category
const relevantProducts = await productsCollection.filter({
operator: 'or',
conditions: [
{ field: 'name', operator: 'contains', value: 'laptop' },
{ field: 'category', operator: 'eq', value: 'Electronics' }
]
});
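Conditions can also be nested. The sketch below (reusing the productsCollection from the earlier examples) expresses "(Electronics or Accessories) and price below 100":
const affordableRelevant = await productsCollection.filter({
  operator: 'and',
  conditions: [
    {
      operator: 'or',
      conditions: [
        { field: 'category', operator: 'eq', value: 'Electronics' },
        { field: 'category', operator: 'eq', value: 'Accessories' }
      ]
    },
    { field: 'price', operator: 'lt', value: 100 }
  ]
});
console.log('Affordable relevant products:', affordableRelevant.map(p => p.name));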
Pagination
The list method of a Collection provides robust pagination capabilities, allowing you to efficiently retrieve documents in batches. It supports both traditional offset-based pagination and more efficient cursor-based pagination, returning an AsyncIterator for seamless integration into for await...of loops.
// PaginationOptions (from '@asaidimu/query') is a union of the following shapes:
import type { Collection, Document } from '@asaidimu/indexed';
// 'Product' in the examples below refers to the interface defined in the Basic Usage section.
interface OffsetPaginationOptions {
type: "offset"; // Specifies offset-based pagination
offset: number; // The number of documents to skip from the beginning
limit: number; // The maximum number of documents to return in a batch
}
interface CursorPaginationOptions {
type: "cursor"; // Specifies cursor-based pagination
cursor?: string; // Optional: The $id of the document to start (or continue) from
direction: "forward" | "backward"; // The direction of iteration from the cursor
limit: number; // The maximum number of documents to return in a batch
}
type PaginationOptions = OffsetPaginationOptions | CursorPaginationOptions;
// Example: Offset-based pagination
async function fetchProductsOffset(productsCollection: Collection<Product>) {
console.log('\n--- Fetching Products (Offset Pagination) ---');
let currentPage = 0;
const pageSize = 2; // Number of items per page
while (true) {
const iterator = await productsCollection.list({
type: "offset",
offset: currentPage * pageSize,
limit: pageSize
});
// The iterator yields a single batch per call to .next()
const { value: batch, done } = await iterator.next();
if (batch && batch.length > 0) {
console.log(`Page ${currentPage + 1}:`);
batch.forEach(product => console.log(`- ${product.name} (ID: ${product.$id})`));
currentPage++;
}
// If the batch is empty or we've reached the end, stop.
if (done || !batch || batch.length < pageSize) {
console.log('--- End of Offset Pagination ---');
break;
}
}
}
// Example: Cursor-based pagination (simple forward iteration)
async function fetchProductsCursor(productsCollection: Collection<Product>) {
console.log('\n--- Fetching Products (Cursor Pagination) ---');
let lastProductId: string | undefined = undefined; // Used as the cursor for the next batch
const pageSize = 2;
while (true) {
const iterator = await productsCollection.list({
type: "cursor",
cursor: lastProductId,
direction: "forward", // "next" or "prev" for IDBCursorDirection are internally used
limit: pageSize
});
const { value: batch, done } = await iterator.next();
if (batch && batch.length > 0) {
console.log('Next Batch:');
batch.forEach(product => console.log(`- ${product.name} (ID: ${product.$id})`));
// Update the cursor to the ID of the last document fetched
lastProductId = batch[batch.length - 1].$id;
}
if (done || !batch || batch.length < pageSize) { // If done or the last batch is smaller than the limit
console.log('--- End of Cursor Pagination ---');
break;
}
}
}
// To run these examples, ensure you have documents in your 'products' collection.
// e.g., await productsCollection.create({ name: 'Product A', price: 10, inStock: true });
// ... and so on for several products.
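The batch loop above can also be wrapped in a small reusable helper. This is a sketch built only on the documented list() API and Document shape; the helper name is ours:
import type { Collection, Document } from '@asaidimu/indexed';
// Walks an entire collection in fixed-size batches using cursor pagination
// and hands each batch to a callback.
async function forEachBatch<T>(
  collection: Collection<T>,
  batchSize: number,
  handle: (batch: Document<T>[]) => void | Promise<void>
): Promise<void> {
  let cursor: string | undefined = undefined;
  while (true) {
    const iterator = await collection.list({
      type: 'cursor',
      cursor,
      direction: 'forward',
      limit: batchSize
    });
    const { value: batch, done } = await iterator.next();
    if (done || !batch || batch.length === 0) break;
    await handle(batch);
    cursor = batch[batch.length - 1].$id; // continue after the last document we saw
    if (batch.length < batchSize) break;  // a short batch means we reached the end
  }
}
// Usage:
// await forEachBatch(productsCollection, 50, (batch) => {
//   batch.forEach(p => console.log(p.name));
// });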
Telemetry
@asaidimu/indexed includes a built-in telemetry system that can be enabled during database initialization. This feature provides detailed performance metrics and contextual information for database operations, proving highly useful for debugging, performance monitoring, and analytics.
To enable telemetry when connecting to your database:
import { DatabaseConnection } from '@asaidimu/indexed';
const db = await DatabaseConnection({
name: 'myAppDB',
enableTelemetry: true
});
Once enabled, you can subscribe to telemetry events at the Database, Collection, or Document level to capture granular insights:
import type { TelemetryEvent } from '@asaidimu/indexed';
// Subscribe to database-level telemetry: captures all operations at the DB level
const unsubscribeDbTelemetry = db.subscribe("telemetry", (event: TelemetryEvent) => {
console.log(`[DB Telemetry] Method: ${event.method}`);
console.log(`Duration: ${event.metadata.performance.durationMs}ms`);
if (event.metadata.error) {
console.error(`Error: ${event.metadata.error.message}, Stack: ${event.metadata.error.stack}`);
}
console.log('Arguments:', event.metadata.args);
console.log('Result:', event.metadata.result);
console.log('Context:', event.metadata.context);
console.log('---');
});
// Example usage to trigger DB telemetry
await db.createCollection({ name: 'users', version: '1.0.0', fields: { /* ... */ } });
unsubscribeDbTelemetry(); // Clean up
// Subscribe to collection-level telemetry: specific to operations on a collection
const productsCollection = await db.collection<Product>('products');
const unsubscribeCollectionTelemetry = productsCollection.subscribe("telemetry", (event: TelemetryEvent) => {
console.log(`[Collection Telemetry - Products] Method: ${event.method}`);
console.log(`Duration: ${event.metadata.performance.durationMs}ms`);
console.log('---');
});
// Example usage to trigger Collection telemetry
await productsCollection.create({ name: 'New Gadget', price: 99, inStock: true });
await productsCollection.find({ field: 'name', operator: 'eq', value: 'New Gadget' });
unsubscribeCollectionTelemetry(); // Clean up
// Subscribe to document-level telemetry: for operations on a specific document
const myProduct = await productsCollection.find({ field: 'name', operator: 'eq', value: 'Laptop Pro X' });
if (myProduct) {
const unsubscribeDocumentTelemetry = await myProduct.subscribe("telemetry", (event: TelemetryEvent) => {
console.log(`[Document Telemetry - ${myProduct.name}] Method: ${event.method}`);
console.log(`Duration: ${event.metadata.performance.durationMs}ms`);
console.log('---');
});
await myProduct.update({ price: 1099.99 }); // This will trigger the document-level telemetry event
unsubscribeDocumentTelemetry(); // Clean up
}
The TelemetryEvent structure provides comprehensive details about each captured operation:
type TelemetryEvent = {
type: "telemetry";
method: string; // The name of the method called (e.g., "create", "find", "updateCollection", "update")
timestamp: number; // Unix timestamp (milliseconds) when the operation completed
metadata: {
args: any[]; // Arguments passed to the method
performance: {
durationMs: number; // Execution duration of the operation in milliseconds
};
context: {
userAgent: string | undefined; // Browser user agent string (from globalThis.navigator?.userAgent)
};
result?: {
type: 'array' | string; // Type of the operation's result (e.g., 'array', 'object', 'number', 'boolean')
size?: number; // Size if the result is an array (e.g., for list/filter operations)
};
error: {
message: string;
name: string;
stack?: string;
} | null; // Error details (message, name, stack trace) if the operation failed, null otherwise
};
}
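A practical use of these events is to aggregate timings per method. The sketch below assumes only the TelemetryEvent shape shown above and the telemetry-enabled db from the earlier snippet; the aggregation itself is application code, not part of the library:
import type { TelemetryEvent } from '@asaidimu/indexed';
const timings = new Map<string, { calls: number; totalMs: number; errors: number }>();
const unsubscribeTimings = db.subscribe('telemetry', (event) => {
  const t = event as TelemetryEvent; // the "telemetry" subscription only receives telemetry events
  const entry = timings.get(t.method) ?? { calls: 0, totalMs: 0, errors: 0 };
  entry.calls += 1;
  entry.totalMs += t.metadata.performance.durationMs;
  if (t.metadata.error) entry.errors += 1;
  timings.set(t.method, entry);
});
// Later, e.g. on a debug screen:
for (const [method, entry] of timings) {
  console.log(`${method}: ${entry.calls} calls, avg ${(entry.totalMs / entry.calls).toFixed(2)}ms, ${entry.errors} errors`);
}
// Call unsubscribeTimings() when the metrics are no longer needed.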
Error Handling
The library provides specific error types to help you handle different failure scenarios gracefully. All custom errors extend from DatabaseError, making it easy to catch and distinguish them from generic JavaScript errors.
import { DatabaseError, DatabaseErrorType } from '@asaidimu/indexed';
import type { Database } from '@asaidimu/indexed';
import type { SchemaDefinition } from '@asaidimu/anansi'; // For SchemaDefinition type
enum DatabaseErrorType {
/** The schema (collection) does not exist when trying to access or modify it. */
SCHEMA_NOT_FOUND = "SCHEMA_NOT_FOUND",
/** The schema (collection) already exists when trying to create a new one with the same name. */
SCHEMA_ALREADY_EXISTS = "SCHEMA_ALREADY_EXISTS",
/** The provided schema name is invalid (e.g., empty or reserved). */
INVALID_SCHEMA_NAME = "INVALID_SCHEMA_NAME",
/** The schema definition itself is malformed or violates validation rules. */
INVALID_SCHEMA_DEFINITION = "INVALID_SCHEMA_DEFINITION",
/** An attempt to subscribe to a database event failed. */
SUBSCRIPTION_FAILED = "SUBSCRIPTION_FAILED",
/** A generic internal error occurred during a database operation. */
INTERNAL_ERROR = "INTERNAL_ERROR",
}
class DatabaseError extends Error {
public type: DatabaseErrorType; // The specific type of database error
public schema?: SchemaDefinition; // Associated schema if the error relates to a schema operation
/**
* Constructs a new DatabaseError instance.
* @param type - The specific DatabaseErrorType.
* @param message - A human-readable message describing the error.
* @param schema - Optional: The SchemaDefinition related to the error, if applicable.
*/
constructor(type: DatabaseErrorType, message: string, schema?: SchemaDefinition) {
super(message);
this.name = type; // Set the error name to the error type for easier identification
this.type = type;
this.schema = schema;
}
}
// Example usage of error handling:
async function safeCreateCollection(db: Database, schema: SchemaDefinition) {
try {
await db.createCollection(schema);
console.log(`Collection "${schema.name}" created successfully.`);
} catch (error) {
if (error instanceof DatabaseError) {
// Handle specific database errors
switch (error.type) {
case DatabaseErrorType.SCHEMA_ALREADY_EXISTS:
console.warn(`Collection "${schema.name}" already exists. Skipping creation.`);
break;
case DatabaseErrorType.INVALID_SCHEMA_DEFINITION:
console.error(`Invalid schema definition for "${schema.name}": ${error.message}`);
break;
case DatabaseErrorType.INTERNAL_ERROR:
console.error(`An internal database error occurred: ${error.message}`);
break;
case DatabaseErrorType.SCHEMA_NOT_FOUND:
console.error(`Schema "${schema.name}" not found: ${error.message}`);
break;
case DatabaseErrorType.SUBSCRIPTION_FAILED:
console.error(`Subscription failed: ${error.message}`);
break;
default:
console.error(`Unhandled Database Error (${error.type}): ${error.message}`);
}
} else {
// Handle unexpected non-DatabaseError errors
console.error('An unexpected error occurred:', error);
}
}
}
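A common pattern built on this error handling is a "get or create" helper that mirrors the SCHEMA_NOT_FOUND branch from the Basic Usage example (a sketch; the helper name is ours):
import { DatabaseError, DatabaseErrorType } from '@asaidimu/indexed';
import type { Collection, Database } from '@asaidimu/indexed';
import type { SchemaDefinition } from '@asaidimu/anansi';
// Returns the existing collection, creating it from the schema if it is missing.
async function getOrCreateCollection<T>(db: Database, schema: SchemaDefinition): Promise<Collection<T>> {
  try {
    return await db.collection<T>(schema.name);
  } catch (error) {
    if (error instanceof DatabaseError && error.type === DatabaseErrorType.SCHEMA_NOT_FOUND) {
      return await db.createCollection<T>(schema);
    }
    throw error; // anything else is unexpected and should propagate
  }
}
// Usage:
// const products = await getOrCreateCollection<Product>(db, productSchema);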
Event System
The library leverages a lightweight event-driven design, allowing you to subscribe to various lifecycle and data-related events across the database, collections, and individual documents. This facilitates reactive programming, real-time updates, and integration with other parts of your application.
Database Events
Emitted from the Database instance. These events provide insights into schema management and database-wide activities.
- collection:create: Triggered when a new collection has been successfully created.
- collection:update: Triggered when an existing collection's schema metadata has been modified (e.g., via updateCollection).
- collection:delete: Triggered when a collection has been successfully removed from the database.
- collection:read: Triggered when a collection has been accessed via db.collection().
- migrate: Reserved for future detailed migration lifecycle events.
- telemetry: (If enableTelemetry is true) Provides performance and context data for database-level operations.
import { DatabaseConnection } from '@asaidimu/indexed';
import type { DatabaseEvent, DatabaseEventType } from '@asaidimu/indexed';
const db = await DatabaseConnection({ name: 'myAppDB' });
// Example: Log when a new collection is created
db.subscribe("collection:create", (event: DatabaseEvent) => {
console.log(`[DB Event] New collection created: ${event.schema?.name} at ${new Date(event.timestamp).toLocaleString()}`);
});
// Example: Log when a collection is deleted
db.subscribe("collection:delete", (event: DatabaseEvent) => {
console.log(`[DB Event] Collection deleted: ${event.schema?.name} at ${new Date(event.timestamp).toLocaleString()}`);
});
// To trigger:
// await db.createCollection({ name: 'users', version: '1.0.0', fields: { /* ... */ } });
// await db.deleteCollection('users');
Collection Events
Emitted from a Collection instance. These events provide insights into document lifecycle actions within a specific collection.
- document:create: Triggered when a new document is successfully created in this collection (often from collection.create()).
- collection:read: Triggered when documents within this collection have been accessed via the find, list, or filter methods.
- telemetry: (If enableTelemetry is true) Provides performance and context data for collection-level operations (e.g., find, list, filter, create).
import type { Collection, CollectionEvent, CollectionEventType } from '@asaidimu/indexed';
import type { Product } from './your-types-file'; // Assuming Product is defined
const productsCollection: Collection<Product> = await db.collection<Product>('products');
// Example: Log when a document is created in the products collection
productsCollection.subscribe("document:create", (event: CollectionEvent<Product>) => {
console.log(`[Collection Event] Document created in '${event.model}' collection at ${new Date(event.timestamp).toLocaleString()}. Doc ID: ${event.document?.$id}`);
});
// Example: Log when documents are accessed (find, list, filter)
productsCollection.subscribe("collection:read", (event: CollectionEvent<Product>) => {
console.log(`[Collection Event] Documents accessed in '${event.model}' collection using method: '${event.method}' at ${new Date(event.timestamp).toLocaleString()}`);
});
// To trigger:
// await productsCollection.create({ name: 'Test Product', price: 10, inStock: true });
// await productsCollection.find({ field: 'name', operator: 'eq', value: 'Test Product' });
Document Events
Emitted from a Document instance. These events provide granular details about changes and access to a specific document.
- document:create: Triggered just after a new document instance is created and persisted, often from collection.create().
- document:write: Triggered after a document is initially written to the store (e.g., by collection.create()).
- document:update: Triggered when the document's properties have been successfully updated. The event payload includes the updated data.
- document:delete: Triggered when the document has been successfully deleted from the database.
- document:read: Triggered when the document's data has been read or accessed (e.g., via document.read() or during its initial retrieval/creation by a collection method).
- telemetry: (If enableTelemetry is true) Provides performance and context data for document-level operations (e.g., read, update, delete).
import type { Document, DocumentEvent, DocumentEventType } from '@asaidimu/indexed';
import type { Product } from './your-types-file'; // Assuming Product is defined
const myProduct: Document<Product> = await productsCollection.create({
name: 'Book', price: 25, inStock: true
});
// Example: Log when the specific document is updated
const unsubscribeUpdate = await myProduct.subscribe("document:update", (event: DocumentEvent<Product>) => {
console.log(`[Document Event] Product (ID: ${event.data?.$id}) updated at ${new Date(event.timestamp).toLocaleString()}. New data:`, event.data);
});
// Example: Log when the specific document is deleted
const unsubscribeDelete = await myProduct.subscribe("document:delete", (event: DocumentEvent<Product>) => {
console.log(`[Document Event] Product (ID: ${event.data?.$id}) deleted at ${new Date(event.timestamp).toLocaleString()}`);
});
// To trigger:
// await myProduct.update({ price: 30 }); // Triggers 'document:update'
// await myProduct.delete(); // Triggers 'document:delete'
// Remember to call unsubscribeUpdate() and unsubscribeDelete() when done.
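Taken together, these events support simple reactive patterns. The sketch below keeps an in-memory list of product names in sync with the collection; it assumes the Product type and productsCollection from the earlier examples, and relies only on the event fields shown above:
import type { CollectionEvent, Document } from '@asaidimu/indexed';
const productNames: string[] = [];
// Add names as documents are created in the collection.
const stopCreateSync = productsCollection.subscribe('document:create', (event: CollectionEvent<Product>) => {
  if (event.document?.name) productNames.push(event.document.name);
});
// For documents we already hold, remove their names when they are deleted.
async function trackDeletion(doc: Document<Product>): Promise<() => void> {
  return doc.subscribe('document:delete', () => {
    const index = productNames.indexOf(doc.name);
    if (index >= 0) productNames.splice(index, 1);
  });
}
// Call stopCreateSync() and each unsubscribe returned by trackDeletion() on teardown.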
🏗️ Project Architecture
@asaidimu/indexed is structured to provide a clear separation of concerns, from low-level IndexedDB interactions to high-level document management and event handling.
Core Components
- DatabaseConnection (src/database.ts):
  - The primary entry point for the library.
  - Manages the lifecycle of the IndexedDB connection, including opening and closing.
  - Orchestrates IndexedDB versioning when creating or deleting object stores (collections).
  - Maintains an internal $schema object store to persist schema definitions, enabling schema management.
  - Provides access to Collection instances.
- Collection<T> (src/document.ts via createDocumentCursor):
  - Represents an abstraction over an IndexedDB object store.
  - Provides high-level methods (create, find, filter, list) for interacting with documents within that store.
  - Integrates with @asaidimu/query for powerful filtering capabilities and src/paginate.ts for list operations.
  - Manages collection-level events.
- Document<T> (src/document.ts via createDocument):
  - Represents a single document (record) within a Collection.
  - Automatically injects internal metadata like $id (UUID v4), $created, $updated, and $version.
  - Exposes methods (read, update, delete) for manipulating the specific document.
  - Manages document-level events.
- Store (src/store.ts):
  - A low-level wrapper providing direct, simplified access to IndexedDB's IDBObjectStore operations.
  - Handles IndexedDB transactions (executeTransaction, executeDatabaseTransaction), requests, and cursor management.
  - Used internally by createDocument and createDocumentCursor to perform database operations.
- Event Bus (@asaidimu/events):
  - A lightweight, integrated event system used across Database, Collection, and Document instances.
  - Facilitates internal communication and enables external subscriptions for reactive programming and monitoring.
- Telemetry Proxy (src/utils.ts):
  - A Proxy-based decorator that wraps public API methods (on Database, Collection, and Document instances) if enableTelemetry is true.
  - Transparently captures method calls, execution time, arguments, results, and errors.
  - Emits structured telemetry events to the respective event buses for consumption.
Data Flow
- Connection Initialization: DatabaseConnection opens or re-uses an IndexedDB connection. This also ensures the internal $schema object store is created if it doesn't exist.
- Schema & Collection Management:
  - db.createCollection(schema): Triggers an IndexedDB version change by reopening the database with an incremented version, allowing a new object store to be created. The schema definition is then saved in the internal $schema store.
  - db.collection(name): Retrieves a Collection instance tied to an existing object store.
- Collection Operations:
  - Collection methods (find, list, filter, create) delegate to the low-level Store component.
  - For query operations (find, list, filter), the Store's cursor method iterates records, and @asaidimu/query's match function applies the filtering logic.
  - For create, initial data is passed to createDocument, which then uses Store.put to persist the new document.
  - All data retrieved via find, list, and filter is wrapped into Document instances, making them interactive.
- Document Operations:
  - Document methods (read, update, delete) directly call Store methods (e.g., getById, put, delete) using the document's internal $id as the key.
- Event Emission & Telemetry:
  - Throughout these operations, Database, Collection, and Document instances emit relevant lifecycle events (e.g., document:create, document:update, collection:read) via their internal event buses.
  - If enableTelemetry is active, the Telemetry Proxy intercepts public API calls, records performance metrics and context, and emits structured telemetry events before forwarding the original call.
⚙️ Development & Contributing
We welcome contributions! Please read through these guidelines to get started.
Development Setup
- Clone the repository:
  git clone https://github.com/asaidimu/indexed.git
  cd indexed
- Install dependencies:
  bun install # or npm install or yarn install
- Build the project:
  bun run build # Compiles TypeScript source to dist/ for CJS and ESM formats.
  The postbuild script also copies README.md, LICENSE.md, and dist.package.json into the dist/ folder, preparing the package for npm publication.
Scripts
The package.json defines several useful scripts for development, building, and testing:
- bun ci: Installs project dependencies.
- bun clean: Removes the dist/ directory, cleaning up build artifacts.
- bun prebuild: Executes bun clean and bun run .sync-package.ts (syncs version details from package.json to dist.package.json).
- bun build: Compiles TypeScript source files (index.ts) into dist/ for CommonJS (cjs) and ES Module (esm) formats, along with generating TypeScript declaration files (.d.ts).
- bun postbuild: Copies essential files (README.md, LICENSE.md, dist.package.json) into the dist/ directory, which are included in the published npm package.
- bun test: Runs unit and integration tests using Vitest in watch mode.
- bun test:run: Executes all tests once and exits. Suitable for CI/CD pipelines.
- bun test:debug: Runs tests in debug mode, useful for stepping through code.
- bun test:ci: An alias for bun test:run, designed for continuous integration environments.
Testing
Tests are written with Vitest and provide comprehensive coverage of the library's functionality. To run the tests:
bun test
This will start Vitest in watch mode, automatically re-running tests on file changes. To run tests once (e.g., for CI or a quick check):
bun test:run
The tests are executed in a Node.js environment, simulating a browser using fake-indexeddb and jsdom. This ensures consistent and fast test execution without requiring a real browser.
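If you want a similar setup for your own test suite, a minimal configuration along these lines usually suffices (a sketch; the file names are arbitrary, and fake-indexeddb's 'auto' entry point installs a global indexedDB shim):
// vitest.config.ts
import { defineConfig } from 'vitest/config';
export default defineConfig({
  test: {
    environment: 'jsdom',            // browser-like globals for the tests
    setupFiles: ['./test/setup.ts']  // runs before each test file
  }
});
// test/setup.ts
import 'fake-indexeddb/auto';        // registers global indexedDB and IDBKeyRange implementations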
Contributing Guidelines
Please review our CONTRIBUTING.md for detailed information on:
- Reporting bugs effectively.
- Suggesting and discussing new features.
- The process for making pull requests.
- Our coding standards and commit message conventions (which follow Conventional Commits).
Issue Reporting
If you encounter any bugs, have feature requests, or questions, please open an issue on our GitHub Issues page. Provide as much detail as possible to help us understand and address your concerns.
📚 Additional Information
Troubleshooting
- Database not opening/upgrading:
  - Ensure your browser supports IndexedDB.
  - When working directly with IndexedDB's indexedDB.open(), ensure you are providing a version number greater than the current version if you intend to create or modify object stores. @asaidimu/indexed handles this internally for createCollection and deleteCollection.
- "SCHEMA_NOT_FOUND" error:
  - Verify that the collection name you are trying to access with db.collection() or db.deleteCollection() was previously created using db.createCollection().
  - Check for typos in the collection name.
- Data not persisting or updating:
  - Remember that all database operations are asynchronous and return Promises. Always use await or .then() to ensure operations complete and their results are handled.
  - After modifying properties on a Document instance, you must call document.update(props) to persist those changes to the database. For new documents, create() automatically persists them.
- Asynchronous operations:
  - It's crucial to handle the asynchronous nature of all API calls. Incorrect handling (e.g., a missing await) can lead to unexpected behavior or errors.
- Closing connections:
  - While IndexedDB connections are generally managed by the browser, explicitly calling db.close() when your application no longer needs the database connection is good practice to free up resources and prevent leaks, especially in long-running applications or during testing (see the sketch below).
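For the last point, a small wrapper can guarantee the connection is closed even when an operation throws. This is a sketch; the helper name is ours, and because connections are cached you should only close once your application is done with that database:
import { DatabaseConnection } from '@asaidimu/indexed';
import type { Database } from '@asaidimu/indexed';
// Opens (or reuses) a connection, runs the work, and always closes the connection afterwards.
async function withDatabase<R>(name: string, work: (db: Database) => Promise<R>): Promise<R> {
  const db = await DatabaseConnection({ name });
  try {
    return await work(db);
  } finally {
    db.close();
  }
}
// Usage:
// const inStockCount = await withDatabase('myCoolAppDB', async (db) => {
//   const products = await db.collection<Product>('products');
//   const inStock = await products.filter({ field: 'inStock', operator: 'eq', value: true });
//   return inStock.length;
// });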
FAQ
Q: Is this library a full-fledged database replacement?
A: IndexedDB Document Store provides a robust client-side persistence layer for structured data, making it suitable for many web application needs (e.g., offline capabilities, caching, local data synchronization). It is not a replacement for server-side databases (like MongoDB, PostgreSQL) but aims to bring a similar document-oriented development experience to the browser's local storage.
Q: How does @asaidimu/indexed handle schema migrations?
A: The SchemaDefinition (from @asaidimu/anansi) includes a migrations array where you can define a series of SchemaChange objects. The library uses this information to manage schema evolution. While the core library handles IndexedDB object store creation/deletion on version changes, the detailed migration logic for data transformation during schema updates would typically be handled by @asaidimu/anansi's migration capabilities, which this library integrates with by updating the stored schema metadata.
Q: Can I use this in a Node.js environment?
A: IndexedDB is fundamentally a browser API. While this library is written in TypeScript and can be built for Node.js, using it directly in a Node.js server environment requires a polyfill like fake-indexeddb (which is used for testing) to simulate the browser's IndexedDB API. For server-side Node.js applications, a dedicated server-side database solution is generally more appropriate and performant.
Q: How do I handle large datasets with this library?
A: IndexedDB itself is designed for significant client-side data storage, capable of holding gigabytes of data. @asaidimu/indexed enhances this with efficient cursor-based iteration and pagination options (the list method), making it suitable for managing large datasets by processing them in manageable batches rather than loading everything into memory at once.
Q: How are $id values generated?
A: As of version 2.0.0, $id values for new documents are generated using UUID v4. This provides strong uniqueness guarantees without depending on content hashing, simplifying document creation.
Changelog / Roadmap
- For a detailed history of changes, features, and bug fixes, please refer to the CHANGELOG.md file.
- A formal roadmap is currently TBD, but common future considerations include more advanced query features, deeper integration with @asaidimu/anansi for complex schema validation and migrations, and potential performance optimizations through advanced IndexedDB features.
License
This project is licensed under the MIT License. See the LICENSE.md file for full details.
Acknowledgments
- Built on the power of IndexedDB API.
- Utilizes @asaidimu/anansi for robust schema definition and validation.
- Leverages @asaidimu/query for its powerful and declarative querying DSL.
- Employs @asaidimu/events for its internal event system.
- Tested thoroughly with Vitest.
- Built efficiently using tsup for bundling.