@d0paminedriven/fs v6.1.0
@d0paminedriven/fs
“Don’t sweat the small stuff; just write your damn files.”
Let's be real: you likely already have a project manager breathing down your neck for updates on the hour, JIRA ticket blues bringing you down, Teams/Slack notification pings plaguing your dreams, the "hey, got five minutes?" guy breaking your flow state on the daily, and n other pressing issues on your mind mid-development.
These situations are all annoying af, which is why this package seamlessly ensures that targeted output paths exist for you (if they don't exist yet, they will before your file is written). Whether you're using writeFileAsync, fetchRemoteWriteLocalLargeFiles, or withWs, this package has you covered no matter how deep your targeted output paths go. Unfortunately, this package can't help with that pesky PM or that flow-breaking co-worker (yet--PRs welcome).
Want to grab some remote files? Throw any file size its way — no problemo — just use the fetchRemoteWriteLocalLargeFiles method. (Proceed with caution in the >=gigabyte range. If your setup can handle it, then by all means.)
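If you're pulling a whole batch of remote assets rather than a single file, the same call loops cleanly. A minimal sketch (the URLs and output paths below are stand-ins for your own; the single-call version lives in the Quickstart right after this):
import { Fs } from "@d0paminedriven/fs";
const fs = new Fs(process.cwd());
// stand-in asset list -- swap in your own remote files
const assets = [
  ["https://cdn.example.com/textures/stone-albedo.png", "public/assets/images/stone-albedo.png"],
  ["https://cdn.example.com/models/chess-set.glb", "public/models/chess-set.glb"]
] as const;
for (const [url, target] of assets) {
  // nested output directories are created before the stream ever hits disk
  await fs.fetchRemoteWriteLocalLargeFiles(url, target);
}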
🚀 Quickstart
import { Fs } from "@d0paminedriven/fs";
// Set your cwd (project root recommended)
const fs = new Fs(process.cwd());
/** Write to a nested file path (no mkdirp needed) */
await fs.writeFileAsync("public/assets/images/foo/bar.jpg", myImageBuffer);
// No error, no “does this folder exist?”, no drama
/** Fetch and save a massive remote asset */
await fs.fetchRemoteWriteLocalLargeFiles(
"https://cdn.example.com/big-model.glb",
"public/models/my-big-model"
);
// Handles directories, streams the file to disk, and never blows up your RAM
/** Read every file in a directory */
const files = fs.readDir("public/assets/images", { recursive: true });
/** Unlink a file (with existence check) */
await fs.unlink("public/assets/old/unused.txt");
/** Need a wait utility for your script? */
await fs.wait(2000); // waits 2 seconds
/** Get the MIME type for any file by extension */
const mime = fs.getMimeTypeForPath("foo.png"); // "image/png"
🍳 Recipes / Real-World Scripts
Need to audit or analyze files, generate a TypeScript object from your assets, or automate asset metadata? Here’s a dead-simple pattern to iterate off of:
const files = fs.readDir("public/assets/images", { recursive: true });
/** Map those files to a TypeScript object with file names as keys and their corresponding file sizes (MB) as values */
const arr = Array.of<[string, number]>();
function generateTsObjOfFileSizes(withExtensions = true) {
try {
files.forEach(function (file) {
const size = fs.fileSizeMb(`public/assets/images/${file}`);
if (withExtensions === false) {
arr.push([file.split(/\./g)[0] ?? file, size]);
} else {
arr.push([file, size]);
}
});
} catch (err) {
if (err instanceof Error) throw new Error(err.message);
else console.error(`generateTsObjOfFileSizes`, err);
} finally {
const tupleArrToObj = Object.fromEntries(arr);
fs.withWs(
`src/utils/file-sizes.ts`,
`export const publicAssetsImagesFileSizesMb = ${JSON.stringify(tupleArrToObj, null, 2)} as const;`
);
}
}
generateTsObjOfFileSizes();
This outputs a TypeScript object you can use for dashboards, asset analytics, or CI/CD checks:
export const publicAssetsImagesFileSizesMb = {
"chess-atb.png": 1.1277456283569336,
"elegant-stone-tiles-albedo.png": 53.93040370941162,
"port-40.avif": 1.7613801956176758,
"saeukkang.usdz": 3.6647157669067383
} as const;
ⓘ Assets used were pooled from repos across GitHub via the fetchRemoteWriteLocalLargeFiles method.
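And since the output is just a typed module, you can lean on it anywhere plain TypeScript runs. A rough sketch of a CI-style size gate (the 5 MB budget and the relative import path are assumptions; adjust to your layout):
import { publicAssetsImagesFileSizesMb } from "./file-sizes";
// hypothetical per-image budget in MB -- tune to whatever your app tolerates
const BUDGET_MB = 5;
// collect every asset that blows past the budget
const oversized = Object.entries(publicAssetsImagesFileSizesMb).filter(
  ([, sizeMb]) => sizeMb > BUDGET_MB
);
if (oversized.length > 0) {
  console.error("assets over budget:", oversized);
  process.exit(1);
}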