microgradts v2.4.3 • Published 8 months ago
MicrogradTS
Video Demo
A tiny autograd engine based on Andrej Karpathy's micrograd in Python. It implements backpropagation over a dynamically built DAG that operates over scalar values.
Installation
npm install microgradts
yarn add microgradts
Example usage
Below is a slightly contrived example showing a number of the supported operations:
import { Value, add, mul, pow, div } from 'microgradts'
const a = new Value(-4.0)
const b = new Value(2.0)
const c = add(a, b)
const d = add(mul(a, b), pow(b, new Value(3))) // a * b + b**3
const e = new Value(3.0)
const f = div(d, e)
f.backward();
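For intuition, backward() topologically sorts the expression DAG and applies the chain rule from the output back to each input. Below is a minimal self-contained sketch of that idea (illustrative only, not the library's actual source; `Val` here is a hypothetical stand-in for the package's `Value`):

```typescript
// Minimal scalar autograd value, mirroring micrograd-style semantics.
class Val {
  grad = 0;
  private back: () => void = () => {};
  constructor(public data: number, private prev: Val[] = []) {}

  static add(a: Val, b: Val): Val {
    const out = new Val(a.data + b.data, [a, b]);
    out.back = () => { a.grad += out.grad; b.grad += out.grad; };
    return out;
  }

  static mul(a: Val, b: Val): Val {
    const out = new Val(a.data * b.data, [a, b]);
    out.back = () => { a.grad += b.data * out.grad; b.grad += a.data * out.grad; };
    return out;
  }

  backward(): void {
    // Topologically sort the DAG, then apply the chain rule output-to-inputs.
    const topo: Val[] = [];
    const seen = new Set<Val>();
    const build = (v: Val) => {
      if (seen.has(v)) return;
      seen.add(v);
      for (const p of v.prev) build(p);
      topo.push(v);
    };
    build(this);
    this.grad = 1;
    for (const v of topo.reverse()) v.back();
  }
}

const x = new Val(3);
const y = new Val(4);
const z = Val.add(Val.mul(x, y), x); // z = x*y + x
z.backward();
console.log(x.grad, y.grad); // dz/dx = y + 1 = 5, dz/dy = x = 3
```

Note that gradients accumulate with `+=`, so a value used in several places (like `x` above) collects a contribution from each use.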
And an example usage of the Neural Net API:
const n = new MLP(3, [4, 4, 1]);
const xs = [
  [2.0, 3.0, -1.0],
  [3.0, -1.0, 0.5],
  [0.5, 1.0, 1.0],
  [1.0, 1.0, -1.0],
].map((x) => toValues(x));
const ys = toValues([1.0, -1.0, -1.0, 1.0]);
for (let i = 0; i < 200; i++) {
  const ypred = xs.map((x) => n.run(x));
  const loss = getLoss(ys, ypred as Value[]);
  for (const p of n.parameters()) {
    p.grad = 0; // zero gradients before backprop
  }
  loss.backward();
  for (const p of n.parameters()) {
    p.data -= 0.01 * p.grad; // gradient descent step
  }
  console.log(i, loss.data);
}
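The zero-grad / backward / update sequence above is plain gradient descent: each step moves every parameter a small distance against its gradient. The update rule in isolation, on a toy quadratic loss (a self-contained sketch; the learning rate and target here are arbitrary illustration values):

```typescript
// Gradient descent on f(p) = (p - 3)^2; the gradient is 2 * (p - 3).
// Mirrors the update in the loop above: p.data -= lr * p.grad.
let p = 0;
const lr = 0.1;
for (let i = 0; i < 100; i++) {
  const grad = 2 * (p - 3);
  p -= lr * grad;
}
console.log(p.toFixed(4)); // converges to 3.0000
```

The same dynamic plays out in the MLP loop, just over many parameters at once, with backward() supplying each parameter's gradient.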
Todo
- Implement visualization with Graphviz
License
MIT