tinygrad: Something between PyTorch and karpathy/micrograd [1]

Despite its small size, tinygrad is a fully featured deep learning framework. Due to its extreme simplicity, it is the easiest framework to add new accelerators to, with support for both inference and training. If XLA is CISC, tinygrad is RISC.

tinygrad is packaged as part of the T2 System Development Environment.

URL: https://tinygrad.org/

Author: tinygrad
Maintainer: The T2 Project <t2 [at] t2-project [dot] org>

License: MIT
Status: Beta
Version: 0.11.0

Download: https://github.com/tinygrad/tinygrad/ tinygrad-0.11.0.tar.gz

T2 source: tinygrad.cache
T2 source: tinygrad.desc

Build time (on reference hardware): 2% (relative to binutils) [2]

Installed size (on reference hardware): 10.36 MB, 185 files

Dependencies (build time detected): bash coreutils cython diffutils gawk grep gzip openssl python sed setuptools tar

Installed files (on reference hardware): n.a.

1) This page was automatically generated from the T2 package source. Corrections, such as dead links, URL changes, or typos, need to be made directly to that source.

2) Compatible with Linux From Scratch's "Standard Build Unit" (SBU).