Phito-Deep is a deep learning framework built from scratch using only NumPy. It is under active development as part of my learning journey toward becoming a machine learning engineer; I'm using it to better understand the underlying algorithms that power modern deep learning frameworks and architectures.
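For a sense of what "from scratch with only NumPy" involves, here is an illustrative sketch of a dense (fully connected) layer with a forward and backward pass — the kind of building block such a framework has to implement. This is not phitodeep's actual code, just a minimal example; phitodeep's own layers, initializers, and optimizers live in the modules shown in the quickstart below.

```python
import numpy as np

# Illustrative only -- a minimal dense layer of the kind a from-scratch
# numpy framework must implement; NOT phitodeep's actual implementation.
class Dense:
    def __init__(self, n_in, n_out, seed=0):
        # Xavier/Glorot normal initialization: std = sqrt(2 / (n_in + n_out))
        rng = np.random.default_rng(seed)
        std = np.sqrt(2.0 / (n_in + n_out))
        self.W = rng.normal(0.0, std, size=(n_in, n_out))
        self.b = np.zeros(n_out)

    def forward(self, x):
        self.x = x  # cache the input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out, lr=0.05):
        # Gradients of the loss w.r.t. parameters and input
        grad_W = self.x.T @ grad_out
        grad_b = grad_out.sum(axis=0)
        grad_x = grad_out @ self.W.T
        # Plain SGD update (an optimizer like Adam would go here)
        self.W -= lr * grad_W
        self.b -= lr * grad_b
        return grad_x

layer = Dense(784, 128)
out = layer.forward(np.zeros((64, 784)))
print(out.shape)  # (64, 128)
```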
## Installation

```bash
pip install phitodeep
```

## Usage

MNIST quickstart:
```python
import numpy as np
from datasets import load_dataset

from phitodeep.model import SequentialBuilder
from phitodeep.loss import CategoricalCrossEntropy
from phitodeep.optimization.optimizers import Adam
from phitodeep.optimization.initialization import Xavier, InitType

# Load MNIST from the Hugging Face Hub
train_dataset = load_dataset("ylecun/mnist", split="train")
test_dataset = load_dataset("ylecun/mnist", split="test")

X_train = train_dataset["image"]
y_train = train_dataset["label"]
X_test = test_dataset["image"]
y_test = test_dataset["label"]

# Convert to float32 arrays and scale pixel values to [0, 1]
X_train = np.array(X_train).astype(np.float32) / 255.0
y_train = np.array(y_train)
X_test = np.array(X_test).astype(np.float32) / 255.0
y_test = np.array(y_test)

print(X_train.shape, y_train.shape)  # (60000, 28, 28) (60000,)

# Build a small fully connected classifier
model = (
    SequentialBuilder()
    .flatten()                                # 28x28 images -> 784-dim vectors
    .dense(784, 128)
    .relu()
    .dense(128, 64, Xavier(InitType.NORMAL))  # Xavier (Glorot) normal init
    .relu()
    .dense(64, 10, Xavier(InitType.NORMAL))
    .softmax()                                # class probabilities
    .optimizer(Adam())
    .loss(CategoricalCrossEntropy())
    .alpha(0.05)                              # learning rate
    .epochs(5)
    .batch(64)
    .build()
)

model.summary()
model.train(X_train, y_train, X_test, y_test)
```

## Contributing

Interested in contributing? Check out the contributing guidelines. Please note that this project is released with a Code of Conduct. By contributing to this project, you agree to abide by its terms.
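For intuition, the last two stages of the builder chain, `.softmax()` and `CategoricalCrossEntropy`, compute the standard quantities sketched below. This is a hedged NumPy illustration of the math, not phitodeep's implementation:

```python
import numpy as np

# Illustrative numpy versions of the final softmax + cross-entropy step;
# phitodeep's own implementation may differ in details.
def softmax(z):
    # Subtract the row max for numerical stability before exponentiating
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def categorical_cross_entropy(probs, labels):
    # labels are integer class ids; take -log of the true-class probability
    n = probs.shape[0]
    return -np.log(probs[np.arange(n), labels] + 1e-12).mean()

logits = np.array([[2.0, 1.0, 0.1],
                   [0.5, 2.5, 0.3]])
probs = softmax(logits)
print(probs.sum(axis=1))  # each row sums to 1
print(categorical_cross_entropy(probs, np.array([0, 1])))
```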
## License

`phitodeep` was created by Ralph Dugue. It is licensed under the terms of the Apache License 2.0.

## Credits

`phitodeep` was created with `cookiecutter` and the `py-pkgs-cookiecutter` template.