Shad Amethyst | b4a97694a6 | 🎨 Clean things up, add unit tests, add one hot layer | 2 years ago
Shad Amethyst | bca56a5557 | ✨ Re-order arguments of neura_layer, implement softmax and normalization | 2 years ago
Shad Amethyst | 220c61ff6b | ✨ Dropout layers | 2 years ago
Shad Amethyst | 8ac82e20e2 | ✨ Working backpropagation :3 | 2 years ago
Shad Amethyst | 7a6921a1c1 | ✨ 🔥 Semi-working training, although it seems to only want to converge to zero | 2 years ago
Shad Amethyst | d3d5f57a2b | ✨ 🔥 Attempt at backpropagation (My head is tired :/) | 2 years ago
Shad Amethyst | 5a20acf595 | ✨ Add NeuraNetwork | 2 years ago
Shad Amethyst | 7759a6615d | 🎉 First commit | 2 years ago