Rust Neural Network

4 devlogs
21h 24m 26s

Writing a neural network from scratch in Rust, with as few crates as possible :)

(to learn how everything works lol)

This project uses AI

The demo website is made by AI. That’s it. ONLY the interface :)

Demo Repository

Kubik

Shipped this project!

Hours: 21.41
Cookies: 🍪 640
Multiplier: 29.9 cookies/hr

Oh my gosh that was crazy!! Learned how neural networks learn and run and implemented ALL OF THAT in Rust, all by myself 🦀


I’ve provided demo videos in the GitHub README if you can’t run the demo yourself :p

Kubik

I’m done!!

New magic:

  • Demo website to run the model! (recognize digits)
  • Better demo model (trained longer)
  • and yeah GitHub stuff like the README :p

Thank you for following me during this journey! Learned so much!!! :D

Kubik

Oh my! ✨

  • Back-propagation! You can now train models!
  • Optimizers! SGD or Adam (via AdamBuilder; Adam is better)
  • Batch training! Using accumulators and apply (really cool, makes training faster)
  • LOTS of optimizations (got down from 204s to 111s per 10 iters, with lots of memory optims lol)
  • lots of sweating lol
  • Dropout layers: make some neural nets more robust… works badly with really small nets tho
  • super simple network builder!
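The accumulate-then-apply batch training mentioned above can be sketched roughly like this (a minimal sketch with my own names like `Accumulator`, not the actual repo code; this shows a plain averaged-SGD step, while Adam would additionally keep running moment estimates per weight):

```rust
// Hypothetical sketch: gradients for a batch are summed into an
// accumulator, and the weights are only touched once per batch.
struct Accumulator {
    grad_sum: Vec<f64>,
    count: usize,
}

impl Accumulator {
    fn new(len: usize) -> Self {
        Self { grad_sum: vec![0.0; len], count: 0 }
    }

    // Add one sample's gradient without updating the weights yet.
    fn accumulate(&mut self, grad: &[f64]) {
        for (s, g) in self.grad_sum.iter_mut().zip(grad) {
            *s += *g;
        }
        self.count += 1;
    }

    // Apply the averaged gradient once per batch (plain SGD step),
    // then reset the accumulator for the next batch.
    fn apply(&mut self, weights: &mut [f64], lr: f64) {
        let n = self.count.max(1) as f64;
        for (w, s) in weights.iter_mut().zip(&self.grad_sum) {
            *w -= lr * *s / n;
        }
        self.grad_sum.iter_mut().for_each(|s| *s = 0.0);
        self.count = 0;
    }
}

fn main() {
    let mut weights = vec![1.0, 2.0];
    let mut acc = Accumulator::new(2);
    acc.accumulate(&[0.5, 1.0]);
    acc.accumulate(&[1.5, 1.0]);
    acc.apply(&mut weights, 0.1); // average gradient is [1.0, 1.0]
    println!("{:?}", weights); // [0.9, 1.9]
}
```

Batching like this is also why it's faster: one weight update per batch instead of one per sample, and the gradient buffers can be reused instead of reallocated.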

What noooooww 🚀

  • I guess a usable demo would be awesome, like a web interface to try out the networks? I mean that would def be fire :p
  • Maybe also more demos, will see

🦀 learned so much stuff!! like wow that was such a good project idea! :DD

Kubik

🦀 What’s new

kinda everything new lol

  • Activation functions (implemented ReLU, Sigmoid, Tanh, Linear) & Softmax
  • CLEAN MATRIX OPERATIONS, type safe! Compiler will say nuh uuhh if the dimensions don’t match :DD (that’s so fricking awesome)
  • you can use operators + * - for matrix operations directly (operator overloading is awesome)
  • Layer structure with weight, bias and activation
  • Network structure, allows you to add layers (type safe! can’t add a layer whose input doesn’t match the previous output)
  • Network allows you to run(input) and gives you the output of the neural network
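The compile-time dimension checking and operator overloading described above can be sketched with const generics (my own minimal version, not the repo's actual types): multiplying a `Matrix<R, K>` by a `Matrix<K, C>` only compiles when the inner dimensions agree, so a shape mismatch is a compile error, not a runtime panic.

```rust
use std::ops::Mul;

// Hypothetical sketch: a matrix whose dimensions live in the type.
struct Matrix<const R: usize, const C: usize> {
    data: [[f64; C]; R],
}

// Operator overloading: `&a * &b` only type-checks when the
// inner dimension K matches on both sides.
impl<const R: usize, const K: usize, const C: usize> Mul<&Matrix<K, C>> for &Matrix<R, K> {
    type Output = Matrix<R, C>;

    fn mul(self, rhs: &Matrix<K, C>) -> Matrix<R, C> {
        let mut out = Matrix { data: [[0.0; C]; R] };
        for i in 0..R {
            for j in 0..C {
                for k in 0..K {
                    out.data[i][j] += self.data[i][k] * rhs.data[k][j];
                }
            }
        }
        out
    }
}

fn main() {
    let a = Matrix::<2, 3> { data: [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]] };
    let b = Matrix::<3, 1> { data: [[1.0], [0.0], [1.0]] };
    let c = &a * &b; // inferred as Matrix<2, 1>
    println!("{:?}", c.data); // [[4.0], [10.0]]
    // `&a * &a` would not compile: inner dimensions 3 and 2 don't match.
}
```

The same idea extends to layers: a layer from N inputs to M outputs can carry N and M in its type, which is how the network builder can reject mismatched layers at compile time.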

And now…. 😅

TIME FOR TRAININGG!! (backpropagation gonna be hardddd)
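For reference, the activation functions listed above can be sketched as standalone scalar helpers (a minimal sketch, not the project's actual code; the real network presumably applies them element-wise over a layer's output):

```rust
// Element-wise activations.
fn relu(x: f64) -> f64 { x.max(0.0) }
fn sigmoid(x: f64) -> f64 { 1.0 / (1.0 + (-x).exp()) }
fn tanh(x: f64) -> f64 { x.tanh() }
fn linear(x: f64) -> f64 { x }

// Softmax is different: it acts on a whole vector, turning raw scores
// into probabilities that sum to 1. Subtracting the max keeps exp() from
// overflowing without changing the result.
fn softmax(v: &[f64]) -> Vec<f64> {
    let max = v.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = v.iter().map(|x| (x - max).exp()).collect();
    let sum: f64 = exps.iter().sum();
    exps.iter().map(|e| e / sum).collect()
}

fn main() {
    assert_eq!(relu(-2.0), 0.0);
    assert_eq!(linear(3.5), 3.5);
    assert!((sigmoid(0.0) - 0.5).abs() < 1e-12);
    assert!((tanh(0.0)).abs() < 1e-12);
    let p = softmax(&[1.0, 2.0, 3.0]);
    assert!((p.iter().sum::<f64>() - 1.0).abs() < 1e-12);
}
```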


Sorry for basically no screenshots but there’s not much to show… lol

Kubik

Woohoo!! I wrote useful functions for matrix operations:

  • multiplication, scalar multiplication, addition, subtraction, transposition and mapping with a function
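Two of the ops from that list, transposition and mapping, can be sketched like this (my own hypothetical versions over a `Vec<Vec<f64>>` matrix, assuming rectangular input; the repo's actual representation may differ):

```rust
// Hypothetical sketch: transpose flips rows and columns.
fn transpose(m: &[Vec<f64>]) -> Vec<Vec<f64>> {
    let rows = m.len();
    let cols = m[0].len();
    (0..cols)
        .map(|j| (0..rows).map(|i| m[i][j]).collect())
        .collect()
}

// Mapping applies a function to every element, which is exactly
// what's needed to run an activation function over a layer output.
fn map<F: Fn(f64) -> f64>(m: &[Vec<f64>], f: F) -> Vec<Vec<f64>> {
    m.iter()
        .map(|row| row.iter().map(|&x| f(x)).collect())
        .collect()
}

fn main() {
    let m = vec![vec![1.0, 2.0], vec![3.0, 4.0]];
    println!("{:?}", transpose(&m)); // [[1.0, 3.0], [2.0, 4.0]]
    println!("{:?}", map(&m, |x| x * 2.0)); // [[2.0, 4.0], [6.0, 8.0]]
}
```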

holy moly i got multiplication on my own!!

-> I tried not to use AI & Internet AT ALL, and I MADE IT YIIIPEEE

oh and the benchmark times aren’t directly comparable: multiplication was sometimes timed on small matrices, while addition and the other ops got HUGE ones


Comments

kevinchou0813 3 months ago

wow thats really cool