
Here I have written code for the Adam, Momentum, and RMSProp optimizers in JAX. JAX is a library built mainly for high-performance machine learning research.
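The three update rules can be sketched as pure functions on JAX arrays. This is a minimal illustration, not the code from the notebooks; the hyperparameter names and default values (lr, beta, eps, etc.) are illustrative assumptions.

```python
import jax.numpy as jnp

# Hyperparameter names and defaults below are illustrative, not taken
# from the repository's notebooks.

def momentum_update(w, g, v, lr=0.01, beta=0.9):
    # v accumulates an exponentially decaying sum of past gradients
    v = beta * v + g
    return w - lr * v, v

def rmsprop_update(w, g, s, lr=0.001, beta=0.9, eps=1e-8):
    # s tracks a running average of squared gradients, which scales
    # the step size per parameter
    s = beta * s + (1 - beta) * g ** 2
    return w - lr * g / (jnp.sqrt(s) + eps), s

def adam_update(w, g, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    # Adam combines a momentum-style first moment (m) with an
    # RMSProp-style second moment (v), plus bias correction for the
    # zero-initialised moment estimates (t is the 1-based step count)
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g ** 2
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    return w - lr * m_hat / (jnp.sqrt(v_hat) + eps), m, v
```

Writing each rule as a function that returns both the new weights and the new optimizer state fits JAX's functional style, where state is threaded explicitly rather than mutated in place.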

To find out more about JAX, check out:

https://github.com/google/jax

https://github.com/hips/autograd

https://www.tensorflow.org/xla

The dataset I used is MNIST. The code for the RMSProp optimizer is in test.ipynb, Momentum in test1.ipynb, and Adam in test3.ipynb.
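A full training loop in JAX follows the same pattern regardless of which optimizer is used: compute gradients with jax.grad, then apply the update rule. The sketch below minimises a toy quadratic with a momentum step; the loss function and hyperparameters are stand-ins, not the MNIST pipeline from the notebooks.

```python
import jax
import jax.numpy as jnp

# Toy objective standing in for the MNIST loss: minimum at w = 3
def loss(w):
    return jnp.sum((w - 3.0) ** 2)

grad_fn = jax.grad(loss)

@jax.jit
def momentum_step(w, v, lr=0.1, beta=0.9):
    # One optimizer step: gradient via jax.grad, then the momentum rule
    g = grad_fn(w)
    v = beta * v + g
    return w - lr * v, v

w = jnp.zeros(3)
v = jnp.zeros(3)  # momentum state, initialised to zero
for _ in range(100):
    w, v = momentum_step(w, v)
# w is now close to the minimiser [3, 3, 3]
```

Decorating the step with @jax.jit compiles it via XLA, which is where JAX's speed comes from; swapping in the RMSProp or Adam rule only changes the body of the step function and the state it carries.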

Check out the code at:

https://github.com/rajanarasimhan/Jax_Optimiser
