Use MLflow and TensorFlow 2.0 (Keras) to record all experiments on the Fashion-MNIST dataset.
Updated Nov 21, 2022 · Jupyter Notebook
Neural Network
Playground for trials, attempts and small projects.
This code implements a neural network from scratch without using any libraries.
MachineLearningCurves is a collection of abstract papers, insights, and research notes focusing on various topics in machine learning.
Variance-normalising pre-training of neural networks.
Comparing different methods of weight initialization and optimizers using PyTorch.
This repository explores the impact of various weight initialization methods on a neural network's performance, comparing zero, random, and He initialization. It includes visualizations of cost function and decision boundaries.
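The three schemes the repository above compares can be sketched in a few lines. This is a minimal NumPy sketch, not the repository's actual code; the helper name `init_weights` is hypothetical:

```python
import numpy as np

def init_weights(fan_in, fan_out, method="he", rng=None):
    # Hypothetical helper contrasting the three schemes mentioned above:
    # zero, small random, and He (Kaiming) initialization for ReLU nets.
    rng = rng or np.random.default_rng(0)
    if method == "zero":
        # All-zero weights: every unit computes the same function.
        return np.zeros((fan_in, fan_out))
    if method == "random":
        # Small random Gaussian weights with a fixed std of 0.01.
        return rng.standard_normal((fan_in, fan_out)) * 0.01
    if method == "he":
        # He et al.: std = sqrt(2 / fan_in), chosen so ReLU activation
        # variance stays roughly constant from layer to layer.
        return rng.standard_normal((fan_in, fan_out)) * np.sqrt(2.0 / fan_in)
    raise ValueError(f"unknown method: {method}")

W = init_weights(512, 512, "he")
print(W.std())  # close to sqrt(2/512) ≈ 0.0625
```

In PyTorch the He scheme corresponds to `torch.nn.init.kaiming_normal_`.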
Deep Learning with TensorFlow, Keras, and PyTorch.
Data-driven initialization for neural network models.
Why don't we initialize the weights of a neural network to zero?
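The usual answer is symmetry: if every weight starts at the same constant (zero being the common example), all hidden units compute the same activation and receive the same gradient, so they can never learn different features. A minimal NumPy sketch of one backward pass through an assumed two-layer tanh net with squared loss:

```python
import numpy as np

x = np.array([1.0, -2.0, 0.5])       # one input example
y = 1.0                              # target
W1 = np.full((4, 3), 0.5)            # hidden layer, constant init
W2 = np.full(4, 0.5)                 # output layer, constant init

h = np.tanh(W1 @ x)                  # all four activations are identical
out = W2 @ h                         # scalar prediction
# Backprop for squared loss 0.5 * (out - y) ** 2:
dh = (out - y) * W2 * (1 - h**2)     # identical entries
grad_W1 = np.outer(dh, x)            # every row of the gradient is identical
print(np.allclose(grad_W1, grad_W1[0]))  # → True
```

Because every row of `grad_W1` is identical, gradient descent updates all hidden units in lockstep; random (or He) initialization breaks this symmetry.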
This repo describes how deep learning weights can be prepared before training so that they capture essential information about the data fed to the network.
RNN-LSTM: From Applications to Modeling Techniques and Beyond - Systematic Review
My completed exercises from Andrej Karpathy's tutorial series Neural Networks: Zero to Hero.
Making a Deep Learning Framework with C++
FloydHub porting of deeplearning.ai course assignments
How weight initialization affects forward and backward passes of a deep neural network
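The forward-pass effect can be seen in a short sketch (assumed 10-layer ReLU MLP in NumPy, not taken from the repository above): weights drawn too small shrink the activations layer by layer, while the He scale sqrt(2 / fan_in) keeps their magnitude roughly stable, and the backward pass suffers the analogous vanishing or stable gradients:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((1000, 256))         # batch of random inputs

def forward_std(scale, layers=10, width=256):
    # Push the batch through `layers` ReLU layers whose weights are
    # Gaussian with standard deviation `scale`, and report the spread
    # of the final activations.
    h = x
    for _ in range(layers):
        W = rng.standard_normal((width, width)) * scale
        h = np.maximum(h @ W, 0.0)           # ReLU forward pass
    return h.std()

print(forward_std(0.01))                     # collapses toward zero
print(forward_std(np.sqrt(2.0 / 256)))       # stays near the input scale
```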
Neural_Networks_From_Scratch
Predict the burned area of forest fires with neural networks.
A curated list of awesome deep learning techniques for training, testing, optimizing, and regularizing deep neural networks.