Deep Learning – 7: Optimize your Neural Networks through Dropouts & Regularization.


Overfitting of Deep Neural Networks, Bias-Variance trade-off, Dropouts & Regularization


In the classical era of neural networks, practitioners mostly used 2–3 layer architectures.


Photo by Google DeepMind on Unsplash

There were a few hiccups that discouraged deeper architectures, such as:

  • Vanishing gradients, which made deep networks difficult to train (see the sketch after this list).
  • Too little data: small sample sizes led to overfitting.
  • Limited computational power.
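As a rough illustration of the first point, here is a minimal sketch (not from the article, and assuming PyTorch) that compares the gradient magnitude reaching the first layer of a shallow versus a deep sigmoid network; the layer width and depths are illustrative choices.

```python
# Hypothetical sketch: gradient norm at the first layer of a shallow vs. a
# deep sigmoid network, to illustrate vanishing gradients.
import torch
import torch.nn as nn

def first_layer_grad_norm(depth: int) -> float:
    # Build `depth` fully connected layers with sigmoid activations.
    layers = []
    for _ in range(depth):
        layers += [nn.Linear(32, 32), nn.Sigmoid()]
    net = nn.Sequential(*layers, nn.Linear(32, 1))

    x = torch.randn(64, 32)
    loss = net(x).pow(2).mean()
    loss.backward()

    # Gradient norm of the very first weight matrix.
    return net[0].weight.grad.norm().item()

print("3 layers :", first_layer_grad_norm(3))
print("20 layers:", first_layer_grad_norm(20))
# With sigmoid activations, the 20-layer gradient at the first layer is
# typically orders of magnitude smaller than the 3-layer one.
```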

For example:

In deep neural networks, if we have thousands of weights but not millions …
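As a rough illustration of the dropout and L2 regularization the title refers to, a minimal sketch might look like the following; it assumes PyTorch, and the layer sizes, dropout rates, and weight-decay value are illustrative assumptions rather than the article's own choices.

```python
# Hypothetical sketch: a small classifier with Dropout layers and L2
# regularization (via the optimizer's weight_decay) to reduce overfitting.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Dropout(p=0.5),        # randomly zero 50% of activations during training
    nn.Linear(256, 64),
    nn.ReLU(),
    nn.Dropout(p=0.3),
    nn.Linear(64, 10),
)

# weight_decay adds an L2 penalty on the weights to the loss.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()                  # dropout active during training
x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
loss = criterion(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()

model.eval()                   # dropout disabled at inference time
```

During training, dropout randomly silences units so the network cannot rely on any single co-adapted feature, while the L2 penalty keeps the thousands of weights small; at evaluation time dropout is switched off and the full network is used.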
