Preventing Overfitting

The model in the picture is overfitting.

When your model makes good predictions on the same data that was used to train it but performs poorly on data it hasn't seen before, we say that the model is overfitting.
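A quick way to see this numerically is to compare training error against held-out error. Here is a minimal sketch using a high-degree polynomial as a stand-in for an over-parameterized model (the data, degree, and split are all illustrative choices, not from any particular experiment):

```python
import numpy as np

# Toy overfitting demo: a high-degree polynomial (a stand-in for an
# over-parameterized model) fit to a small, noisy training set.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 30)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, size=x.shape)

x_train, y_train = x[::2], y[::2]   # 15 training points
x_test, y_test = x[1::2], y[1::2]   # 15 held-out points

coeffs = np.polyfit(x_train, y_train, deg=10)  # plenty of capacity
train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
# Training error ends up noticeably lower than held-out error:
# the model is memorizing the training points rather than the curve.
```

The gap between `train_mse` and `test_mse` is the signature of overfitting; the techniques below all attack that gap from different angles.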

Here are 7 ways you can deal with overfitting in deep learning neural networks.

  1. Train your model on more data

The more data you feed the model, the more likely it is to generalize instead of memorizing the training set. Look at the relationship between dataset size and error.

https://s3-us-west-2.amazonaws.com/secure.notion-static.com/2a8992ce-d512-4c0b-b0ca-6ddb1b2acd7d/Ektjp1EX0AA2nA4.jpeg
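That relationship can be sketched with the same toy setup: hold the model's capacity fixed and vary only the amount of training data. Everything here (the target curve, noise level, and polynomial degree) is an illustrative assumption:

```python
import numpy as np

def heldout_error(n_train, seed=0):
    """Fit a fixed-capacity model (degree-5 polynomial) to n_train noisy
    samples of sin(2*pi*x) and measure its error against the clean
    target on a dense grid."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(0, 1, n_train)
    y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, n_train)
    coeffs = np.polyfit(x, y, deg=5)
    grid = np.linspace(0, 1, 200)
    return np.mean((np.polyval(coeffs, grid) - np.sin(2 * np.pi * grid)) ** 2)

# Same model, more data: the fit tracks the true curve more closely.
err_small = heldout_error(10)
err_large = heldout_error(500)
```

With only 10 samples the model largely fits the noise; with 500 samples the noise averages out and the held-out error drops.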

  2. Augment your dataset

You can automatically augment your dataset by transforming existing images in different ways to make the data more diverse. Some examples: horizontal flips, small rotations, random crops, and brightness or color shifts.
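A bare-bones sketch of the idea, using NumPy array operations instead of a full augmentation library (the specific transforms and ranges are illustrative):

```python
import numpy as np

def augment(img, rng):
    """Return a randomly transformed copy of an H x W (x C) image array:
    a random horizontal flip plus a small horizontal shift."""
    out = img
    if rng.random() < 0.5:
        out = out[:, ::-1]               # horizontal flip
    shift = int(rng.integers(-2, 3))     # shift by -2..2 pixels (wraps around)
    out = np.roll(out, shift, axis=1)
    return out

rng = np.random.default_rng(42)
img = np.arange(16.0).reshape(4, 4)      # stand-in for a real image
variants = [augment(img, rng) for _ in range(8)]  # 8 "new" training images
```

In practice you would use a library such as torchvision or Keras preprocessing layers, which apply these transforms on the fly during training so each epoch sees slightly different versions of every image.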

  3. Make your model simpler

The more complex your model is, the more capacity it has to memorize the dataset (and hence the more easily it will overfit).

Simplifying the model will force it to generalize.

https://s3-us-west-2.amazonaws.com/secure.notion-static.com/bfea0954-ee2d-4f99-846b-48785c66bb3d/EktoNskXYAAlH_F.jpeg

https://s3-us-west-2.amazonaws.com/secure.notion-static.com/20cae229-4362-4b0d-a5ed-8fd3ad77ef3b/EktoTSnWkAAlJ9K.png
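One rough proxy for a model's capacity is its parameter count. The helper and the layer widths below are hypothetical (an MNIST-sized input of 784 features is assumed), but they show how quickly trimming layer widths shrinks a fully connected network:

```python
def mlp_param_count(layer_sizes):
    """Weights plus biases of a fully connected net with the given layer widths."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

# Hypothetical classifiers: a wide net vs. a slimmed-down one.
big = mlp_param_count([784, 512, 512, 10])   # 669,706 parameters
small = mlp_param_count([784, 64, 10])       # 50,890 parameters
```

Dropping one hidden layer and narrowing the other cuts the parameter count by more than 13x, which correspondingly reduces the network's capacity to memorize the training set.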