Few-shot learning

Few-shot learning is the ability of a model to learn a new behaviour after being shown only a "few" examples of the desired behaviour.

It is typically useful for:

     1. Data scarcity: Collecting large amounts of labelled data is often impractical and/or costly. Few-shot learning makes it feasible to tackle machine learning tasks with only a handful of training examples.

     2. Rapid adaptation: It enables models to adapt quickly to new tasks or classes without extensive retraining.

     3. Efficient training: It requires fewer computational resources and less time than traditional deep learning methods.

     4. Generalization: It encourages models to generalize from the limited examples available, often producing more robust and versatile systems that also perform well on related tasks.

     5. Low-resource settings: It is particularly helpful where collecting extensive labelled data is challenging, for example in certain medical diagnoses, due to privacy concerns and/or a scarcity of experts.
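One common few-shot approach is nearest-prototype classification (in the spirit of prototypical networks): average the few labelled examples of each class into a "prototype", then assign new inputs to the class with the closest prototype. The sketch below is illustrative only; the function names, toy 2-D features, and data are assumptions, and a real system would operate on learned embeddings rather than raw features.

```python
import numpy as np

def fit_prototypes(support_x, support_y):
    """Compute one prototype (the mean feature vector) per class
    from the few labelled 'support' examples."""
    classes = np.unique(support_y)
    protos = np.stack([support_x[support_y == c].mean(axis=0) for c in classes])
    return classes, protos

def predict(query_x, classes, protos):
    """Assign each query point to the class of its nearest prototype
    (Euclidean distance)."""
    dists = np.linalg.norm(query_x[:, None, :] - protos[None, :, :], axis=-1)
    return classes[dists.argmin(axis=1)]

# A 2-way, 3-shot task: two classes, three labelled examples ("shots") each.
support_x = np.array([[0.0, 0.1], [0.2, 0.0], [0.1, 0.2],
                      [1.0, 1.1], [0.9, 1.0], [1.1, 0.9]])
support_y = np.array([0, 0, 0, 1, 1, 1])

classes, protos = fit_prototypes(support_x, support_y)
preds = predict(np.array([[0.05, 0.05], [1.0, 1.0]]), classes, protos)
print(preds)  # the first query is near class 0, the second near class 1
```

With only three examples per class, the prototype is a crude but surprisingly effective class summary; the quality of the feature space (here hand-crafted, in practice learned) does most of the work.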

Few-shot learning should not be confused with zero-shot learning or one-shot learning; see the entries on zero-shot learning and one-shot learning.
