Types of Meta Learning

Any meta learning system should include the following elements.

  • The system must include a learning subsystem.
  • Experience is gained by exploiting meta knowledge extracted from previous learning episodes or from different domains.
  • Learning biases must be chosen dynamically.

Here are some common approaches.

  • Using recurrent networks with external or internal memory.
  • Learning effective distance metrics.
  • Explicitly optimising model parameters for fast learning.

Model-Based

Model-based meta learning systems update their parameters rapidly with only a few training steps. This can be achieved through the model's own internal architecture, or by having the model controlled by another meta learner system.

Memory-Augmented Neural Networks

Memory-Augmented Neural Networks, otherwise known as MANNs, are known for their ability to encode new information rapidly, and thus to adapt to a new task after only a small number of examples.
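The core mechanism behind a MANN's rapid encoding is content-based addressing: a query key is compared against every memory slot and a soft, attention-weighted read is returned. The function names, shapes, and toy memory below are illustrative assumptions, not the original model:

```python
import numpy as np

def cosine_similarity(key, memory):
    """Similarity between a query key and every row of the memory matrix."""
    key_norm = key / (np.linalg.norm(key) + 1e-8)
    mem_norm = memory / (np.linalg.norm(memory, axis=1, keepdims=True) + 1e-8)
    return mem_norm @ key_norm

def read_memory(key, memory):
    """Soft read: attention weights over memory rows, then a weighted sum."""
    scores = cosine_similarity(key, memory)
    weights = np.exp(scores) / np.exp(scores).sum()  # softmax attention
    return weights @ memory

# Three stored memory rows; the query is closest to the first row.
memory = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
retrieved = read_memory(np.array([1.0, 0.1]), memory)
```

Because the read is differentiable, the network can be trained end to end to decide what to write and retrieve for each new task.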

Meta Networks

Meta Networks learn meta level knowledge across a wide range of tasks and shift their inductive biases through fast parameterization for rapid generalization.

Metric Based

Metric-based meta learning is similar in spirit to nearest neighbors algorithms: a query's label is predicted as a weighted combination of support-set labels, with the weights generated by a kernel function. The knowledge it aims to acquire is a metric or distance function over objects. What makes a good metric is problem-dependent: it should capture the relationship between inputs in the task space in a way that facilitates problem solving.
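A minimal sketch of this kernel-weighted prediction, assuming a Gaussian kernel over Euclidean distance (both illustrative choices):

```python
import numpy as np

def kernel_predict(query, support_x, support_y, n_classes, bandwidth=1.0):
    """Predict a label as a kernel-weighted vote over the support set."""
    dists = np.linalg.norm(support_x - query, axis=1)
    weights = np.exp(-(dists ** 2) / (2 * bandwidth ** 2))  # Gaussian kernel
    weights = weights / weights.sum()
    # Accumulate weight mass per class, then pick the heaviest class.
    scores = np.zeros(n_classes)
    for w, y in zip(weights, support_y):
        scores[y] += w
    return int(np.argmax(scores))

support_x = np.array([[0.0, 0.0], [0.1, 0.1], [5.0, 5.0]])
support_y = np.array([0, 0, 1])
pred = kernel_predict(np.array([0.2, 0.0]), support_x, support_y, n_classes=2)
```

In real metric-based methods the distance is computed between learned embeddings rather than raw inputs; the embedding network is what meta-training optimises.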

Convolutional Siamese Neural Network

A Siamese Neural Network is composed of two identical networks whose outputs are trained jointly. The aim is to learn the relationship between a pair of input data samples. The two networks are identical, sharing the same weights and network parameters.
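The weight sharing can be sketched with a single matrix standing in for the twin networks: both inputs pass through the same parameters, and the pair is scored by the distance between their embeddings. The linear-tanh embedding and L1 distance here are simplifying assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 8))  # shared weights used by both branches

def embed(x):
    """One branch of the Siamese pair; both branches call this same function."""
    return np.tanh(x @ W)

def pair_distance(x1, x2):
    """Smaller distance means the pair is judged more similar."""
    return np.abs(embed(x1) - embed(x2)).sum()

a = np.ones(4)
same = pair_distance(a, a)   # identical inputs collapse to zero distance
diff = pair_distance(a, -a)
```

Because both branches are literally the same function, the comparison is symmetric by construction, which is the point of the architecture.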

Matching Networks

These networks learn a network that maps a small labelled support set and an unlabelled example to its label, which eliminates the need to fine-tune to adapt to new class types.
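The mapping can be sketched as attention over the labelled support set: a softmax over negative distances weights the support labels into a distribution for the unlabelled query. One-hot labels and Euclidean distance are illustrative assumptions:

```python
import numpy as np

def matching_predict(query, support_x, support_onehot):
    """Attention-weighted label distribution for one query example."""
    dists = np.linalg.norm(support_x - query, axis=1)
    attn = np.exp(-dists) / np.exp(-dists).sum()  # attention over support set
    return attn @ support_onehot                  # distribution over classes

support_x = np.array([[0.0, 0.0], [1.0, 1.0]])
support_onehot = np.eye(2)  # one support example per class
probs = matching_predict(np.array([0.1, 0.0]), support_x, support_onehot)
```

New classes only require new support examples, not new parameters, which is why no fine-tuning step is needed.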

Relation Network

This network is trained end to end from scratch. During the meta learning phase, it learns to learn a deep distance metric and compares small numbers of images within episodes, each designed to simulate the few-shot setting.
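What distinguishes a Relation Network from fixed-metric methods is that the comparison itself is learned: a small module scores the concatenation of two embeddings instead of applying a hand-picked distance. The tiny two-layer scorer below is an illustrative stand-in for that module:

```python
import numpy as np

rng = np.random.default_rng(1)
W1 = rng.standard_normal((8, 6))
W2 = rng.standard_normal((6, 1))

def relation_score(emb_a, emb_b):
    """Learned similarity in (0, 1) for a pair of 4-d embeddings."""
    pair = np.concatenate([emb_a, emb_b])  # concatenate, don't subtract
    hidden = np.maximum(pair @ W1, 0.0)    # ReLU
    return 1.0 / (1.0 + np.exp(-(hidden @ W2)[0]))  # sigmoid relation score

s = relation_score(np.ones(4), np.ones(4))
```

During episodic training, W1 and W2 would be optimised jointly with the embedding network rather than fixed at random as here.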

Prototypical Networks

Prototypical networks learn a metric space in which classification is performed by computing distances to prototype representations of each class. They reflect a simpler inductive bias that produces better results in the limited-data regime.
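A minimal sketch of the prototype computation: each class prototype is the mean of its support embeddings, and a query takes the label of the nearest prototype. Identity embeddings are assumed for brevity:

```python
import numpy as np

def prototypes(support_x, support_y, n_classes):
    """Mean embedding per class: one prototype vector for each class."""
    return np.stack([support_x[support_y == c].mean(axis=0)
                     for c in range(n_classes)])

def proto_predict(query, protos):
    """Label of the nearest class prototype."""
    return int(np.argmin(np.linalg.norm(protos - query, axis=1)))

support_x = np.array([[0.0, 0.0], [0.2, 0.0], [4.0, 4.0], [4.2, 4.0]])
support_y = np.array([0, 0, 1, 1])
protos = prototypes(support_x, support_y, n_classes=2)
pred = proto_predict(np.array([0.1, 0.1]), protos)
```

Averaging to one prototype per class is the simplification that makes the method robust with very few examples.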

Optimisation Based

The aim of this type of meta learning is to adjust the optimization algorithm itself so that the model can learn effectively when provided with very few examples.

LSTM Meta Learner

An LSTM-based meta learner learns the exact optimization algorithm used to train another learner neural network classifier in the few-shot regime. Its parameterization allows it to learn parameter updates appropriate for the scenario in which a set number of updates will be made.
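The key observation is that a gradient-descent update resembles an LSTM cell-state update: the parameters play the role of the cell state, updated as theta_t = f_t * theta_(t-1) - i_t * gradient, where the forget gate f_t and input gate i_t would normally be produced by the trained LSTM. Fixed gate values stand in for the LSTM's outputs in this sketch:

```python
# Toy demonstration of the LSTM-style learned update rule.
def lstm_style_update(theta, grad, forget_gate=1.0, input_gate=0.1):
    """One meta-learned parameter update; gates are per-parameter in general."""
    return forget_gate * theta - input_gate * grad

# Minimise L(theta) = theta^2 using the learned-style rule.
theta = 5.0
for _ in range(100):
    theta = lstm_style_update(theta, grad=2.0 * theta)
```

With a forget gate of 1 and an input gate equal to the learning rate, this reduces to plain gradient descent; meta-training lets the LSTM choose both gates adaptively per step instead.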

MAML

MAML, which is short for Model-Agnostic Meta-Learning, is a general optimization algorithm compatible with any model that learns through gradient descent.
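The MAML structure can be sketched on toy scalar tasks L_t(theta) = (theta - c_t)^2, where each task t has its own target c_t: the inner loop takes one gradient step per task, and the outer loop updates the shared initialisation through that step. Step sizes and targets below are illustrative assumptions:

```python
def maml_step(theta, targets, inner_lr=0.1, outer_lr=0.05):
    """One MAML meta-update over a batch of toy quadratic tasks."""
    meta_grad = 0.0
    for c in targets:
        # Inner loop: one task-specific gradient step on L(theta) = (theta - c)^2.
        adapted = theta - inner_lr * 2.0 * (theta - c)
        # Outer gradient of (adapted - c)^2 w.r.t. theta, differentiated
        # *through* the inner update (the (1 - 2*inner_lr) factor).
        meta_grad += 2.0 * (adapted - c) * (1.0 - 2.0 * inner_lr)
    return theta - outer_lr * meta_grad  # outer loop: update initialisation

targets = [-1.0, 1.0, 3.0]  # optimal initialisation is their mean, 1.0
theta = 10.0
for _ in range(200):
    theta = maml_step(theta, targets)
```

The initialisation converges toward the point from which every task is reachable in one cheap gradient step, which for these symmetric quadratics is the mean of the targets.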