Multi-Task Learning (MTL)
* [[Learning Techniques]]
Training a single neural network to apply multiple kinds of labels at inference time, for example, to the objects seen in an image. Multi-task learning is a form of supervised learning in which one model is fit on a dataset that addresses several related problems, designed so that training across the tasks improves performance compared with training on any single task. It is especially useful when one task has abundant labeled data that can be shared with a related task that has far less.

A common setup presents the same input patterns for multiple different outputs, or supervised learning problems. Each output is predicted by its own part of the model, while the shared core learns a representation that generalizes across the tasks for the same inputs. A popular example is a single word embedding that learns a distributed representation of words in text and is then shared across multiple natural language processing tasks.
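The shared-core idea can be made concrete with a short sketch. The following is a minimal, hypothetical example in PyTorch of hard parameter sharing: one shared trunk feeds two task-specific heads, and a joint loss trains both at once. All dimensions, task names, and the random stand-in data are illustrative assumptions, not details from this article.

<syntaxhighlight lang="python">
# Minimal sketch of hard parameter sharing for multi-task learning.
# Dimensions, task definitions, and data are illustrative assumptions.
import torch
import torch.nn as nn

class MultiTaskNet(nn.Module):
    def __init__(self, in_dim=64, hidden=128, n_classes_a=10, n_classes_b=5):
        super().__init__()
        # Shared trunk: learns a representation common to both tasks.
        self.trunk = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        # Task-specific heads: each predicts a different set of labels.
        self.head_a = nn.Linear(hidden, n_classes_a)  # e.g. object category
        self.head_b = nn.Linear(hidden, n_classes_b)  # e.g. scene type

    def forward(self, x):
        z = self.trunk(x)
        return self.head_a(z), self.head_b(z)

model = MultiTaskNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One illustrative training step on random stand-in data.
x = torch.randn(32, 64)
y_a = torch.randint(0, 10, (32,))
y_b = torch.randint(0, 5, (32,))

logits_a, logits_b = model(x)
# Joint loss: gradients from both tasks update the shared trunk.
loss = loss_fn(logits_a, y_a) + loss_fn(logits_b, y_b)
opt.zero_grad()
loss.backward()
opt.step()
</syntaxhighlight>

Because both losses backpropagate through the same trunk, the task with abundant labels can regularize the representation used by the task with scarce labels, which is the benefit the text above describes.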
<youtube>UdXfsAr4Gjw</youtube>