Out-of-Distribution (OOD) Generalization
Revision as of 06:53, 27 May 2023
Out-of-Distribution (OOD) generalization refers to the ability of a machine learning model to generalize to new data drawn from a distribution different from the training distribution. The problem is challenging because the test distribution is unknown at training time and may differ from the training distribution in unpredictable ways. Several families of methods aim to improve OOD generalization. According to a survey on the topic, existing methods can be categorized into three groups based on where they sit in the learning pipeline: unsupervised representation learning, supervised model learning, and optimization. Another line of work pursues OOD generalization by learning domain-invariant or hypothesis-invariant features, that is, representations whose statistics remain stable across training domains.
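The idea of domain-invariant features can be illustrated with a minimal sketch. The function below is a hypothetical, simplified moment-matching penalty (real methods such as CORAL also align covariances, and IRM penalizes gradient mismatch instead): it measures how far each training domain's mean feature vector is from the global mean, so minimizing it alongside the task loss would push the feature extractor toward domain-invariant statistics.

```python
import numpy as np

def feature_mean_alignment_penalty(features_by_domain):
    """Squared distance between each domain's mean feature vector
    and the global mean of those means. Zero when all domains share
    the same first-moment feature statistics.

    This is an illustrative sketch, not any specific published method.
    """
    means = [f.mean(axis=0) for f in features_by_domain]
    global_mean = np.mean(means, axis=0)
    return float(sum(np.sum((m - global_mean) ** 2) for m in means))

# Toy example: two "domains" whose features differ by a constant shift.
rng = np.random.default_rng(0)
domain_a = rng.normal(loc=0.0, size=(100, 4))
domain_b = rng.normal(loc=2.0, size=(100, 4))

penalty_shifted = feature_mean_alignment_penalty([domain_a, domain_b])
penalty_aligned = feature_mean_alignment_penalty([domain_a, domain_a])
# penalty_shifted is large because the domains' feature means differ;
# penalty_aligned is exactly zero because the statistics match.
```

In practice such a penalty would be added, with a weighting coefficient, to the ordinary training loss so that the model trades off task accuracy against cross-domain invariance.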