Principal Component Analysis (PCA)
YouTube ... Quora ... Google search ... Google News ... Bing News
- Backpropagation ... Activation Functions ... Loss ... Boosting ... Gradient Descent ... Hyperparameter ... Manifold Hypothesis ... PCA
- AI Solver ... Algorithms ... Administration ... Model Search ... Discriminative vs. Generative ... Optimizer ... Train, Validate, and Test
- ...find outliers
- Supervised Learning ... Semi-Supervised ... Self-Supervised ... Unsupervised
- Embedding: Search ... Clustering ... Recommendation ... Anomaly Detection ... Classification ... Dimensional Reduction ... ...find outliers
- T-Distributed Stochastic Neighbor Embedding (t-SNE) ... non-linear
- How to Calculate Principal Component Analysis (PCA) from Scratch in Python | Jason Brownlee - Machine Learning Mastery
- Data Science Concepts Explained to a Five-year-old | Megan Dibble - Towards Data Science
- Causation vs. Correlation - Multivariate Additive Noise Model (MANM)
- Independent Component Analysis (ICA) | University of Helsinki
- Linear Non-Gaussian Acyclic Model (ICA-LiNGAM) | S. Shimizu, P. Hoyer, A. Hyvarinen, and A. Kerminen - University of Helsinki
- Greedy DAG Search (GDS) | Alain Hauser and Peter Bühlmann
- Feature-to-Feature Regression for a Two-Step Conditional Independence Test | Q. Zhang, S. Filippi, S. Flaxman, and D. Sejdinovic
- A Beginner's Guide to Eigenvectors, Eigenvalues, PCA, Covariance and Entropy Learning | Chris Nicholson - A.I. Wiki pathmind
- Everything you did and didn't know about PCA | Alex Williams - It's Neuronal
The goal of Principal Component Analysis (PCA) is to reduce the dimensionality of a data set consisting of a large number of interrelated variables while retaining as much as possible of the variation present in the data. This is accomplished by linearly transforming the data into a new coordinate system in which most of the variation can be described with fewer dimensions than the original data. The new dimensions are called principal components; they are uncorrelated and ordered by the amount of variance they explain. PCA can help you simplify large data tables, visualize multidimensional data, and identify hidden patterns in your data. As a data reduction technique, it allows multidimensional data sets to be simplified to 2 or 3 dimensions for plotting purposes and visual variance analysis.
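A minimal sketch of this use case, assuming scikit-learn and NumPy are available; the synthetic data and variable names here are illustrative only, not part of the original page:

# Reduce a multidimensional data set to 2 principal components for plotting
# and variance analysis (assumes scikit-learn and NumPy; data is synthetic).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))               # synthetic data: 100 samples, 5 interrelated variables

X_std = StandardScaler().fit_transform(X)   # center (and standardize) the data
pca = PCA(n_components=2)                   # keep the first 2 principal components
scores = pca.fit_transform(X_std)           # project each sample onto those components

print(scores.shape)                         # (100, 2) -- ready for a 2-D scatter plot
print(pca.explained_variance_ratio_)        # share of variance each component explains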
- Center (and standardize) the data
- First principal component axis
  - Passes through the centroid of the data cloud
  - The perpendicular distance of each point to that line is minimized, so the line runs along the maximum variation of the data cloud
- Second principal component axis
  - Orthogonal to the first principal component
  - Along the next-largest variation in the data
- The first PCA axis becomes the x-axis and the second PCA axis the y-axis
- Continue the process until the necessary number of principal components is obtained (a from-scratch sketch of these steps follows this list)
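The sketch below follows the steps above from scratch, assuming only NumPy; the example matrix X (samples in rows, variables in columns) is illustrative and not from the original page:

# From-scratch PCA via the covariance matrix (assumes NumPy; X is example data).
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 4))                 # example data cloud: 50 points, 4 variables

X_centered = X - X.mean(axis=0)              # center the data on its centroid
cov = np.cov(X_centered, rowvar=False)       # covariance matrix of the centered data

eigvals, eigvecs = np.linalg.eigh(cov)       # eigen-decomposition of the symmetric covariance
order = np.argsort(eigvals)[::-1]            # order axes by the variance they explain
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

components = eigvecs[:, :2]                  # first two principal component axes (orthogonal)
projected = X_centered @ components          # projections become the new x- and y-axes

print(projected.shape)                       # (50, 2)
print(eigvals / eigvals.sum())               # fraction of variance along each axis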
NumXL