Difference between revisions of "Forecasting"
<youtube>XK3cEJw93jA</youtube>
<b>Lecture 5: VAR and VEC Models</b><br>This is Lecture 5 in my Econometrics course at Swansea University. Watch live on The Economic Society [[Meta|Facebook]] page every Monday at 2:00 pm (UK time), October 2nd - December 2017. http://facebook.com/TheEconomicSociety/ In this lecture, I explain how to estimate a vector autoregressive (VAR) model. We start with the autoregressive process, which describes the behaviour of a time series, and show how to present such a process in different forms. We then cover the basic conditions required to estimate a VAR model: the data need to be stationary, the optimal lag length must be chosen, and the model must be stable. After estimation, we can test for causality among variables using Granger causality tests. Because VAR models are often difficult to interpret, we can use impulse responses and variance decompositions. Impulse responses trace out the responsiveness of the dependent variables in the VAR to shocks to the error term: a unit shock is applied to each variable and its effects are noted. Variance decomposition offers a slightly different way of examining VAR dynamics: it gives the proportion of the movements in the dependent variables that are due to their "own" shocks versus shocks to the other variables, and so indicates the relative importance of each shock to the variables in the VAR. We also cover the concept of cointegration and how to test for it, then discuss the Error Correction Model and the Vector Error Correction Model (VECM).
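The estimation step described in the lecture can be sketched numerically. The following is a minimal illustration (not the lecturer's code) that fits a two-variable VAR(1) by ordinary least squares on simulated data; a real analysis would use a library such as statsmodels, which also provides lag selection, stability checks, Granger tests, impulse responses, and variance decompositions.

```python
import numpy as np

# Simulate a stable two-variable VAR(1): y_t = A @ y_{t-1} + e_t
rng = np.random.default_rng(0)
A = np.array([[0.5, 0.1],
              [0.2, 0.4]])          # eigenvalues 0.6 and 0.3: inside the unit circle, so stable
T = 5000
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A @ y[t - 1] + rng.normal(scale=0.1, size=2)

# Estimate A by OLS: regress y_t on y_{t-1} (intercept omitted for brevity)
Y, X = y[1:], y[:-1]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T

print(np.round(A_hat, 2))
```

With enough observations, the least-squares estimate recovers the coefficient matrix closely; equation-by-equation OLS is consistent for a stationary VAR.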
Revision as of 21:33, 8 February 2023
YouTube search... ...Google search
Contents
- 1 Qualitative Forecasting
- 2 Quantitative Forecasting
- 2.1 Time Series Forecasting
- 2.1.1 Time Series AutoML
- 2.1.2 Time Series Forecasting - Statistical
- 2.1.2.1 Autoregression (AR)
- 2.1.2.2 Moving Average (MA)
- 2.1.2.3 Autoregressive Moving Average (ARMA)
- 2.1.2.4 Autoregressive Integrated Moving Average (ARIMA)
- 2.1.2.5 Seasonal Autoregressive Integrated Moving-Average (SARIMA)
- 2.1.2.6 Seasonal Autoregressive Integrated Moving-Average with Exogenous Regressors (SARIMAX)
- 2.1.2.7 Vector Autoregression (VAR)
- 2.1.2.8 Volume Weighted Moving Average (VWMA)
- 2.1.2.9 Vector Autoregression Moving-Average (VARMA)
- 2.1.2.10 Vector Autoregression Moving-Average with Exogenous Regressors (VARMAX)
- 2.1.3 Smoothing
- 2.1.3.1 Simple Exponential Smoothing (SES)
- 2.1.3.2 Holt's Exponential Smoothing
- 2.1.3.3 Winter's (Holt-Winter's) Exponential Smoothing (HWES)
- 2.1.4 Time Series Forecasting - Deep Learning
- 3 Demand Forecasting
Qualitative Forecasting
Delphi
Quantitative Forecasting
Time Series Forecasting
- How to Tune LSTM Hyperparameters with Keras for Time Series Forecasting | Matt Dancho
- How (not) to use Machine Learning for time series forecasting: Avoiding the pitfalls | Vegard Flovik - KDnuggets
- Time Series Prediction - 8 Techniques | Siraj Raval
- Amazon Forecast | AWS
- 7 Ways Time-Series Forecasting Differs from Machine Learning | Roman Josue de las Heras Torres
- Finding Patterns and Outcomes in Time Series Data - Hands-On with Python | ViralML.com
- Applying Statistical Modeling and Machine Learning to Perform Time-Series Forecasting | Tamara Louie
- Stationarity in time series analysis | Shay Palachy - Towards Data Science
Time Series AutoML
Time Series Forecasting - Statistical
Classical time series forecasting methods focus on linear relationships; nevertheless, they are sophisticated and perform well on a wide range of problems, provided your data is suitably prepared and the method is well configured. 11 Classical Time Series Forecasting Methods in Python (Cheat Sheet) | Jason Brownlee - Machine Learning Mastery
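As a concrete instance of such a classical linear method, the sketch below (an illustration, not taken from the cheat sheet) fits an AR(1) model by least squares on a simulated series and produces a one-step-ahead forecast.

```python
import numpy as np

# Simulate an AR(1) series: x_t = phi * x_{t-1} + e_t
rng = np.random.default_rng(1)
phi = 0.7
x = np.zeros(2000)
for t in range(1, len(x)):
    x[t] = phi * x[t - 1] + rng.normal(scale=1.0)

# Estimate phi by least squares of the series on its own first lag
phi_hat = np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])

# One-step-ahead forecast from the last observation
forecast = phi_hat * x[-1]
print(round(phi_hat, 2))
```

Libraries such as statsmodels wrap this estimation (with intercepts, higher lag orders, and diagnostics), but the least-squares core is this simple.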
Autoregression (AR)
YouTube search... ...Google search
Moving Average (MA)
YouTube search... ...Google search
Autoregressive Moving Average (ARMA)
YouTube search... ...Google search
Autoregressive Integrated Moving Average (ARIMA)
YouTube search... ...Google search
Seasonal Autoregressive Integrated Moving-Average (SARIMA)
YouTube search... ...Google search
Seasonal Autoregressive Integrated Moving-Average with Exogenous Regressors (SARIMAX)
YouTube search... ...Google search
Vector Autoregression (VAR)
YouTube search... ...Google search
Volume Weighted Moving Average (VWMA)
YouTube search... ...Google search
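The VWMA has a simple closed form: within each window of n periods, prices are weighted by their traded volume. A minimal from-scratch sketch (the function name and example inputs are illustrative):

```python
def vwma(prices, volumes, n):
    """Volume Weighted Moving Average over a rolling window of n periods."""
    out = []
    for i in range(n - 1, len(prices)):
        p = prices[i - n + 1 : i + 1]   # window of prices
        v = volumes[i - n + 1 : i + 1]  # matching window of volumes
        out.append(sum(pi * vi for pi, vi in zip(p, v)) / sum(v))
    return out

prices  = [10, 11, 12, 13]
volumes = [100, 200, 100, 400]
print(vwma(prices, volumes, 2))  # heavy-volume bars pull the average toward their price
```

Compared with a plain moving average, the final value here sits nearer 13 because the last bar traded four times the volume of the one before it.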
Vector Autoregression Moving-Average (VARMA)
YouTube search... ...Google search
Vector Autoregression Moving-Average with Exogenous Regressors (VARMAX)
YouTube search... ...Google search
Smoothing
Simple Exponential Smoothing (SES)
YouTube search... ...Google search
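SES maintains a single smoothed level, level_t = alpha * x_t + (1 - alpha) * level_{t-1}, and forecasts flat at the last level. A minimal sketch (initialising the level with the first value, one common convention among several):

```python
def ses(series, alpha):
    """Simple Exponential Smoothing: the forecast is the last smoothed level."""
    level = series[0]                     # initialise with the first observation
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level                          # flat forecast for all horizons

print(ses([3.0, 5.0, 9.0, 20.0], alpha=0.5))  # 13.25
```

A larger alpha weights recent observations more heavily; alpha = 1 reduces to a naive last-value forecast.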
Holt's Exponential Smoothing
YouTube search... ...Google search
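Holt's method extends SES with a trend term, so forecasts extrapolate a line rather than staying flat. A minimal sketch of the additive-trend recursions (the initialisation here is one common convention, not the only one):

```python
def holt(series, alpha, beta, horizon):
    """Holt's linear-trend exponential smoothing."""
    level = series[1]                     # simple initialisation
    trend = series[1] - series[0]
    for x in series[2:]:
        prev_level = level
        level = alpha * x + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + horizon * trend        # extrapolate the trend h steps ahead

# A perfectly linear series is forecast exactly
print(holt([1.0, 2.0, 3.0, 4.0], alpha=0.5, beta=0.5, horizon=2))  # 6.0
```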
Winter's (Holt-Winter's) Exponential Smoothing (HWES)
YouTube search... ...Google search
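Holt-Winters adds a seasonal component of period m on top of level and trend. A from-scratch sketch of the additive form (the initialisation is simplified for illustration; library implementations estimate it more carefully):

```python
def holt_winters_additive(series, m, alpha, beta, gamma, horizon):
    """Additive Holt-Winters: level + trend + seasonal component of period m."""
    level = sum(series[:m]) / m                         # mean of the first season
    trend = (sum(series[m:2 * m]) - sum(series[:m])) / (m * m)
    season = [x - level for x in series[:m]]            # initial seasonal offsets
    for t in range(m, len(series)):
        x = series[t]
        prev_level = level
        level = alpha * (x - season[t % m]) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season[t % m] = gamma * (x - level) + (1 - gamma) * season[t % m]
    t = len(series)
    return level + horizon * trend + season[(t + horizon - 1) % m]

# A pure two-period cycle is continued exactly
print(holt_winters_additive([0, 10, 0, 10, 0, 10, 0, 10],
                            m=2, alpha=0.5, beta=0.5, gamma=0.5,
                            horizon=1))  # 0.0: the next point in the 0,10 cycle
```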
Time Series Forecasting - Deep Learning
Applying deep learning methods such as Multilayer Neural Networks and Long Short-Term Memory (LSTM) Recurrent Neural Network models to time series forecasting problems. | Jason Brownlee - Machine Learning Mastery
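Before any such network is trained, the series must be reframed as supervised (input window, next value) pairs. The helper below (an illustrative, framework-free sketch) shows that framing step; the resulting arrays would then feed an MLP directly or, reshaped to (samples, timesteps, features), an LSTM.

```python
import numpy as np

def make_windows(series, n_steps):
    """Frame a univariate series as (samples, timesteps) inputs and
    next-step targets: the supervised format sequence models train on."""
    X, y = [], []
    for i in range(len(series) - n_steps):
        X.append(series[i : i + n_steps])   # input window
        y.append(series[i + n_steps])       # value to predict
    return np.array(X), np.array(y)

X, y = make_windows([10, 20, 30, 40, 50, 60], n_steps=3)
print(X.shape, y.shape)  # (3, 3) (3,)
# X[0] = [10, 20, 30] predicts y[0] = 40
```

The choice of n_steps is a hyperparameter: it fixes how much history each training example sees.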
Demand Forecasting