Traditional forecasting models
This wiki page is dedicated to the discussion of traditional time series models and how they fit into the giotto-time
framework.
A good overview of some fundamental forecasting methods is available here.
A fitting procedure for ARIMA is described here. Ideally, we should support everything mentioned there.
- Naive methods (Average, Naive, Seasonal naive, Drift). These are simple to implement and belong in any time series library; the implementation should be straightforward and compatible with our framework.
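The four baselines can be sketched in a few lines of plain Python (function names are illustrative, not part of the giotto-time API):

```python
def average_forecast(y, h):
    """Forecast the historical mean for each of the next h steps."""
    mean = sum(y) / len(y)
    return [mean] * h

def naive_forecast(y, h):
    """Repeat the last observed value."""
    return [y[-1]] * h

def seasonal_naive_forecast(y, h, m):
    """Repeat the last full seasonal cycle of period m."""
    return [y[len(y) - m + (i % m)] for i in range(h)]

def drift_forecast(y, h):
    """Extrapolate the straight line through the first and last observations."""
    slope = (y[-1] - y[0]) / (len(y) - 1)
    return [y[-1] + (i + 1) * slope for i in range(h)]
```

Besides being useful on their own, these serve as the benchmarks any more sophisticated model should beat.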
Many decomposition methods are available. We should include some of them in the library.
- (optional) Classical. It appears to be deprecated, though.
- (optional) X11. Better than the previous one, but the implementation does not look easy.
- (optional) SEATS. Only for monthly and quarterly data.
- STL decomposition. The only disadvantage of STL compared to the previous methods is that it does not handle trading-day and calendar variations automatically.
Ideally, we should provide a sort of decomposition
suite in the library, with default parameters that work well in most cases and advanced options for more sophisticated decompositions.
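To illustrate what such a suite would cover, here is a minimal classical additive decomposition in plain Python (a sketch for even seasonal periods only; the function name is hypothetical, not library API):

```python
def classical_additive_decompose(y, m):
    """Classical additive decomposition, sketched for an even period m.

    trend: centered 2xm moving average (None where the window is incomplete)
    seasonal: average detrended value per season, normalized to sum to zero
    """
    n = len(y)
    half = m // 2
    trend = [None] * n
    for t in range(half, n - half):
        w = y[t - half : t + half + 1]
        # 2xm-MA: half weight on the two edge points of the (m+1)-wide window
        trend[t] = (w[0] / 2 + sum(w[1:-1]) + w[-1] / 2) / m
    buckets = [[] for _ in range(m)]
    for t in range(n):
        if trend[t] is not None:
            buckets[t % m].append(y[t] - trend[t])
    means = [sum(b) / len(b) for b in buckets]
    grand = sum(means) / m
    seasonal = [s - grand for s in means]
    return trend, seasonal
```

STL and X11 are considerably more involved, but they follow the same trend/seasonal/remainder structure, so the suite could expose a common interface.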
- Logarithmic transformation. Useful, for example, when forecasts must be positive.
- Box-Cox transformation
- A general way to compute prediction intervals would be through bootstrapping (to investigate further).
- Another approach worth investigating is Gradient Boosting.
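The bootstrap idea can be sketched as follows: resample in-sample one-step residuals of the forecasting method and add them to the point forecast to obtain empirical intervals. This is a simplified sketch (`bootstrap_intervals` and `forecast_fn` are hypothetical names, and a production version would simulate full forecast paths rather than shifting the point forecast):

```python
import random

def bootstrap_intervals(y, forecast_fn, h, level=0.9, n_boot=200, seed=0):
    """Empirical prediction intervals from resampled one-step residuals.

    forecast_fn(history, h) must return a list of h point forecasts.
    """
    rng = random.Random(seed)
    # in-sample one-step-ahead residuals of the forecasting method
    resid = [y[t] - forecast_fn(y[:t], 1)[0] for t in range(1, len(y))]
    point = forecast_fn(y, h)
    sims = [[p + rng.choice(resid) for p in point] for _ in range(n_boot)]
    alpha = (1 - level) / 2
    lo, hi = [], []
    for i in range(h):
        s = sorted(sim[i] for sim in sims)
        lo.append(s[int(alpha * n_boot)])
        hi.append(s[int((1 - alpha) * n_boot) - 1])
    return point, lo, hi
```

The appeal of this approach is that it makes no distributional assumption about the errors, which fits a library that supports arbitrary regression models.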
Analysis of the residuals is very important for time series forecasting. We should provide tools to automatically analyze and plot the residuals, and in particular to check that they have the following properties:
- Uncorrelated, e.g. checked with the ACF or Portmanteau tests.
- Zero mean.
- Constant variance.
- Normally distributed.
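The first two checks can be sketched in plain Python; the ±1.96/√n band is the standard large-sample white-noise approximation, and the function names are illustrative, not library API:

```python
import math

def acf(x, max_lag):
    """Sample autocorrelation function at lags 1..max_lag."""
    n = len(x)
    mean = sum(x) / n
    c0 = sum((v - mean) ** 2 for v in x) / n
    return [
        sum((x[t] - mean) * (x[t + k] - mean) for t in range(n - k)) / n / c0
        for k in range(1, max_lag + 1)
    ]

def residual_checks(resid, max_lag=10):
    """Rough diagnostics: mean near zero and all ACF values inside
    the +-1.96/sqrt(n) white-noise band."""
    n = len(resid)
    band = 1.96 / math.sqrt(n)
    rho = acf(resid, max_lag)
    return {
        "zero_mean": abs(sum(resid) / n) < band,
        "uncorrelated": all(abs(r) < band for r in rho),
    }
```

A proper Portmanteau test (Ljung-Box) would aggregate the squared autocorrelations into a single chi-squared statistic, but the per-lag band check above already catches the most common failures.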
We already implemented a nice interface for regression models.
What is missing is a way to automatically add many potentially useful predictors to the X
matrix. What I have in mind is something similar to the fast.ai
get_transforms()
function, which automatically generates time series features.
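A minimal sketch of what such a feature generator could look like (the name and signature are hypothetical, not the fast.ai API nor an existing giotto-time function):

```python
def make_lag_features(y, lags=(1, 2, 3), window=3):
    """Build an X matrix of lagged values plus a rolling mean,
    dropping rows with incomplete history; returns (X, target)."""
    start = max(max(lags), window)
    X, target = [], []
    for t in range(start, len(y)):
        row = [y[t - k] for k in lags]          # lagged observations
        row.append(sum(y[t - window : t]) / window)  # rolling mean
        X.append(row)
        target.append(y[t])
    return X, target
```

A real version would likely return a labeled DataFrame and cover calendar features (day of week, month, holidays) as well, so that any regression model in the library can consume the result directly.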
The standard method is simple; however, it is not linear in the weights, and it is not clear how to fit it into our framework.
Exponential smoothing is not limited to a single method: effort has been put into extending the base model with trend and seasonality components, and nine variations are available depending on how those components are added.
Be careful: exponential smoothing requires a non-linear optimization to find the best parameters, so the implementation needs to be optimized for speed.
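The base (simple) method, with a crude grid search standing in for the non-linear optimization, can be sketched as follows (a real implementation would use a proper optimizer; names are illustrative):

```python
def simple_exp_smoothing(y, alpha):
    """Level update l_t = alpha * y_t + (1 - alpha) * l_{t-1};
    the one-step-ahead forecast is the final level."""
    level = y[0]
    for v in y[1:]:
        level = alpha * v + (1 - alpha) * level
    return level

def fit_alpha(y):
    """Pick alpha minimizing in-sample one-step squared errors over a coarse grid."""
    def sse(a):
        level, err = y[0], 0.0
        for v in y[1:]:
            err += (v - level) ** 2
            level = a * v + (1 - a) * level
        return err
    return min((i / 100 for i in range(1, 100)), key=sse)
```

The trend and seasonal variants add one recursive update per component, so the same optimize-the-smoothing-parameters structure carries over; this is where the speed concern above becomes real, since each candidate parameter set requires a full pass over the series.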
The basic hypothesis is a stationary time series.
- AR part: already there. We should investigate whether different methods for solving the linear system are currently in use.
- MA part: the crucial part. How do we add it to our framework? Here is a reference on how to compute the coefficients.
- I part: necessary, since the input time series needs to be stationary. We may want to integrate this part with the seasonality/trend removal part of the library.
- SARIMA: as with the I part, we need to synchronize with the seasonality part of the library to keep things consistent.
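The I part amounts to differencing; a minimal sketch covering both ordinary and seasonal differencing (helper names are hypothetical):

```python
def difference(y, d=1, lag=1):
    """Apply d rounds of lag-differencing: lag=1 for the ARIMA I part,
    lag=m for the seasonal differencing used in SARIMA."""
    for _ in range(d):
        y = [y[t] - y[t - lag] for t in range(lag, len(y))]
    return y

def invert_difference(last_values, diffs, lag=1):
    """Undo one round of differencing, given the last `lag` original values,
    so forecasts made on the differenced scale map back to the original one."""
    out = list(last_values)
    for v in diffs:
        out.append(out[-lag] + v)
    return out[len(last_values):]
```

Keeping the inversion step explicit matters for the integration with the trend/seasonality-removal part of the library: whatever removes a component must also be able to restore it on the forecasts.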
Furthermore, we need to implement:
- ACF and PACF. They are used to determine the orders p and q in ARIMA.
- AIC and BIC.
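Under Gaussian errors, AIC and BIC reduce to simple functions of the in-sample sum of squared errors and the number of parameters. This is a common approximation, not necessarily the exact formula the library would adopt:

```python
import math

def aic(n, sse, k):
    """Akaike information criterion under Gaussian errors: n*ln(SSE/n) + 2k."""
    return n * math.log(sse / n) + 2 * k

def bic(n, sse, k):
    """Bayesian information criterion: n*ln(SSE/n) + k*ln(n)."""
    return n * math.log(sse / n) + k * math.log(n)
```

BIC penalizes extra parameters more heavily for n > 7 (since ln(n) > 2), so the two criteria can disagree on the preferred ARIMA order; exposing both lets the user choose.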
Different approaches to tackle the problem:
- Bottom-up. Forecast the low-level time series and then sum them up.
- Top-down. Apparently only for hierarchical structures. Forecast the top-level time series and split it into its low-level components using fixed proportions; there are also methods that forecast the proportions themselves.
- Middle-level. A mix of the previous two: the time series to be forecast is a mid-level one in the hierarchy diagram.
- Optimal reconciliation. First produce independent forecasts for all series at all levels of the hierarchy, then use a regression model solved by a Generalised Least Squares (GLS) estimator to combine them into coherent forecasts.
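The bottom-up approach, for instance, fits in a few lines (names are hypothetical; `hierarchy` maps each aggregate node to the bottom-level series it contains):

```python
def bottom_up(bottom_forecasts, hierarchy):
    """Bottom-up reconciliation sketch: each aggregate node's forecast
    is the element-wise sum of its children's bottom-level forecasts."""
    return {
        node: [sum(step) for step in zip(*(bottom_forecasts[s] for s in series))]
        for node, series in hierarchy.items()
    }
```

Top-down and optimal reconciliation need more machinery (proportion estimates and a GLS solve, respectively), but they produce output in the same shape, which suggests a common reconciliation interface for the library.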