Introduction to Time Series Modeling

In business applications, time is often the most important factor in improving success rates, yet most companies struggle to keep up with it. Fortunately, the development of technology has given us many effective methods for predicting the future. Don’t worry, this article isn’t about time machines; it’s about practical stuff. One kind of prediction deals with time-dependent data, and this way of modeling such data is called time series modeling. A time series model can uncover hidden information in time-related data to assist decision making. Most companies use time series data to analyze yearly sales, website traffic, competitive position and much more. However, many people are unfamiliar with the field of time series analysis. So, if you don’t know about time series models, this article will introduce you to the processing steps of a time series model and its related techniques:

1. Introduction to time series models
2. Exploring time series data using R
3. Introduction to the ARMA time series model
4. The framework and application of the ARIMA time series model

Let’s get started

1. Introduction to time series model

Let’s begin. This section covers stationary series, the random walk, the Rho coefficient, and the Dickey-Fuller test of stationarity. If you don’t know these yet, don’t worry – each concept is covered in detail in this section, and I bet you’ll enjoy the introduction.

Stationary series

There are three criteria for judging whether a series is stationary:

1. The mean is a constant independent of time t. The figure below (left) satisfies this condition, while the figure below (right) is clearly time-dependent.

2. The variance is a constant independent of time t. This property is called homogeneity of variance. The figure below shows what is and is not homogeneous variance. (Note the varying spread over time on the right-hand side.)

3. The covariance is a constant that depends only on the time interval k, not on time t. In the figure below (right), you can notice that the spread narrows as time increases, so the covariance of the red series is not constant.

Why do we care about stationary time series?

You can’t build a time series model unless your series is stationary. In many cases the stationarity condition is not satisfied, so the first thing to do is to make the series stationary, and then model it with a stochastic model. There are several ways to stationarize data, such as eliminating long-term trends and differencing.

Random walk

This is a basic concept of time series analysis, and you’re probably familiar with it. However, many people in industry still mistake the random walk for a stationary series. In this section, I’ll use some simple math to clear up this concept. Let’s start with an example.

Example: imagine a girl moving randomly around a giant chessboard. Here, her next position depends only on her last position.


(source: scifun.chem.wisc.edu/WOP/RandomW…).

Now imagine you’re in a closed room and can’t see the girl, but you want to predict where she will be at different times. How accurately can you predict? Naturally, your predictions get worse as time goes on. At time t = 0 you know exactly where she is. At the next moment she moves to one of the eight adjacent squares, so the probability of a correct prediction drops to 1/8, and it keeps falling from there. Let’s formalize the sequence:

X(t) = X(t-1) + Er(t)

where Er(t) captures the girl’s randomness at each point in time.

 

Now, if we recurse over all the time points, we end up with the following equation:

X(t) = X(0) + Sum(Er(1), Er(2), Er(3), ..., Er(t))

 

Now, let’s test the random walk against the stationarity criteria.

1. Is the mean constant?

E[X(t)] = E[X(0)] + Sum(E[Er(1)], E[Er(2)], E[Er(3)], ..., E[Er(t)])

 

We know that the expected value of each random disturbance term is 0. Therefore: E[X(t)] = E[X(0)] = constant.

2. Is the variance constant?

Var[X(t)] = Var[X(0)] + Sum(Var[Er(1)], Var[Er(2)], Var[Er(3)], ..., Var[Er(t)])

Var[X(t)] = t * Var(Error), which depends on time

 

Therefore, we infer that the random walk is not a stationary process because it has a time-varying variance. Furthermore, if we examine the covariance, we see that the covariance depends on time.
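To make this concrete, here is a minimal R sketch (my own illustration, not part of the original analysis) that simulates a random walk; the growing spread is easy to see:

set.seed(42)
n  <- 1000
er <- rnorm(n)        # random disturbances Er(t) with mean 0
x  <- cumsum(er)      # X(t) = X(0) + Er(1) + ... + Er(t), with X(0) = 0
plot.ts(x, main = "Simulated random walk")
var(x[1:100])         # the spread over an early window...
var(x)                # ...is much smaller than over the whole series

(The sample variance over a window is only a rough proxy for Var[X(t)], but it shows the time dependence clearly enough for eyeballing.)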

Let’s do something even more interesting

We already know that a random walk is a non-stationary process. Let’s introduce a new coefficient into the equation and see whether we can formulate a check for stationarity.

The Rho coefficient

X(t) = Rho * X(t-1) + Er(t)

 

Now we’ll vary Rho and see whether we can make the series stationary. Note that we are only eyeballing the plots here, not running a formal stationarity test.
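The four experiments below can be reproduced with a short simulation. This is only a sketch under assumed noise and series length; stats::filter() with method = "recursive" computes exactly X(t) = Rho * X(t-1) + Er(t):

set.seed(1)
er  <- rnorm(500)                       # shared noise Er(t) for a fair comparison
sim <- function(rho) stats::filter(er, filter = rho, method = "recursive")
par(mfrow = c(2, 2))                    # 2 x 2 grid of plots
for (rho in c(0, 0.5, 0.9, 1)) {
  plot.ts(sim(rho), main = paste("Rho =", rho), ylab = "X(t)")
}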

Let’s start with a completely stationary sequence with Rho=0. Here’s a diagram of the time series:



Increase the value of Rho to 0.5 and we get the following figure:



You might notice that the cycles have become broader, but essentially there doesn’t seem to be a serious violation of the stationarity assumption. Now let’s take the more extreme case of Rho = 0.9.



We still see the series return from extreme values toward zero after a few intervals. This series too shows no obvious non-stationarity. Now, let’s take a random walk with Rho = 1.



This is clearly a violation of the stationarity conditions. What makes Rho = 1 so special that the series fails the stationarity test? Let’s look at the math.

The expectation of X(t) = Rho * X(t-1) + Er(t) is:

E[X(t)] = Rho * E[X(t-1)]

 

This formula makes intuitive sense. The next value of X (at time point t) is pulled toward Rho times the previous value of X. For example, if X(t-1) = 1 and Rho = 0.5, then E[X(t)] = 0.5. Now, if X moves away from zero in any direction, it is pulled back toward zero at the next step; the only component that can push it away is the error term. But what happens when Rho becomes 1? Then nothing pulls X back toward zero anymore.

Dickey-Fuller test of stationarity

The final concept to learn here is the Dickey-Fuller test. In statistics, the Dickey-Fuller test checks for the presence of a unit root in an autoregressive model. It applies a small adjustment to the Rho equation above, converting it into the Dickey-Fuller test equation:

X(t) = Rho * X(t-1) + Er(t)
=> X(t) - X(t-1) = (Rho - 1) * X(t-1) + Er(t)

 

We test whether Rho - 1 = 0, i.e., whether the coefficient on X(t-1) is significantly different from zero. If this null hypothesis is rejected, we have a stationary time series. Stationarity testing, and transforming a series into a stationary one, are the most important parts of time series modeling, so keep all the concepts in this section in mind for what follows. Now let’s look at a time series example.
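As a quick illustration using the tseries package (the same package the application section below relies on), we can run the test on a simulated random walk and on its differenced version; the exact numbers will vary with the seed:

library(tseries)            # provides adf.test()
set.seed(7)
rw <- cumsum(rnorm(300))    # a random walk: contains a unit root
adf.test(rw, alternative = "stationary")         # typically fails to reject the unit root
adf.test(diff(rw), alternative = "stationary")   # differencing makes it stationary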

2. Use R to explore time series

In this section we will learn how to use R to process time series. Here we are just exploring time series, not modeling time series. The data used in this section is the built-in data in R: AirPassengers. This data set is the number of international air passengers per month from 1949 to 1960.

Loading the data set

The following code loads the data set and lets us take a first look at it.

> class(AirPassengers)
[1] "ts"
# The data is stored as a time series ("ts") object
> start(AirPassengers)
[1] 1949 1
# The series starts in January 1949
> end(AirPassengers)
[1] 1960 12
# The series ends in December 1960
> frequency(AirPassengers)
[1] 12
# The cycle of this time series is 12 months
> summary(AirPassengers)
   Min. 1st Qu.  Median    Mean 3rd Qu.    Max.
  104.0   180.0   265.5   280.3   360.5   622.0

 

Visualizing the data

# The number of passengers distributed across the years, with a fitted trend line
> plot(AirPassengers)
> abline(reg = lm(AirPassengers ~ time(AirPassengers)))

 

> cycle(AirPassengers)
     Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec
1949   1   2   3   4   5   6   7   8   9  10  11  12
1950   1   2   3   4   5   6   7   8   9  10  11  12
 ...
1960   1   2   3   4   5   6   7   8   9  10  11  12
# Each observation's position within the yearly cycle
> plot(aggregate(AirPassengers, FUN = mean))
# Aggregate to yearly means and plot the year-on-year trend
> boxplot(AirPassengers ~ cycle(AirPassengers))
# Box plot by month to show the seasonal effect

 

Important inference

1. The yearly trend shows that the number of passengers is increasing every year.

2. The mean and variance in July and August are much higher than in the other months.

3. The mean varies from month to month while the within-month spread is comparatively small, so the series is strongly seasonal, with a cycle of 12 months or less.

Looking at the data and testing it is the most important part of modeling a time series – without this step you won’t know whether the series is stationary. As this example shows, we already know a lot of details about the series. Next, we will build some time series models, look at their characteristics, and make some predictions.

3. ARMA time series model

ARMA is short for the autoregressive moving average model, which is commonly used with time series data. In an ARMA model, AR stands for autoregression and MA stands for moving average. If these terms sound complicated, don’t worry – we’ll explain them in a few minutes. Before we start, remember that neither AR nor MA applies to non-stationary series: if you have a non-stationary series in practice, the first thing to do is turn it into a stationary one (by differencing or transformation), and only then choose a time series model. First, this article introduces the two models (AR and MA) separately. Let’s look at their characteristics.

Autoregressive time series model

Let’s understand the AR model through the following example: the current GDP of a country, x(t), depends on last year’s GDP, x(t-1). The assumption is that this year’s GDP is determined by last year’s GDP plus the factories and services newly opened this year, but the dominant dependence is on last year’s GDP. The formula for GDP is then:

x(t) = alpha * x(t-1) + error(t)    (1)

 

This equation is the AR formula. Formula (1) shows that the next point depends mainly on the previous point. Alpha is a coefficient, chosen to minimize the error term. Note that x(t-1) in turn depends on x(t-2), and so on.

For example, suppose x(t) is the amount of juice sold in a city on a given day. In winter, few suppliers carry juice. Suddenly, one day the temperature rises and the demand for juice jumps to 1000 units. After a few days the temperature drops again. But it’s well known that people drink juice on hot days, and on the cold days 50 percent of those people still drink juice. Over the following days the percentage drops to 25 percent (50 percent of 50 percent) and then gradually to a small number. The following illustrates the inertia of an AR series: the effect of a shock decays slowly.
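A tiny sketch of this inertia (illustrative numbers, with alpha = 0.5 as in the juice example): a one-time demand spike of 1000 halves at every step:

spike <- c(1000, rep(0, 9))                        # shock at t = 1, nothing new afterwards
ar1   <- stats::filter(spike, filter = 0.5, method = "recursive")
round(as.numeric(ar1))                             # 1000 500 250 125 62 31 16 8 4 2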

Moving average time series model

Let’s look at another example to understand the moving average model. A company produces a certain type of bag. In a competitive market, sales of the bag hovered near zero. Then one day the owner experimented with the design and produced a different kind of bag that wasn’t available anywhere else. Say the total demand in the market was 1000 of these bags. Demand was so high that the stock soon ran out, leaving about 100 customers who couldn’t buy one that day. Let’s call this shortfall the error at that time point. A few of those customers still came back and bought the bag over the following days. Here is a simple formula describing this scenario:

x(t) = beta * error(t-1) + error(t)

 

If we plot this series, it looks something like this:



Did you notice the difference between the MA and AR models? In the MA model, a shock dies out quickly; in the AR model, its effect lingers for a long time.
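Here is an illustrative comparison (assumed coefficient 0.5 for both models): push a single shock of 100 through an MA(1) and an AR(1) process and watch how long it lingers:

shock <- c(0, 0, 0, 100, 0, 0, 0, 0)                  # a one-time error at t = 4
ma1 <- shock + 0.5 * c(0, head(shock, -1))            # x(t) = error(t) + beta * error(t-1)
ar1 <- stats::filter(shock, filter = 0.5, method = "recursive")
rbind(MA = ma1, AR = round(as.numeric(ar1), 1))       # MA returns to 0 after one lag; AR decays slowly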

The difference between AR model and MA model

The main difference between the AR and MA models lies in the correlation between observations at different time points. In the MA model, the current value is expressed as a linear combination of the random disturbances (prediction errors) of past periods, so the correlation between x(t) and x(t-n) is always zero once n exceeds the model order. The AR model instead captures the influence of past observations of the series itself, so the correlation between x(t) and x(t-n) only decays gradually as the lag n grows. This distinction is exactly what we exploit to tell the two processes apart.

Using ACF and PACF plots

Once we have a stationary time series, we must answer two key questions. Q1: Is this an AR or an MA process? Q2: What order of AR or MA process do we need?

To solve these two problems, we use two coefficients:

the sample autocorrelation function (ACF) of the series x(t) at lag k, and the sample partial autocorrelation function (PACF) at lag k. The formulas are omitted here.

ACF and PACF of AR model:

The calculation proves that:

- The ACF of an AR process tails off: no matter how large the lag k, the ACF decays gradually toward zero instead of dropping to exactly zero.

- The PACF of an AR process cuts off: PACF = 0 for lags k > p.



The blue lines above mark the significance bounds; spikes beyond them are significantly different from 0. Clearly, the PACF plot above cuts off after the second lag, which means this is an AR(2) process.

ACF and PACF of MA model:

- The ACF of an MA process cuts off: ACF = 0 for lags k > q.

- The PACF of an MA process tails off: no matter how large the lag k, the PACF decays gradually toward zero instead of dropping to exactly zero.



Clearly, the ACF plot above cuts off after the second lag, which means this is an MA(2) process.
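Plots of this kind are easy to reproduce with simulated data. A sketch with arbitrarily chosen coefficients (any stationary AR(2) / invertible MA(2) would do):

set.seed(3)
ar2 <- arima.sim(model = list(ar = c(0.5, 0.3)), n = 500)   # an AR(2) process
ma2 <- arima.sim(model = list(ma = c(0.6, 0.4)), n = 500)   # an MA(2) process
par(mfrow = c(2, 2))
acf(ar2);  pacf(ar2)    # ACF tails off, PACF cuts off after lag 2
acf(ma2);  pacf(ma2)    # ACF cuts off after lag 2, PACF tails off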

So far, this article has shown how to use ACF and PACF plots to identify the type of a stationary series. Next, I’ll introduce the overall framework of a time series model and discuss its practical application.

4. Framework and application of ARIMA time series model

So far, this article has quickly covered the basic concepts of time series models, explored time series with R, and introduced the ARMA model. Now let’s pull these pieces together and do something really interesting.

The framework

The framework below shows how to do a time series analysis step by step:



The first three steps were discussed in the previous sections. Still, a quick note on each is worthwhile:

Step 1: Time series visualization

Before building any kind of time series model, it is crucial to analyze its trends. The details we are interested in include trends, cycles, seasonality or random behavior in sequences. This has been introduced in the second part of this article.

Step 2: Series stationarity

Once we know the patterns, trends and cycles, we can check whether the series is stationary. The Dickey-Fuller test is one popular way to do this; it was introduced in the first part of this article. And it doesn’t end there! What if the series turns out to be non-stationary? There are three commonly used techniques for making a time series stationary (a minimal R sketch of all three follows after this list).

1. Trend elimination: here we simply remove the trend component from the time series. For example, suppose the equation of my time series is:

x(t) = (mean + trend * t) + error

 

Here I simply remove the trend * t component from the formula above and model x(t) = mean + error.

2. Differencing: this technique is commonly used to remove non-stationarity. Here we model the differenced series rather than the actual series. For example:

x(t) - x(t-1) = ARMA(p, q)

 

This differencing is the “I” (integrated) part of ARIMA. An ARIMA model therefore has three parameters:

p: order of the AR term
d: degree of differencing
q: order of the MA term

 

3. Seasonality: seasonality can be incorporated directly into the ARIMA model. More on this in the application section below.
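As promised above, here is a minimal sketch of the three techniques applied to the AirPassengers series from section 2. Taking the log first is my own addition to stabilize the growing variance, and the lag-12 difference is just one simple option for seasonality besides handling it inside the ARIMA model itself:

lp <- log(AirPassengers)                    # log transform stabilizes the variance
detrended   <- residuals(lm(lp ~ time(lp))) # 1. trend elimination: strip mean + trend * t
differenced <- diff(lp)                     # 2. differencing: model x(t) - x(t-1)
seasonal    <- diff(lp, lag = 12)           # 3. seasonality: a seasonal (lag-12) difference
plot.ts(differenced)                        # eyeball the stationarized series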

Step 3: Find the optimal parameters

The parameters p and q can be found using the ACF and PACF plots. In addition, if both the autocorrelation (ACF) and partial autocorrelation (PACF) decrease only gradually, it indicates that the series still needs to be stationarized and we need to introduce the d parameter, i.e., difference the series.

Step 4: Build the ARIMA model

Having found these parameters, we can now fit the ARIMA model. The values found in the previous step may only be rough estimates, so we need to explore more combinations of (p, d, q); the parameters with the smallest AIC and BIC are the ones we want. We can also try seasonal components if we notice seasonality in the ACF/PACF plots. A minimal grid-search sketch follows.
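In this sketch the ranges for p and q are illustrative, and d is fixed at 1 as in the AirPassengers example used throughout this article:

best <- NULL
for (p in 0:2) for (q in 0:2) {
  fit <- try(arima(log(AirPassengers), order = c(p, 1, q)), silent = TRUE)
  if (inherits(fit, "try-error")) next             # skip combinations that fail to converge
  if (is.null(best) || AIC(fit) < AIC(best)) best <- fit
}
best                                               # the (p, 1, q) fit with the lowest AIC

The same loop works with BIC(fit) if you prefer the stricter criterion.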

Step 5: Predict

At this point we have the ARIMA model and can make predictions. We can also visualize the trend and cross-validate the model.

Application of the time series model

Here we reuse the earlier AirPassengers example and use this time series to make predictions. We encourage you to look at the data before taking the next step.

Where do we start?

The chart below shows the number of passengers over the years. Before moving on, look at the graph.



Here are my observations:

1. The number of passengers tends to increase year by year.

2. The series appears seasonal, with each cycle no longer than 12 months.

3. The variance of the data increases year by year.

We need to address two issues before testing stationarity. First, we need to remove the unequal variance; we do that by taking the log of the series. Second, we need to deal with the trend; we do that by differencing the series. Now, let’s check the stationarity of the resulting series.

> install.packages("tseries")
> library(tseries)
> adf.test(diff(log(AirPassengers)), alternative = "stationary", k = 0)

        Augmented Dickey-Fuller Test

data:  diff(log(AirPassengers))
Dickey-Fuller = -9.6003, Lag order = 0, p-value = 0.01
alternative hypothesis: stationary

 

We can see that the series is stationary enough for time series modeling. The next step is to find the right parameters for the ARIMA model. We already know that d = 1: one difference was needed to make the series stationary. Next we plot the correlation functions, starting with the ACF plot of the (undifferenced) log series.

# ACF plot of the log series
> acf(log(AirPassengers))

 

What can we see in the plot above?

Clearly the ACF decays very slowly, which means the series is not stationary. As discussed earlier, we intend to regress on the differenced log series rather than on the log series itself. Let’s look at the ACF and PACF curves after differencing.

> acf(diff(log(AirPassengers)))
> pacf(diff(log(AirPassengers)))

 



Clearly the ACF cuts off after the first lag, so we know p should be 0 and q should be 1 or 2. After several iterations we found that AIC and BIC are smallest when (p, d, q) is set to (0, 1, 1).

> fit <- arima(log(AirPassengers), c(0, 1, 1), seasonal = list(order = c(0, 1, 1), period = 12))
> pred <- predict(fit, n.ahead = 10*12)
> ts.plot(AirPassengers, 2.718^pred$pred, log = "y", lty = c(1, 3))

 
