In parametric statistical data processing, several assumptions are tested for a multiple regression analysis:

**multicollinearity, heteroscedasticity, and autocorrelation**. Together, these statistical assumption tests are often referred to as the classical assumption tests of regression. The calculations for these tests can be performed with **SPSS statistical software**.

In this series, we will discuss how to easily calculate and interpret regression tests using

**SPSS**. On this occasion, we give an introduction to the purpose of each of these assumption tests and how to draw conclusions from the SPSS calculations.

**1. Testing the Multicollinearity Assumption**

**Multicollinearity testing** is performed to determine whether or not the independent variables in a multiple regression model are correlated with one another.

Reading the SPSS output for the **multicollinearity assumption** test can be done in at least two ways:

**a. The Magnitude of VIF (Variance Inflation Factor) and Tolerance**

The criteria for a regression model to be considered free of multicollinearity, based on VIF or Tolerance, are:

- a VIF value close to 1;
- a Tolerance value close to 1.

Note: Tolerance = 1 / VIF, or equivalently VIF = 1 / Tolerance.
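SPSS reports VIF and Tolerance directly, but the relationship between them is easy to verify by hand. The sketch below is a minimal pure-Python illustration for the special case of exactly two predictors, where the auxiliary R² equals the squared pairwise correlation, so VIF = 1 / (1 - r²); the data are made up for illustration.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def vif_two_predictors(x1, x2):
    """With exactly two predictors the auxiliary R^2 is r^2, so VIF = 1 / (1 - r^2)."""
    r = pearson_r(x1, x2)
    return 1.0 / (1.0 - r ** 2)

# hypothetical predictor values, for illustration only
x1 = [1, 2, 3, 4, 5, 6]
x2 = [2, 1, 4, 3, 6, 5]   # moderately correlated with x1

vif = vif_two_predictors(x1, x2)
tolerance = 1.0 / vif     # Tolerance = 1 / VIF, as stated above
print(round(vif, 3), round(tolerance, 3))
```

A VIF well above 1 (here about 3.19) signals that the predictors share variance; with more than two predictors, SPSS computes the same quantity from the R² of regressing each predictor on all the others.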

**b. The Magnitude of the Correlation between the Independent Variables**

Whether or not a multiple regression model contains *multicollinearity* can also be determined from the correlation coefficients between the independent variables. If every pairwise correlation coefficient is below 0.5, the model is considered free of *multicollinearity*; if any correlation is strong (0.5 or above), *multicollinearity* is present.
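As a rough illustration of this criterion, the sketch below computes all pairwise Pearson correlations among hypothetical predictors and flags any pair whose absolute correlation reaches 0.5 (the data and variable names are illustrative, not taken from an actual SPSS example).

```python
import math
from itertools import combinations

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def multicollinearity_flags(predictors, threshold=0.5):
    """Return (name, name, r) for every predictor pair with |r| >= threshold."""
    flagged = []
    for (na, xa), (nb, xb) in combinations(predictors.items(), 2):
        r = pearson_r(xa, xb)
        if abs(r) >= threshold:
            flagged.append((na, nb, round(r, 3)))
    return flagged

# hypothetical predictors, for illustration only
predictors = {
    "x1": [1, 2, 3, 4, 5, 6],
    "x2": [2, 1, 4, 3, 6, 5],   # strongly tracks x1
    "x3": [4, 4, 1, 6, 2, 4],   # only weakly related to the others
}
flags = multicollinearity_flags(predictors)
print(flags)   # only the x1/x2 pair exceeds the 0.5 criterion
```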
If **multicollinearity** is found, the next step in the data processing is to remove one of the strongly correlated X variables from the multiple regression model. Another alternative is to use a different statistical model, such as **Bayesian regression or ridge regression**.

**2. Testing the Heteroscedasticity Assumption**

This test examines the pattern of the residual variance from one observation to another. If the residual variance is constant, the condition is called *homoscedasticity*; if the variance differs from one observation to another, it is called *heteroscedasticity*. Ideally, a multiple regression model should not exhibit *heteroscedasticity*.
Using SPSS, this

**heteroscedasticity assumption test** detects the presence or absence of a specific pattern in a scatter plot with the standardized predicted values on one axis and the standardized residuals (the difference between the actual and predicted values) on the other.
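SPSS produces this diagnosis as a scatter plot, but the same idea can be approximated numerically. The sketch below is a simplified Goldfeld-Quandt-style check (not the SPSS graphical method itself) on made-up numbers: it sorts residuals by their fitted values, splits them into a low half and a high half, and compares the residual variances; a ratio far from 1 suggests the spread changes with the predicted value.

```python
def variance(v):
    """Sample variance (n - 1 denominator)."""
    m = sum(v) / len(v)
    return sum((a - m) ** 2 for a in v) / (len(v) - 1)

def split_variance_ratio(fitted, residuals):
    """Sort residuals by fitted value, split in half, and compare variances."""
    pairs = sorted(zip(fitted, residuals))
    res = [r for _, r in pairs]
    half = len(res) // 2
    low, high = res[:half], res[-half:]
    return variance(high) / variance(low)

# hypothetical data: residual spread grows with the fitted value
fitted    = [1, 2, 3, 4, 5, 6, 7, 8]
residuals = [0.1, -0.2, 0.2, -0.3, 0.8, -1.0, 1.5, -1.8]

ratio = split_variance_ratio(fitted, residuals)
print(round(ratio, 2))   # a ratio far from 1 suggests non-constant variance
```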
Conclusions are drawn from the SPSS output graph. If the points form a specific, regular pattern, **heteroscedasticity** has occurred; if the points are scattered randomly with no particular pattern, the model is free of heteroscedasticity.

**3. Testing the Autocorrelation Assumption**

**Statistical analysis of autocorrelation** basically tests whether, in a multiple linear regression model, the disturbance error in period t is correlated with the error in period t-1 (the previous period). Ideally, a multiple regression model is free of

**autocorrelation**.

The presence or absence of

**autocorrelation** can be detected in SPSS from the magnitude of the **Durbin-Watson (DW)** statistic. The DW statistic always falls between 0 and 4, and a common rule of thumb for judging **autocorrelation** is:

- a DW value well below 2 (toward 0) indicates positive autocorrelation;
- a DW value around 2 indicates no autocorrelation;
- a DW value well above 2 (toward 4) indicates negative autocorrelation.

(A formal decision uses the Durbin-Watson tables of lower and upper critical values, dL and dU.)
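SPSS reports the Durbin-Watson statistic directly; the sketch below shows the underlying formula, DW = Σ(e_t - e_{t-1})² / Σe_t², applied to two made-up residual series, to illustrate how slowly drifting residuals push DW toward 0 while sign-alternating residuals push it toward 4.

```python
def durbin_watson(residuals):
    """DW = sum((e_t - e_{t-1})^2) / sum(e_t^2), which always lies in [0, 4]."""
    num = sum((residuals[t] - residuals[t - 1]) ** 2
              for t in range(1, len(residuals)))
    den = sum(e ** 2 for e in residuals)
    return num / den

# hypothetical residual series, for illustration only
drifting    = [1.0, 0.8, 0.9, 1.1, 1.0, 0.9]     # neighbors are similar
alternating = [1.0, -1.0, 1.0, -1.0, 1.0, -1.0]  # neighbors flip sign

dw_pos = durbin_watson(drifting)     # near 0 -> positive autocorrelation
dw_neg = durbin_watson(alternating)  # near 4 -> negative autocorrelation
print(round(dw_pos, 2), round(dw_neg, 2))
```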

These three classical regression assumption tests, and how to interpret them, will be discussed further in the next articles of this statistical data processing series. Hopefully this article adds to your insight into the research process you are carrying out.

For those of you working on research for a *skripsi* (undergraduate thesis), essay, thesis, or dissertation, you can learn more about these terms and practice further using the SPSS program. Thanks.