Correlation and Regression
Testing the significance of the difference between two means, two standard deviations, two proportions, or two
percentages is an important area of inferential statistics. Comparisons between two or more variables often arise in
research and experiments, and to draw valid conclusions from the results of a study, one must apply an
appropriate test statistic. This chapter discusses the different test statistics that are commonly used in
research studies.
Correlation Analysis
Correlation analysis quantifies the association between two continuous variables, for example, between a
dependent and an independent variable, or between two independent variables.
Regression Analysis
Regression analysis assesses the relationship between an outcome variable and one or more explanatory
variables. The outcome variable is known as the dependent or response variable, while the risk factors and
confounders are known as predictors or independent variables. In regression analysis, the dependent variable is
denoted by "y" and the independent variables are denoted by "x".
Correlation analysis estimates the sample correlation coefficient, denoted by r. It ranges between -1 and +1
and quantifies the strength and direction of the linear association between two variables. The correlation
between two variables can be either positive, i.e. a higher level of one variable is associated with a higher level of the other,
or negative, i.e. a higher level of one variable is associated with a lower level of the other.
The sign of the correlation coefficient indicates the direction of the association, while its magnitude
indicates the strength of the association.
For example, a correlation of r = 0.8 indicates a strong positive association between two variables, while a
correlation of r = -0.3 indicates a weak negative association. A correlation close to zero indicates the absence of
linear association between two continuous variables.
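The interpretation above can be sketched as a small Python helper. The cut-off values below (0.7 for "strong", 0.4 for "moderate") are common rules of thumb, not definitions from this text; different references use different thresholds.

```python
def describe_correlation(r):
    """Classify a correlation coefficient by direction and strength.

    The magnitude cut-offs are illustrative rules of thumb only.
    """
    if not -1.0 <= r <= 1.0:
        raise ValueError("r must lie between -1 and +1")
    if r == 0:
        return "no linear association"
    direction = "positive" if r > 0 else "negative"
    magnitude = abs(r)
    if magnitude >= 0.7:
        strength = "strong"
    elif magnitude >= 0.4:
        strength = "moderate"
    else:
        strength = "weak"
    return f"{strength} {direction} association"

print(describe_correlation(0.8))   # strong positive association
print(describe_correlation(-0.3))  # weak negative association
```

This mirrors the worked examples in the text: r = 0.8 is classified as strong and positive, r = -0.3 as weak and negative.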
Linear Regression
Linear regression is a linear approach to modeling the relationship between a scalar response (the dependent
variable) and one or more independent variables. A regression with one independent variable is known as simple linear
regression; one with more than one independent variable is known as multiple linear regression. Linear
regression focuses on the conditional probability distribution of the response given the values of the predictors, rather than on their joint
probability distribution. In practice, most real-world regression models involve multiple predictors, so the term
linear regression often refers to multiple linear regression.
Republic of the Philippines
CAMARINES SUR POLYTECHNIC COLLEGES
Nabua, Camarines Sur
The correlation coefficient is measured on a scale that runs from -1 through 0 to +1. Perfect
correlation between two variables is represented by either +1 or -1. The correlation is positive when the two variables
increase together, and negative when one decreases as the other increases. A value of 0 indicates the absence of
correlation.
r = [n∑xy – (∑x)(∑y)] / √{[n∑x² – (∑x)²][n∑y² – (∑y)²]}

Where,
n = Number of paired observations
∑x, ∑y = Sums of the x and y values
∑xy = Sum of the products of the paired x and y values
∑x², ∑y² = Sums of the squared x and y values

The formula above is used to find the correlation coefficient for the given data. Based on the value obtained,
we can determine how strong the association between the two variables is.
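As a minimal sketch in Python, the correlation coefficient can be computed directly from these sums. The data set below is made up for illustration.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient from the computational formula:
    r = [nΣxy - (Σx)(Σy)] / sqrt([nΣx² - (Σx)²][nΣy² - (Σy)²])
    """
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxy = sum(a * b for a, b in zip(x, y))
    sxx = sum(a * a for a in x)
    syy = sum(b * b for b in y)
    numerator = n * sxy - sx * sy
    denominator = math.sqrt((n * sxx - sx * sx) * (n * syy - sy * sy))
    return numerator / denominator

# Hypothetical sample data
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
print(round(pearson_r(x, y), 4))  # → 0.7746
```

The result, about 0.77, would be read as a strong positive association under the interpretation given earlier.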
Y = a + bX
Where,
Y = Dependent variable
X = Independent variable
a = [(∑y)(∑x²) – (∑x)(∑xy)] / [n∑x² – (∑x)²]
b = [n∑xy – (∑x)(∑y)] / [n∑x² – (∑x)²]
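The intercept a and slope b of the least-squares line Y = a + bX can be computed from the same sums. A minimal Python sketch, again with made-up sample data (the slope uses the standard least-squares formula b = [n∑xy – (∑x)(∑y)] / [n∑x² – (∑x)²]):

```python
def fit_line(x, y):
    """Least-squares intercept a and slope b for Y = a + bX,
    computed from the sum formulas."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxy = sum(a * b for a, b in zip(x, y))
    sxx = sum(a * a for a in x)
    d = n * sxx - sx * sx          # common denominator n∑x² - (∑x)²
    a = (sy * sxx - sx * sxy) / d  # intercept
    b = (n * sxy - sx * sy) / d    # slope
    return a, b

# Hypothetical sample data
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
a, b = fit_line(x, y)
print(a, b)  # → 2.2 0.6
```

For this data the fitted line is Y = 2.2 + 0.6X; each unit increase in X predicts a 0.6 increase in Y.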
Regression Coefficient
In the linear regression line, the equation is given by:
Y = b0 + b1X
The observed data sets are given by (xᵢ, yᵢ), and x̄ and ȳ are the mean values of the respective variables.
There are two regression equations and, correspondingly, two regression coefficients:
Byx = r(σy/σx)
Bxy = r(σx/σy)
Where,
r = Correlation coefficient
σx = Standard deviation of x
σy = Standard deviation of y
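A useful consequence of these two formulas is that the product of the regression coefficients equals r². The small Python check below uses the standard-library statistics module and made-up sample data.

```python
import statistics

# Hypothetical sample data
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]

n = len(x)
mx, my = statistics.mean(x), statistics.mean(y)
sigma_x = statistics.pstdev(x)  # population standard deviation of x
sigma_y = statistics.pstdev(y)  # population standard deviation of y

# Population covariance, then r = cov / (σx·σy)
cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
r = cov / (sigma_x * sigma_y)

byx = r * sigma_y / sigma_x  # regression coefficient of y on x
bxy = r * sigma_x / sigma_y  # regression coefficient of x on y

# Multiplying the two formulas cancels the standard deviations: byx·bxy = r²
assert abs(byx * bxy - r * r) < 1e-12
print(round(byx, 2), round(bxy, 2))  # → 0.6 1.0
```

Note that byx here (0.6) matches the slope b of the line Y = a + bX for the same data, since both are the coefficient of y on x.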