How different will the R-squared of the linear regression y ~ x be from the square of cor(x, y)?


Generally, both of them can represent the linear relationship between x and y on a [0, 1] scale.
Are they very similar, say 99% of the time?










regression correlation generalized-linear-model similarities univariate






asked Jun 19 at 3:27 (edited Jun 20 at 3:22) – Tommy Yu











  • Not the same thing. See answers. – BruceET, Jun 19 at 5:41
2 Answers
Computationally, the R-sq in computer printouts is the square $r^2$ of the (Pearson) correlation $r.$ Sometimes $r^2$ is called the 'coefficient of determination'. Because $-1 \le r \le 1,$ we have $0 \le r^2 \le 1.$

Suppose we have the simple linear regression model
$$Y_i = \beta_0 + \beta_1 x_i + e_i,$$
for $i = 1, 2, \dots, n,$ where $e_i \stackrel{iid}{\sim} \mathsf{Norm}(0, \sigma_e).$

In regression, $r^2$ is sometimes referred to (intuitively) as the proportion of the variability in $Y$ that is explained by regression on $x.$ This interpretation roughly matches the equation
$$S_e^2 = \frac{n-1}{n-2}\,S_Y^2(1 - r^2),$$
where $S_Y^2$ is the sample variance of the $Y_i$ and $S_e^2$ is the sum of squared residuals divided by $n-2,$ sometimes referred to as the variance about the regression line.

Thus if $r = \pm 1,$ so that $r^2 = 1,$ then $S_e^2 = 0$ and all of the $(x_i, Y_i)$-points lie precisely on the regression line. Also, if $r \approx 0,$ then $S_e^2 \approx S_Y^2$ and the $x_i$ have no role to play in 'explaining' the $Y_i.$

Sometimes, the Pearson 'correlation coefficient' $r$ is said to express the 'linear component' of the association of $X_i$ and $Y_i.$






answered Jun 19 at 5:11 (edited Jun 19 at 5:50) – BruceET

  • @mkt. Thanks for fixing typo. – BruceET, Jun 19 at 5:59
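
To see the equivalence numerically, here is a quick check in R on simulated data (a minimal sketch; the variable names and simulation settings are illustrative): for a simple linear regression with an intercept, the R-squared reported by lm() equals cor(x, y)^2, and the residual variance matches the identity above.

    set.seed(2019)                     # reproducible simulated data
    n <- 50
    x <- runif(n, 0, 10)
    y <- 2 + 3 * x + rnorm(n, sd = 4)  # straight line plus Gaussian noise

    fit <- lm(y ~ x)

    # R-squared of the fit equals the squared Pearson correlation
    summary(fit)$r.squared
    cor(x, y)^2                        # same number

    # Residual variance S_e^2 = SSE/(n-2) matches ((n-1)/(n-2)) S_Y^2 (1 - r^2)
    sum(residuals(fit)^2) / (n - 2)
    (n - 1) / (n - 2) * var(y) * (1 - cor(x, y)^2)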


















R-squared, or the coefficient of determination, always takes a value between 0 and 1, whereas cor(x, y) is between -1 and 1.
For a multiple linear regression, R-squared is defined as the square of the correlation between the output (y) and the predicted values (f). In the special case of a simple linear regression with an intercept and one explanatory variable, R-squared turns out to be the same as the square of cor(x, y).
I hope this helps you.






answered Jun 19 at 5:03 (edited Jun 19 at 5:15) – Masoud Norouzi Darabad (new contributor)

  • Definitely they are not the same in general, but for a simple linear regression Y = ax + b (x a single variable, not a vector), the value of the coefficient of determination is equal to the square of the correlation between x and y. – Masoud Norouzi Darabad, Jun 19 at 5:19
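
The multiple-regression definition can be checked the same way (again a minimal sketch with made-up data): R-squared equals the squared correlation between y and the fitted values, which in general is not the squared correlation between y and any single predictor.

    set.seed(2019)
    n  <- 50
    x1 <- runif(n)
    x2 <- runif(n)
    y  <- 1 + 2 * x1 - 3 * x2 + rnorm(n, sd = 0.5)

    fit <- lm(y ~ x1 + x2)

    summary(fit)$r.squared    # R-squared of the multiple regression
    cor(y, fitted(fit))^2     # same number
    cor(x1, y)^2              # not the same in general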












