
How to include an interaction with a quadratic term? [closed]




I want to predict $y$ from $x_1$ and $x_2$. I suppose that $x_2$ has a quadratic effect on $y$ and that there is an interaction between $x_1$ and $x_2$. How should I model that?



I've looked at previous questions, but they seem to make different suggestions.



1. Include all possible effects separately (see model 2):



$y$ ~ $x_1 + x_2 + x_2^2 + x_1 : x_2 + x_1 : x_2^2$



2. Keep all the parts of the polynomial variable together:



$y$ ~ $x_1 + x_2 + x_2^2 + x_1 : (x_2 + x_2^2)$



I use R's formula notation, where $y$ ~ $x_1 + x_2 + x_1 : x_2$, for example, means there are two main effects, $x_1$ and $x_2$, and an interaction between them. In R there is no need to specify the intercept; it is estimated by default.
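Written out algebraically (the coefficient labels $\beta_0, \dots, \beta_5$ are added here for illustration and are not part of the original formulas), the full model with all of those terms is

$$y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \beta_3 x_2^2 + \beta_4 x_1 x_2 + \beta_5 x_1 x_2^2 + \varepsilon$$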













closed as off-topic by mkt, whuber May 22 at 18:30


This question appears to be off-topic. The users who voted to close gave this specific reason:


  • "This question appears to be off-topic because EITHER it is not about statistics, machine learning, data analysis, data mining, or data visualization, OR it focuses on programming, debugging, or performing routine operations within a statistical computing platform. If the latter, you could try the support links we maintain." – mkt, whuber
If this question can be reworded to fit the rules in the help center, please edit the question.






















      regression interaction quadratic-form






edited May 22 at 6:51 by ErKanns

















asked May 22 at 6:38 by ErKanns




1 Answer



















Both specifications describe the same formula: the models are equivalent, and only the R notation differs.



Here is an example with random data:

x1 <- rnorm(100)
x2 <- rnorm(100)
y  <- x1 + x2 + x2^2 + x1*x2 + rnorm(100)

# 1. Spell out each interaction term explicitly
fit1 <- lm(y ~ x1 + x2 + I(x2^2) + x1:x2 + x1:I(x2^2))

# 2. Group the polynomial terms inside the interaction
fit2 <- lm(y ~ x1 + x2 + I(x2^2) + x1:(x2 + I(x2^2)))

# 3. Same model again, writing the square as x2*x2 inside I()
#    (the summary shown below uses this spelling)
fit3 <- lm(y ~ x1 + x2 + I(x2*x2) + x1:(x2 + I(x2*x2)))


All three calls produce identical fits, in which x1 is interacted with both x2 and the squared version of x2; the summary output is:



Residuals:
     Min       1Q   Median       3Q      Max
-2.12678 -0.64983  0.03115  0.59760  2.26080

Coefficients:
              Estimate Std. Error t value Pr(>|t|)
(Intercept)   -0.11838    0.12757  -0.928    0.356
x1             0.95627    0.13901   6.879 6.61e-10 ***
x2             1.04394    0.09099  11.473  < 2e-16 ***
I(x2 * x2)     0.94417    0.06015  15.698  < 2e-16 ***
x1:x2          1.05098    0.12875   8.163 1.45e-12 ***
x1:I(x2 * x2)  0.05926    0.09656   0.614    0.541
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 1.003 on 94 degrees of freedom
Multiple R-squared: 0.8412, Adjusted R-squared: 0.8328
F-statistic: 99.59 on 5 and 94 DF, p-value: < 2.2e-16
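A quick way to verify the equivalence directly (a sketch, reusing the x1 and x2 vectors from above; model.matrix is base R) is to compare the design matrices the two formulas build:

```r
# Build the design matrix for each spelling of the formula
Xa <- model.matrix(~ x1 + x2 + I(x2^2) + x1:x2 + x1:I(x2^2))
Xb <- model.matrix(~ x1 + x2 + I(x2^2) + x1:(x2 + I(x2^2)))

# The columns are identical up to naming, so lm() must return identical fits
all.equal(Xa, Xb, check.attributes = FALSE)  # should return TRUE
```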
























answered May 22 at 8:07 by AlexK












