# F-test where the coefficients in the H_0 are nonzero
## F-test where the coefficients in the H_0 are nonzero

Hi,

I try to run the regression

    y = beta_0 + beta_1 x

and test H_0: (beta_0, beta_1) = (0, 1) against H_1: H_0 is false.

I believe I can run the regression

    (y - x) = beta_0 + beta_1' x

and do the regular F-test (using the lm function) where the hypothesized coefficients are all zero.

Is there any function in R that deals with the case where the coefficients are nonzero?

John

______________________________________________
[hidden email] mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
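The transformed regression described above can be sketched as follows, a minimal example with simulated data (the seed and variable names are illustrative, not from the thread). One caveat: summary()'s overall F-statistic tests only the slope against zero, so to impose the intercept restriction as well, compare against the fully restricted model with anova():

```r
set.seed(1)                  # simulated data, for illustration only
x <- rnorm(10)
y <- x + rnorm(10)

z <- y - x                   # transformed response: H_0 becomes (beta_0, beta_1') = (0, 0)
fit_full <- lm(z ~ x)        # unrestricted model
fit_null <- lm(z ~ 0)        # both coefficients restricted to zero
anova(fit_null, fit_full)    # joint F-test of H_0
```

The anova() comparison uses 2 numerator degrees of freedom, one per restriction, which the summary() F-test would not.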
## Re: F-test where the coefficients in the H_0 are nonzero

This should do it:

```r
> x <- rnorm(10)
> y <- x + rnorm(10)
> fit1 <- lm(y ~ x)
> fit2 <- lm(y ~ -1 + offset(0 + 1 * x))
> anova(fit2, fit1)
Analysis of Variance Table

Model 1: y ~ -1 + offset(0 + 1 * x)
Model 2: y ~ x
  Res.Df     RSS Df Sum of Sq      F Pr(>F)
1     10 10.6381
2      8  7.8096  2    2.8285 1.4487 0.2904
```

--
Peter Dalgaard, Professor, Center for Statistics, Copenhagen Business School
Solbjerg Plads 3, 2000 Frederiksberg, Denmark
## Re: F-test where the coefficients in the H_0 are nonzero

You can easily test linear restrictions using the function linearHypothesis() from the car package. There are several ways to set up the null hypothesis, but a straightforward one here is:

```r
> library(car)
> x <- rnorm(10)
> y <- x + rnorm(10)
> linearHypothesis(lm(y ~ x), c("(Intercept)=0", "x=1"))
Linear hypothesis test

Hypothesis:
(Intercept) = 0
x = 1

Model 1: restricted model
Model 2: y ~ x
  Res.Df     RSS Df Sum of Sq      F Pr(>F)
1     10 10.6218
2      8  9.0001  2    1.6217 0.7207 0.5155
```

Jan
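As a side note on the "several ways" of setting up the null: linearHypothesis() also accepts a hypothesis matrix and a right-hand-side vector, which is convenient when the restrictions are already written as L %*% beta = rhs. This is a sketch of the same test as the string form above (simulated data and names are illustrative; car must be installed):

```r
library(car)

set.seed(1)                              # simulated data, for illustration
x <- rnorm(10)
y <- x + rnorm(10)
fit <- lm(y ~ x)

L   <- diag(2)                           # identity: restrict both coefficients
rhs <- c(0, 1)                           # hypothesized values (beta_0, beta_1) = (0, 1)
linearHypothesis(fit, hypothesis.matrix = L, rhs = rhs)
```

The matrix form and the string form c("(Intercept)=0", "x=1") produce the same F-statistic.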
## Re: F-test where the coefficients in the H_0 are nonzero

Hi,

I try to run the same F-test by lm (with summary) and the function linearHypothesis() in the car package. Why are the results (p-values for the F-test) different?

```r
> df1 <- data.frame(x = c(2, 3, 4), y = c(7, 6, 8))
> lm1 <- lm(y ~ x, df1)
> summary(lm1)

Call:
lm(formula = y ~ x, data = df1)

Residuals:
   1    2    3
 0.5 -1.0  0.5

Coefficients:
            Estimate Std. Error t value Pr(>|t|)
(Intercept)    5.500      2.693   2.043    0.290
x              0.500      0.866   0.577    0.667

Residual standard error: 1.225 on 1 degrees of freedom
Multiple R-squared:  0.25, Adjusted R-squared:  -0.5
F-statistic: 0.3333 on 1 and 1 DF,  p-value: 0.6667

> linearHypothesis(lm1, c("(Intercept)=0", "x=0"))
Linear hypothesis test

Hypothesis:
(Intercept) = 0
x = 0

Model 1: restricted model
Model 2: y ~ x
  Res.Df   RSS Df Sum of Sq      F Pr(>F)
1      3 149.0
2      1   1.5  2     147.5 49.167 0.1003
```
## Re: F-test where the coefficients in the H_0 are nonzero

Hi:

The F-test from linearHypothesis() is a joint hypothesis test, while the t-statistics that come out of a regression summary are "conditional" in the sense that each tests the significance of one coefficient given the others, so you wouldn't expect the two outputs to be the same. Note also that summary()'s overall F-statistic tests only the slope coefficients against zero and leaves the intercept unrestricted, whereas your linearHypothesis() call restricts the intercept as well, which is why even the two F-tests disagree.
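To make the difference concrete with John's data: when the intercept is left out of the null hypothesis, linearHypothesis() reproduces summary()'s overall F-test exactly (a sketch assuming the car package is installed):

```r
library(car)

df1 <- data.frame(x = c(2, 3, 4), y = c(7, 6, 8))
lm1 <- lm(y ~ x, df1)

# Restricting the slope only reproduces summary()'s F = 0.3333, p = 0.6667 ...
linearHypothesis(lm1, "x=0")

# ... while also restricting the intercept gives F = 49.167, p = 0.1003
linearHypothesis(lm1, c("(Intercept)=0", "x=0"))
```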