Strange paradox


Strange paradox

CHATTON Anne (via R-help mailing list)
Hello,

I am currently analysing two nested models fitted to the same sample. Both the simpler model (Model 1 ~ x1 + x2) and the more complex model (Model 2 ~ x1 + x2 + x3 + x4) yield the same adjusted R-square. Yet the p-value associated with the deviance statistic is highly significant (p = 0.0047), suggesting that the confounders (x3 and x4) do contribute to the prediction of the dependent variable.

Does anyone have an explanation of this strange paradox?

Thank you for any suggestion.

Anne


Re: Strange paradox

Bert Gunter
This list is about R programming. Statistics questions, like this one, are
generally off topic here. Try posting on a statistics list such as
stats.stackexchange.com instead.

Cheers,
Bert

Bert Gunter

"The trouble with having an open mind is that people keep coming along and
sticking things into it."
-- Opus (aka Berkeley Breathed in his "Bloom County" comic strip)





Re: Strange paradox

Michael Friendly
In reply to this post by CHATTON Anne
Yes, there's no paradox: the adjusted R^2 and the deviance test are
assessing different things.

Also, you don't say *what* deviance you are looking at, but your
interpretation of it is probably wrong. A significant result from
anova(model2, model1) says that x3 and x4 add significantly to prediction,
over and above x1 and x2.
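
As a quick illustration (simulated data, not yours), a small but real
contribution of x3 and x4 can be highly significant in the nested-model
F test while the adjusted R^2 looks identical at the precision usually
printed:

# simulated illustration only: large n, tiny true effects of x3 and x4
set.seed(1)
n  <- 20000
x1 <- rnorm(n); x2 <- rnorm(n); x3 <- rnorm(n); x4 <- rnorm(n)
y  <- 1 + 0.5*x1 + 0.5*x2 + 0.03*x3 + 0.03*x4 + rnorm(n)
m1 <- lm(y ~ x1 + x2)
m2 <- lm(y ~ x1 + x2 + x3 + x4)
round(c(summary(m1)$adj.r.squared, summary(m2)$adj.r.squared), 2)  # same to 2 dp
anova(m1, m2)   # F test of H0: coefficients of x3 and x4 are both zero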


--
Michael Friendly     Email: friendly AT yorku DOT ca
Professor, Psychology Dept. & Chair, ASA Statistical Graphics Section
York University      Voice: 416 736-2100 x66249 Fax: 416 736-5814
4700 Keele Street    Web:   http://www.datavis.ca
Toronto, ONT  M3J 1P3 CANADA


Re: Strange paradox

CHATTON Anne (via R-help mailing list)
In reply to this post by CHATTON Anne
Dear all,
Thank you for your remarks.
The data under analysis were multiply imputed with the mice package.
To compare the nested models, I used the following R code from van Buuren's mice:
pool.compare(Model2, Model1, method = c("wald"), data = NULL)
As far as I know, the Wald statistic tests the null hypothesis that the extra parameters are all zero, but I might be wrong...
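
For what it's worth, here is a self-contained sketch of the workflow, using
the nhanes2 data that ships with mice in place of my own (the variables and
formulas below are placeholders, not my real models):

library(mice)
imp  <- mice(nhanes2, m = 5, seed = 1, printFlag = FALSE)  # multiple imputation
fit1 <- with(imp, lm(bmi ~ age))                           # simpler model
fit2 <- with(imp, lm(bmi ~ age + hyp + chl))               # model with extra terms
# Wald test that the extra coefficients are jointly zero
pool.compare(fit2, fit1, method = c("wald"), data = NULL)$pvalue
# (newer versions of mice provide D1(fit2, fit1) as the replacement for pool.compare)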


______________________________________________
[hidden email] mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.