glmmADMB: Generalized Linear Mixed Models using AD Model Builder


glmmADMB: Generalized Linear Mixed Models using AD Model Builder

Hans Skaug

Dear R-users,

Half a year ago we put out the R package "glmmADMB" for fitting
overdispersed count data.

http://otter-rsch.com/admbre/examples/glmmadmb/glmmADMB.html

Several people who used this package have requested
additional features. We now have a new version ready.
The major new feature is that glmmADMB allows Bernoulli responses
with logistic and probit links. In addition there is
a "ranef.glmm.admb()" function for getting the random effects.

The download site is still:

http://otter-rsch.com/admbre/examples/glmmadmb/glmmADMB.html

The package is based on the software ADMB-RE, but the full
unrestricted R-package is made freely available by Otter Research Ltd
and does not require ADMB-RE to run. Versions for Linux and Windows
are available.

We are still happy to get feedback from users, and suggestions
for improvement.

We have set up a forum at http://www.otter-rsch.ca/phpbb/ for discussions
about the software.

Regards,

Hans

_____________________________
Hans Julius Skaug

Department of Mathematics
University of Bergen
Johannes Brunsgate 12
5008 Bergen
Norway
ph. (+47) 55 58 48 61

______________________________________________
[hidden email] mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html

Re: glmmADMB: Generalized Linear Mixed Models using AD Model Builder

Roel de Jong
Dear R-users,

Because lme(r) & glmmPQL, which are based on penalized quasi-likelihood,
are not very robust with Bernoulli responses, I wanted to test glmmADMB.
I ran the following simulation study:

500 samples are drawn with the model specification:
y = (intercept*f1+pred2*f2+pred3*f3)+(intercept*ri+pred2*rs)
     where pred2 and pred3 are predictors distributed N(0,1)
     f1..f3 are fixed effects, f1=-1, f2=1.5, f3=0.5
     ri is random intercept with associated variance var_ri=0.2
     rs is random slope with associated variance var_rs=0.4
     the covariance between ri and rs "covr"=0.1

1500 units/dataset, class size=30

convergence:
~~~~~~~~~~~~
No crashes.
5/500 datasets had on exit a gradient of the log-likelihood > 0.001,
though. Removing the datasets with questionable convergence doesn't seem
to affect the simulation analysis.

bias:
~~~~~~
f1=-1.00531376
f2= 1.49891060
f3= 0.50211520
ri= 0.20075947
covr=0.09886267
rs= 0.38948382

Only the random slope "rs" is somewhat low, but I don't think the
difference is significant.

coverage alpha=.95: (using asymmetric confidence intervals)
~~~~~~~~~~~~~~~~~~~~~~~~
f1=0.950
f2=0.950
f3=0.966
ri=0.974
covr=0.970
rs=0.970

While some coverages are somewhat high, confidence intervals based on
asymptotic theory will not have exactly the nominal coverage level;
this can be corrected for by simulation (parametric bootstrap).

I can highly recommend this excellent package to anyone fitting these
kinds of models, and want to thank Hans Skaug & Dave Fournier for their
hard work!

Roel de Jong.



Re: glmmADMB: Generalized Linear Mixed Models using AD Model Builder

Douglas Bates
On 12/15/05, Roel de Jong <[hidden email]> wrote:
> Dear R-users,
>
> because lme(r) & glmmpql, which are based on Penalized Quasi Likelihood,
> are not very robust with Bernoulli responses,

The current version of lmer takes method = "PQL" (the default),
"Laplace", or "AGQ", although AGQ is not available for vector-valued
random effects in that version, so one must be content with "PQL" or
"Laplace".

> I wanted to test glmmADMB.  I run the following simulation study:

>
> 500 samples are drawn with the model specification:
> y = (intercept*f1+pred2*f2+pred3*f3)+(intercept*ri+pred2*rs)
>      where pred2 and pred3 are predictors distributed N(0,1)
>      f1..f3 are fixed effects, f1=-1, f2=1.5, f3=0.5
>      ri is random intercept with associated variance var_ri=0.2
>      rs is random slope with associated variance var_rs=0.4
>      the covariance between ri and rs "covr"=0.1
>
> 1500 units/dataset, class size=30

Could you make the datasets, or the code that generates them,
available?  My code for such a simulation would be

genGLMM <- function(nobs, gsiz, fxd, Sigma, linkinv = binomial()$linkinv)
{
    ngrp <- nobs/gsiz
    ## per-group random effects; rows have covariance Sigma
    ranef <- matrix(rnorm(ngrp * ncol(Sigma)), nrow = ngrp) %*% chol(Sigma)
    pred2 <- rnorm(nobs)
    pred3 <- rnorm(nobs)
    mm <- model.matrix(~pred2 + pred3)  # fixed-effects model matrix
    rmm <- model.matrix(~pred2)         # random-effects model matrix
    grp <- gl(n = ngrp, k = gsiz, length = nobs)
                                        # linear predictor
    lp <- as.vector(mm %*% fxd + rowSums(rmm * ranef[grp,]))
    resp <- as.integer(runif(nobs) < linkinv(lp))
    data.frame(resp = resp, pred2 = pred2, pred3 = pred3, grp = grp)
}

Running this function gives
> nobs <- 1500
> gsiz <- 30
> fxd <- c(-1, 1.5, 0.5)
> Sigma <- matrix(c(0.2, 0.1, 0.1, 0.4), nc = 2)
> set.seed(123454321)
> sim1 <- genGLMM(nobs, gsiz, fxd, Sigma)
> (fm1 <- lmer(resp ~ pred2 + pred3 + (pred2|grp), sim1, binomial))
Generalized linear mixed model fit using PQL
Formula: resp ~ pred2 + pred3 + (pred2 | grp)
   Data: sim1
 Family: binomial(logit link)
      AIC      BIC    logLik deviance
 1403.522 1440.714 -694.7609 1389.522
Random effects:
 Groups Name        Variance Std.Dev. Corr
 grp    (Intercept) 0.44672  0.66837
        pred2       0.55629  0.74585  0.070
# of obs: 1500, groups: grp, 50

Estimated scale (compare to 1)  0.9032712

Fixed effects:
             Estimate Std. Error z value  Pr(>|z|)
(Intercept) -1.081710   0.121640 -8.8927 < 2.2e-16
pred2        1.607273   0.141697 11.3430 < 2.2e-16
pred3        0.531071   0.072643  7.3107 2.657e-13
> system.time(fm1 <- lmer(resp ~ pred2 + pred3 + (pred2|grp), sim1, binomial))
[1] 0.33 0.00 0.33 0.00 0.00
> (fm2 <- lmer(resp ~ pred2 + pred3 + (pred2|grp), sim1, binomial, method = "Laplace"))
Generalized linear mixed model fit using Laplace
Formula: resp ~ pred2 + pred3 + (pred2 | grp)
   Data: sim1
 Family: binomial(logit link)
      AIC      BIC    logLik deviance
 1401.396 1438.588 -693.6979 1387.396
Random effects:
 Groups Name        Variance Std.Dev. Corr
 grp    (Intercept) 0.35248  0.59370
        pred2       0.46641  0.68294  0.077
# of obs: 1500, groups: grp, 50

Estimated scale (compare to 1)  0.9854841

Fixed effects:
             Estimate Std. Error z value  Pr(>|z|)
(Intercept) -1.119008   0.121640 -9.1993 < 2.2e-16
pred2        1.680916   0.141697 11.8627 < 2.2e-16
pred3        0.543548   0.072643  7.4825 7.293e-14
> system.time(fm2 <- lmer(resp ~ pred2 + pred3 + (pred2|grp), sim1, binomial, method = "Laplace"))
[1] 4.62 0.01 4.65 0.00 0.00

Fitting that model using glmmADMB gives
> (fm3 <- glmm.admb(resp ~ pred2 + pred3, ~ pred2, "grp", sim1, "binomial", "logit", "full"))
...
iteration output omitted
...

GLMM's in R powered by AD Model Builder:

  Family: binomial

Fixed effects:
  Log-likelihood: -602.035
  Formula: resp ~ pred2 + pred3
(Intercept)       pred2       pred3
   -1.11990     1.69030     0.54619

Random effects:
  Grouping factor: grp
  Formula: ~pred2
Structure: General positive-definite
               StdDev      Corr
(Intercept) 0.5890755
pred2       0.6712377 0.1023698

Number of Observations: 1500
Number of Groups: 50

The "Laplace" method in lmer and the default method in glmm.admb,
which according to the documentation is the Laplace approximation,
produce essentially the same model fit.  One difference is the
reported value of the log-likelihood, which we should cross-check;
another is the execution time:

> system.time(fm3 <- glmm.admb(resp ~ pred2 + pred3, ~ pred2, "grp", sim1, "binomial", "logit", "full"))
...
Iteration output omitted
...
[1]  0.23  0.02 21.44 19.45  0.24

Fitting this model takes about 4.7 seconds with the Laplace
approximation in lmer (and only 0.33 seconds for PQL, which is not
that far off) and about 20 seconds in glmm.admb.
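Both fitters rest on the same device: the Laplace approximation replaces the
integral over each random effect with a Gaussian integral around the mode of
the integrand. A numeric sketch on a toy one-dimensional Bernoulli-type
integrand (my own illustration, not either package's internals):

```r
## Laplace approximation:  integral of exp(-h(u)) du
##   ~  exp(-h(u0)) * sqrt(2*pi / h''(u0)),  with u0 = argmin h.
## Toy integrand: h(u) = u^2/2 + log(1 + exp(u)).
h  <- function(u) u^2/2 + log1p(exp(u))
h2 <- function(u) 1 + exp(u)/(1 + exp(u))^2   # h''(u), derived by hand
u0 <- optimize(h, c(-10, 10))$minimum
laplace <- exp(-h(u0)) * sqrt(2*pi/h2(u0))
## by symmetry the exact value is sqrt(2*pi)/2; integrate() confirms it
exact   <- integrate(function(u) exp(-h(u)), -Inf, Inf)$value
c(laplace = laplace, exact = exact)   # agree to within about 1%
```

The approximation error here is under one percent; the two fitters differ
mainly in how they locate the mode and evaluate the curvature, not in the
approximation itself.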



> convergence:
> ~~~~~~~~~~~~
> No crashes.
> 5/500 Datasets had on exit a gradient of the log-likelihood > 0.001
> though. Removing the datasets with questionable convergence doesn't seem
> to effect the simulation analysis.
>
> bias:
> ~~~~~~
> f1=-1.00531376
> f2= 1.49891060
> f3= 0.50211520
> ri= 0.20075947
> covr=0.09886267
> rs= 0.38948382
>
> Only the random slope "rs" is somewhat low, but i don't think it is of
> significance
>
> coverage alpha=.95: (using asymmetric confidence intervals)
> ~~~~~~~~~~~~~~~~~~~~~~~~
> f1=0.950
> f2=0.950
> f3=0.966
> ri=0.974
> covr=0.970
> rs=0.970
>
> While some coverages are somewhat high, confidence intervals based on
> asymptotic theory will not have exactly the nominal coverage level, but
> with simulations (parametric bootstrap) that can be corrected for.
>
> I can highly recommend this excellent package to anyone fitting these
> kinds of models, and want to thank Hans Skaug & Dave Fournier for their
> hard work!

I agree.  I am particularly pleased that Otter Research allows access
to a Linux executable of their code (although I would, naturally,
prefer the code to be Open Source).


Re: glmmADMB: Generalized Linear Mixed Models using AD Model Builder

Roel de Jong
Dear Professor Bates,

thank you for your reply. To make sure that no errors occurred in the
data-generation process, I used the elegant function you provided to
generate a couple of datasets under the model specification given
earlier. Running lmer with a Laplace approximation to the
high-dimensional integral in the likelihood gives me a warning and then
this show-stopper:

Warning: IRLS iterations for PQL did not converge
Error in objective(.par, ...) : Unable to invert singular factor of
downdated X'X

Fitting the dataset with glmmADMB gives no apparent problems and
reasonable estimates. I attached the particular dataset to the email.

The difference in computation time can be attributed to the fact that
glmmADMB uses a generic technique called automatic differentiation
together with the Laplace approximation. The same technique can be
employed to fit much more complex nonlinear models, but I'm sure Hans &
Dave can tell you more about it.
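Automatic differentiation propagates exact derivatives through a computation
instead of using finite differences or hand-coded gradients. A tiny
forward-mode sketch using dual numbers (illustrative only; ADMB's machinery
is far more sophisticated and reverse-mode):

```r
## A dual number carries a value and its derivative ("dot") together.
dual  <- function(val, dot) list(val = val, dot = dot)
## Product rule and chain rule, encoded per operation:
d_mul <- function(a, b) dual(a$val * b$val, a$val * b$dot + a$dot * b$val)
d_exp <- function(a)    dual(exp(a$val),    exp(a$val) * a$dot)

## Derivative of f(x) = x * exp(x) at x = 1; analytically f'(x) = (1+x)e^x.
x  <- dual(1, 1)               # seed: dx/dx = 1
fx <- d_mul(x, d_exp(x))
fx$dot                         # 2 * exp(1) = 5.43656..., exact to rounding
```

Because every elementary operation carries its derivative along, the gradient
comes out exact (to floating point), which is what makes Laplace-based fits
of complex nonlinear models feasible.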

Best regards,
        Roel de Jong



dep pred2 pred3 grp
0 -0.832299769325581 -1.67880317415923 1
0 -0.308299055464121 -0.364611545217099 1
1 0.630598386837566 -0.830616438489193 1
1 3.10913858356640 -0.521767356153283 1
1 0.903471812881371 1.91484272352174 1
0 0.90426769897266 -0.513292891585524 1
1 1.14495484917281 0.78473139460995 1
0 -0.234525952851387 -0.50697276499942 1
1 0.631041223657181 0.0885911624846033 1
0 -1.28176856712309 0.381144997887118 1
0 -0.199575490516357 -0.816864500080217 1
0 1.1185927379226 -1.89658827063818 1
0 -0.166954429106639 -1.72768529536314 1
0 0.410078885348753 -2.37013110454109 1
0 0.37016590773711 -0.320034871756389 1
0 -1.29319553111599 -0.243436474810949 1
1 1.44188541761604 0.46171661294066 1
1 1.03439142621889 -0.0934696125911877 1
1 0.941957044041486 1.46031495794676 1
0 0.63243411925538 -0.376900200288554 1
0 0.782797578938882 -2.17296669618680 1
0 -0.0434466627421846 -0.897181210612278 1
0 -0.918950768280257 1.40593647574051 1
0 -0.266797680590651 1.32852283384323 1
0 -1.97795808769759 -0.340096532983713 1
0 0.338662524103419 0.317005354534584 1
1 -0.778330737845841 1.83714022902796 1
1 0.167661592741467 -0.898198193914343 1
0 0.108266386948271 -1.31922947083775 1
1 0.799587591780234 1.20654772520622 1
0 -1.39792357224312 -1.17382560290430 2
1 1.41625128975700 0.0321732274019762 2
1 -0.0781989965585817 1.66218225184847 2
0 -1.57620982406135 0.122918359084620 2
0 -0.169615551813902 -1.32992468631561 2
0 0.0744278240167874 -0.793642608076887 2
1 1.33756270180946 1.43265155843536 2
0 0.171687202226864 -0.197474664007768 2
0 -0.253842074055872 0.422229019624458 2
1 1.51037367919741 0.708493875055813 2
0 0.126257617278352 -0.698370297940141 2
0 -1.41052021563966 -0.580830471211872 2
0 -0.740991848275336 -0.831723737484574 2
0 -1.38752149645338 -0.0645813982998873 2
0 -1.48202337888827 -0.810824176493156 2
0 0.434323217159074 -0.235959885057200 2
0 -0.3660190454198 -0.207497034351587 2
0 -1.55757318106926 -0.649666741994972 2
0 0.200298815968865 -1.25786648414456 2
1 0.60068195996731 1.39546043864496 2
1 0.92181552285589 0.169900540315928 2
1 0.963639582191724 -0.96652696802166 2
0 -1.10894528405182 -0.826447468782818 2
1 -0.143705873984295 0.85696358848945 2
1 -0.227197073907219 0.163108840069261 2
0 0.260778722736436 0.702350481108482 2
1 0.0079839656810117 0.640974302851168 2
1 0.520335955317318 0.82948476190548 2
0 -0.470053854425421 -1.41586935932929 2
0 0.318269807834773 0.221705522455281 2
0 -0.490245326301728 0.553620570804301 3
0 -0.276149978720162 -0.88110452374587 3
0 0.220721777999604 -0.653256457257882 3
0 -0.703075262079484 0.329683608971329 3
0 -0.22696243906353 -2.61950724697562 3
0 -1.09431005668124 -0.0232909455712828 3
1 1.28711954899881 0.344320685687299 3
0 -0.151754090955171 -0.0744843243935522 3
0 0.341919519960378 -0.0964112795642981 3
0 -1.12551027730654 -0.84933031360325 3
0 -0.83446590590987 -0.158563062623380 3
0 -0.519454023512489 -1.24574306199025 3
0 0.273951811480161 -0.738838977465222 3
1 0.644417981459438 0.48825723724319 3
0 0.80733294439454 0.38447235881228 3
0 -1.30801233701682 1.32830707446622 3
0 -0.691273996772228 -1.03695064187955 3
0 -0.55275315135725 0.712670673357253 3
1 1.18912706586104 -0.797779680340233 3
0 0.173813919484423 -0.212804192064869 3
0 0.510051214308946 0.407985297642093 3
0 0.0460404954363198 0.0781827813302751 3
0 -0.812726427004377 -0.961698454122676 3
0 0.316803703625498 0.436795213783597 3
0 1.59372338969642 -0.646972799801143 3
0 -0.264479048468343 0.436583419258436 3
1 1.79785791935398 -2.19022854850479 3
0 -1.25542914249998 0.759841939131259 3
0 -0.556703265843941 -0.596140993896742 3
0 -0.138859132587877 0.353420559076652 3
0 -0.54936814739631 0.5651403571148 4
0 0.0865897190005032 -1.28701512066411 4
0 -0.59359121771434 0.137688285767041 4
0 1.29103265636505 -1.20664947663906 4
0 0.593107076808349 0.398984411079726 4
0 -0.226418280356368 0.252338568472061 4
0 -0.622343771507656 -0.150521153702500 4
0 -0.430960330395192 -0.61365230036041 4
0 -0.897917002717756 1.24600197335951 4
0 -0.131652665221839 -1.14902442133866 4
1 1.05932918053364 -0.208043968099735 4
1 2.21374179592236 -2.11459091350310 4
0 -0.0984023158102187 0.620618589465858 4
0 -0.301419555524483 -1.51777513767374 4
0 0.705447946370506 0.535510970861683 4
0 -0.740935396882992 -0.554365880210576 4
1 1.05185254417717 1.44454045163646 4
0 -0.976862868950618 -0.28295698647597 4
0 -1.81719506639831 -0.214322369708696 4
0 -0.249925818554222 0.183451295261106 4
0 -2.21228328895419 0.123877231369233 4
1 1.38439754243109 0.505675670636377 4
0 -0.837221588634301 -1.50135483614086 4
0 -1.17154448913943 -0.708764217205316 4
1 1.50155644902162 0.43999620318975 4
0 -0.516075651409885 1.10156000900777 4
0 -0.583328580107054 -0.136624407275739 4
0 -0.805969095827047 1.23242239143692 4
0 0.520037308192361 -0.0532028414333528 4
0 -1.07675357578869 -0.246422992512653 4
1 0.84674606454176 1.24706822832629 5
1 0.50225429700895 -0.753501502939139 5
0 -0.110696991251422 -1.28963013849050 5
1 1.45669586789922 -0.0266212621300831 5
0 1.19164829320594 -1.6384147982708 5
1 1.2497582567896 -0.366144380048512 5
0 -1.59200279904562 -0.605612412386137 5
1 0.701642320563013 -0.692590957893450 5
0 0.826729750207121 0.104040242709383 5
0 -0.251589578605641 -1.20008533845167 5
0 1.18981892198469 -1.03838218637413 5
0 -0.48226798541373 -0.103516021510253 5
1 0.902430923145418 -1.25114657112499 5
0 0.480518605880176 0.223857697801260 5
0 -2.25438268634199 1.21126580568773 5
0 -0.383234731665990 -0.766293999344111 5
0 0.581319800323777 0.817394753080253 5
0 -3.3165427929598 -0.121062111785958 5
0 -0.1322580996873 -0.150578017302552 5
1 1.53193807897104 0.733939342102887 5
1 1.11908937603262 0.196818601090142 5
0 0.437875954273944 -0.0197136588508132 5
0 -0.0331688915453021 0.33400847926927 5
1 1.34543708556248 0.00907648260071153 5
0 0.213075347139783 -0.646249613641689 5
0 -0.69524684216634 -2.33922277678765 5
0 -1.57279905654483 -1.87538951834327 5
1 1.07936855487838 1.18446965022099 5
1 0.306985931587578 0.724405953653822 5
1 1.23181672693378 -0.0701925304204026 5
0 -0.919888911082209 0.477276404654286 6
0 -0.663571170286572 -1.46630908083576 6
0 -0.724039231476495 -0.061019673195811 6
0 0.0634460402565889 -1.08461362641975 6
1 -0.269323637008873 0.731446475758064 6
1 -0.146752286719258 1.51369083876776 6
0 -0.35095531255735 -0.234620941343001 6
0 0.273206507398862 -0.393903382508312 6
1 -0.182666577034182 1.83249929313715 6
0 0.681588488289945 -0.375859982611083 6
0 0.242434133831772 -1.64320325331244 6
1 0.188017184214761 -0.228076562308703 6
1 0.166649765922797 0.511793542958201 6
1 2.16402504068409 1.66375788446841 6
1 0.586097780802391 -0.289750400216476 6
0 -0.757716234538208 -0.356355379036605 6
0 -1.11703624150270 -1.02103411104842 6
1 1.32065912451240 -0.529344207548823 6
0 -0.563323776359257 -0.575116970325914 6
0 0.0761081554417085 0.466755576888024 6
0 -0.0393307073575922 -2.09949197028072 6
0 -1.45391429237201 -1.78607291521398 6
0 0.180700026079273 0.785140819957987 6
1 0.842204310528479 -0.078203655202921 6
0 0.957483010083191 0.0933486661821926 6
0 -1.34841123138060 -1.25281641699431 6
0 0.460944146864481 -0.516097804035054 6
1 1.67680834988551 -0.7539974457697 6
1 0.204256257528345 -0.389312192486669 6
1 0.546750712218541 3.06217530791803 6
0 0.0172078969431791 -0.101707833219385 7
0 -1.37010127340567 -1.02691608953111 7
0 2.56322137123985 -0.773240110395935 7
0 -0.507174946504427 0.040499359830619 7
1 1.47453663326518 0.394613501208432 7
1 0.566343961852793 -0.358372221352661 7
0 1.07987945451752 -1.26905542641279 7
0 0.48707001768777 -0.832979997875164 7
0 -2.55931246086968 -0.0387626614737304 7
0 0.6047256807427 -1.27952621228527 7
0 -0.800975639650827 -1.19892253215496 7
0 0.716445890243245 1.86048748902398 7
0 2.57842613918261 1.11040509258097 7
0 0.161390247713138 -1.25216129083268 7
0 0.692254615374798 -0.273053701220116 7
0 0.580742663805351 -0.440022666876242 7
0 0.0333740430940438 -1.29808134689932 7
0 0.429181268475771 0.891622761392008 7
0 -0.957211975299114 1.74875584749294 7
0 -0.320315263904051 1.38442720841526 7
0 -0.518641939927687 -0.617700091693557 7
0 -1.20846935175166 -1.08696366448480 7
0 0.409554570285672 0.144434888179345 7
0 -0.515552017873684 -0.070428657566001 7
0 -0.462212101043838 -1.46343847177255 7
0 -0.00462842110860983 0.86885032383084 7
0 1.11215966563818 -2.19016842842290 7
0 -0.095943103004817 -0.265688534909634 7
0 -0.813739915796608 1.04977208376393 7
1 -0.975574120042433 -0.478434046891119 7
0 -1.12185075962718 -0.462242205693225 8
1 -0.334687794852943 -1.01829805236460 8
1 1.38611021083734 0.503281001944003 8
1 0.920372355987816 0.305833067019153 8
0 -0.884970870283891 1.960905790486 8
1 1.28985617901913 -0.00149956005647031 8
0 0.0778815214946348 -0.517956556797686 8
0 -2.06184563024002 0.0280155538025018 8
0 -0.317141416850045 0.314623807401671 8
0 -1.49653749498563 1.06981480191435 8
0 -2.43843441106820 -0.239240224091172 8
0 -0.0151669309041355 -2.14961221758577 8
0 -0.87343341371374 0.0380076279097493 8
0 -1.5467719747319 -0.491996493369805 8
1 0.902122956206809 -0.892149687265836 8
0 -0.376632917446526 1.42669823620577 8
0 0.486194125509503 0.807454372286888 8
0 0.916156290278285 1.54950897817856 8
1 2.13593485939650 0.0531118603524952 8
1 1.42158974742330 3.83829659950465 8
0 -0.122253419813331 -0.0745502942393836 8
0 0.157383928573792 -1.14170627323008 8
0 -0.978439574811127 1.12166052946801 8
0 -0.985687025250555 0.3774721170096 8
1 0.713852538831987 1.43188231151794 8
0 -0.649829588740366 0.0584520738567654 8
0 0.193563052726598 0.512951309429883 8
0 -0.538005837648731 -0.112951562688504 8
0 -0.788831478128212 0.462358950603167 8
0 0.374094832725565 -1.32991485627294 8
1 0.352241229145249 0.562250011286888 9
0 -1.05215631778539 -1.74114055399709 9
0 -0.522589687963368 0.328414193078884 9
1 1.91980455298025 1.28117152841264 9
0 -1.19698337175958 0.137231274598480 9
0 -0.342957737575631 0.550318780019834 9
0 -1.00877030558756 0.742813587739608 9
0 -0.609052194629098 0.0570850107105122 9
0 0.94981286699914 -0.268572238949552 9
0 -0.159282637164353 0.75336912846542 9
0 2.00752437715434 -0.74125296307598 9
1 0.979699978359859 1.30618729994550 9
0 0.380482418123161 0.490924714141324 9
0 -0.601233831152873 0.543315677225777 9
0 -0.7013515108379 -0.664568420969133 9
0 -0.652526671596726 0.0763605255068936 9
0 -0.822704879841393 -0.647393531285262 9
0 -1.49663242578037 -0.0679862440883276 9
0 -1.58358195089004 -0.60109179531163 9
0 1.04029732398240 0.440759719452625 9
0 -1.63031032685937 0.118466998096396 9
0 -0.926613746016175 1.11079689678127 9
0 -1.53293552500695 -0.305256752345908 9
0 -0.705690506192276 1.08149201424252 9
1 0.462378929758432 -0.0509168140691024 9
0 -1.90504209857238 0.753341468876266 9
0 0.76992733862465 0.165601577046371 9
0 0.591381979204793 0.526637789236985 9
0 -1.21927156230053 -0.532305671819723 9
0 -0.547777324059503 0.585547156621238 9
0 -0.504082236181847 -1.82712629269852 10
1 0.396718780807625 -0.949145702669212 10
0 -1.29347946800091 2.25390297555326 10
0 -0.92177283317849 1.20178868498790 10
1 0.0805408037321608 0.841421293757345 10
0 0.208418005551224 0.380361636047264 10
0 -0.169621316747766 -0.255571127277655 10
0 -1.57231498788762 -0.295710119893565 10
0 0.513696015792655 -1.15959002434111 10
1 -0.538005142273579 0.777004708470148 10
0 -1.52514057077904 -0.504424570764622 10
0 -0.229493607284984 -0.779790275696481 10
0 -0.688477820707275 -1.65923613111410 10
1 1.35856477329391 -0.490351154011548 10
0 -0.156829706643682 -1.74506179751922 10
0 -0.0915368852979517 -0.257682874632584 10
1 -0.159644965402912 1.60417200435474 10
0 0.431404070322245 0.926605578045528 10
0 -0.882911607601614 -0.664639118573361 10
0 -0.0877961616974039 -2.07842472741901 10
0 -0.472365250637831 -1.09511220945447 10
0 0.388010569092914 0.351728051485935 10
0 0.228837083145049 -0.59686825402983 10
0 -2.11177261975197 0.307994631985372 10
0 -1.23554891941374 0.187932225389357 10
0 0.308357088673597 -0.685775972043022 10
0 0.323667994678980 -1.17945433052522 10
0 0.82236677343589 0.0930524184567396 10
0 -1.87075872571938 1.52576876599872 10
1 1.20879406557371 0.982667684213814 10
1 1.10266430237447 -0.320608041598663 11
0 0.124645642862156 0.494333414412012 11
1 0.836517592979297 1.14909402609648 11
0 -0.282343098131595 0.297805556483382 11
1 1.61599702324105 1.09222833097785 11
0 -1.20341421057695 0.650533884934341 11
1 0.0946481020286628 0.521212933554754 11
1 0.882422290275422 -0.261422833668422 11
0 0.681521728725491 0.00701052727864493 11
0 -1.58848334980268 0.955021855312823 11
0 -0.893056007320415 0.157173227492396 11
0 -1.16552430650929 0.503855482599274 11
1 1.28308978229921 -0.973529944164855 11
1 1.24963838763221 0.308048131053688 11
0 0.104066561938495 0.145979074672098 11
0 -0.89503703002368 -0.200437879851354 11
0 -0.467187782901472 -2.72038815479212 11
0 -1.14453859255276 0.0980265767743773 11
0 -0.0721420049649455 2.49321804925209 11
1 1.38616314690945 -1.48254076858745 11
0 -0.0146719674001681 -1.04258270000465 11
1 1.28890941786419 -0.00286067734382689 11
1 0.90825648498634 -0.738268770655777 11
0 -1.80853080119914 1.50828705943752 11
0 -0.500484614405076 -0.738989202286565 11
0 -0.552120601390749 1.1041088509462 11
0 0.843117475192254 -1.66458211480930 11
0 0.662344349656285 -0.777963492121107 11
0 -0.730188862636098 -2.1901241955952 11
0 -2.81363998003226 1.84022504045505 11
1 0.086044702070397 -0.702911739825897 12
1 1.34433334068654 1.28360847092290 12
1 0.251891500435721 2.07226315228226 12
0 0.224336538270969 -1.25833324662926 12
0 -1.12104919373692 -0.370603314707898 12
0 -1.53435277630678 -1.86928693702925 12
0 0.258043756574939 -0.370503172092549 12
1 2.66873388954625 1.00910010564530 12
1 0.290434832291129 -0.445894218040203 12
0 -0.0248126816244334 -2.33109016208254 12
0 -1.9286352330831 -0.60077811530749 12
0 -0.524776852941538 -0.499404280825145 12
0 -1.34350233135942 0.0599143505370495 12
0 -0.24239202266593 -0.0510517657124707 12
0 -0.910992927906503 0.706094280197348 12
1 0.0570525647780904 0.147974668454952 12
0 -0.173658972763139 -0.852625338358202 12
0 -1.22411512428813 -1.79524639721137 12
0 -0.96448136408751 -0.95196349951678 12
0 -0.979923740080496 0.962221244422385 12
0 -1.13652473063222 -1.14414813458517 12
1 1.96070245044971 0.528136077206296 12
0 -0.429914670637842 -1.39770448586153 12
0 -0.50120969047023 -0.786752864438409 12
1 -0.384296256219949 0.284131841164239 12
0 0.0732403143415486 0.520525801944549 12
0 -0.0960834306966799 -1.25251467942447 12
0 0.326101171158566 -0.937858608825269 12
0 -0.153546577038208 -1.51774312540946 12
1 2.48254399845744 1.76009870352291 12
0 -1.09496617188961 -0.321533113760303 13
0 -1.94388033841476 -0.0646645935666742 13
0 -2.10153389611632 1.29489650397050 13
1 0.79520911966323 -0.825898692621387 13
0 0.235149187726147 -1.58914372172423 13
0 -1.70987588055187 -0.103424653306163 13
0 -0.95384554631588 -0.575554498397343 13
1 -0.687315953753797 1.01024352373680 13
1 0.355008734677528 0.441398246796502 13
0 -1.41665179766565 0.685525484292133 13
0 -2.59511445621622 1.30812141296419 13
1 0.99382685010476 0.530925415173598 13
1 1.21095581528523 1.1492527226314 13
0 -0.859338656435497 -0.278330867269102 13
1 0.989885029284654 0.204847891145183 13
0 -0.428376964475937 -1.05374240862369 13
1 2.32538597722996 0.483309993898259 13
0 0.000586205232193108 0.097868237471109 13
0 0.0293156982548206 0.390744400897392 13
0 -0.366986446209177 -1.05425108708075 13
0 0.0144160610262478 -0.0546706047568215 13
1 0.963354164489189 -1.27647970807208 13
1 1.24741195105527 -0.621070112995144 13
1 0.603062141924585 0.671980988224254 13
0 -2.53544500055687 -0.0357697092987977 13
0 -0.0305829435665775 -0.477149243195049 13
0 -1.46221033388881 0.465696423968454 13
1 -0.42006332069103 0.892011637000155 13
0 -1.66849233127782 -0.617397925075591 13
0 -0.204525463141302 1.648775087203 13
1 1.08754959411283 1.81137548918496 14
1 1.86769483259646 0.714398797896161 14
0 -0.792196219583052 1.10981546436624 14
0 1.01183601395461 -0.864958972053073 14
0 0.225914457742523 0.398032317929939 14
0 -1.39908780081124 -1.65593062552734 14
1 0.479554974225379 0.719458615682166 14
0 -1.02109341454516 0.732780457520495 14
0 -0.393571975684696 -0.105359765666466 14
0 -1.65252880027835 0.422266166528251 14
0 -1.25228547294298 1.31918823240716 14
0 -1.19236257905887 -0.544371393684021 14
0 -1.54677928084119 -0.694285017372465 14
1 3.29396675483193 -1.84017768897959 14
0 0.459292676079817 -0.89184624664535 14
0 0.237013943544054 0.831784508857591 14
0 -0.32906888841791 1.32305707593525 14
0 -1.40507540704093 -0.154737736725519 14
0 -0.0695631886811407 0.0574834579252943 14
0 -0.146789806215363 -0.496837997861411 14
0 -0.234111112254029 -1.16178475557145 14
0 1.8426640673097 -1.04827926361109 14
0 -2.17383616294947 0.0265757991485158 14
0 0.477648239850021 0.901562721051203 14
0 -1.20974221961819 2.08301942539896 14
0 -0.738211922224691 0.214381890098052 14
0 -0.00800890073131998 1.56416656981743 14
0 -0.264757994079075 0.0178629426084996 14
0 1.86171714000819 1.90817906700495 14
0 -0.558701018435082 0.733499720585578 14
0 -0.865886034384752 0.442196143971558 15
0 -0.406686748739648 -0.506437910563362 15
0 0.836614644092183 -0.807464084199793 15
0 0.531628175507993 -0.50522576703824 15
0 -0.263277300347516 -1.05343202815237 15
0 -0.297128609798502 0.779758407553139 15
0 -1.01563839345607 -0.912796060567065 15
0 -0.902205261011733 -1.45770787634579 15
0 -0.0696088696548177 -0.349921610618962 15
0 -1.22973548053849 1.32745126369630 15
0 0.709927148208838 -0.264148417754600 15
0 0.149479587127593 -0.172092566711625 15
0 -0.440386021515528 -0.786312572339187 15
0 -0.261495768893566 -0.490923188269731 15
1 -0.223842960366522 2.33293081944489 15
1 1.31804018183548 0.974818533906543 15
0 -0.544716513784916 -1.28610408829743 15
0 -0.67517764245657 0.593437280880769 15
0 -1.15934355470588 -0.727342934075602 15
0 -1.07445830974061 0.0385479137082245 15
0 -1.32470407068840 -1.66357961834509 15
1 1.14460637231997 0.430325405368745 15
0 0.56966201723791 -1.12233945216559 15
0 -0.352097081497585 1.30664211936628 15
0 -1.45495018908162 -1.59895407764635 15
0 -0.00487916906597153 -0.00576100676056441 15
1 1.52854755107812 0.31956999262741 15
1 1.99064490066435 0.60753889928368 15
1 1.48692883082317 0.15456937712884 15
0 -0.95357296084049 1.51705409774523 15
0 -1.55101128764011 0.921547455516857 16
1 0.77592323092101 0.283606793352603 16
0 1.06170110044901 -0.828690381401407 16
0 -0.440655887228902 0.322290026155759 16
0 -0.536565834414531 0.946084903085436 16
1 1.39145889073658 1.47400306558842 16
0 -1.47096001804087 -0.393842672461817 16
0 -0.318312999835945 -1.22210813566608 16
1 1.82659583997352 0.109776857289715 16
0 0.939561374986587 1.01022405183367 16
1 1.12037048838811 -1.64482931676532 16
0 -0.235636036224073 -0.824831281614542 16
0 -0.740755859680485 0.552373546306527 16
0 0.659520608107892 1.14965117102797 16
0 0.0790873282799428 -2.63109363604361 16
0 0.266646262521125 -1.49227222377525 16
0 -0.407593223899886 -1.19277426000872 16
0 -0.475306000569453 0.447544879616341 16
0 -1.98946280123186 0.54611902836258 16
1 0.796199085038531 -0.0694924974441781 16
0 -0.881382934564273 1.44061441781661 16
1 0.528849579192021 1.73982303765818 16
1 1.18073039407243 -0.657195930859719 16
1 0.363832702628949 0.441828986412673 16
0 -0.428524201050796 -0.273397106604899 16
0 -0.75858922746716 -0.0632613297740682 16
1 0.34044516248751 1.37794389035647 16
0 -0.777874389267938 0.321129947285757 16
0 0.183486160149468 0.572341591539134 16
0 0.445039712303777 1.45930436051826 16
0 -0.443045505403508 -0.67978500824133 17
0 0.135633951647771 1.07711556104910 17
0 -0.553147264610169 0.0902248428294424 17
0 0.00937570490320113 0.0611402777427867 17
0 0.372277854021795 0.100723407650452 17
0 0.519786977502631 -1.01114328990972 17
0 -0.452261764838862 -1.16534288837980 17
0 -1.12142625314041 -1.46626028361544 17
0 -0.723609158484627 0.608367913938929 17
0 0.0229811055501538 -0.88146617000202 17
0 -0.981913819359972 -1.16261322187429 17
0 0.14956375871895 0.601464082555944 17
0 0.495202041104176 0.494596362925564 17
0 -2.15893906475216 0.51611263031782 17
0 -0.435233435871235 1.21760849616707 17
0 -1.80861286709999 0.624349159503658 17
0 -0.550616068067364 -0.492782684735982 17
0 0.285429861178697 -0.748967934515054 17
0 0.277893240283312 1.18482584819827 17
0 -0.943764161353195 -1.12799349575985 17
0 -0.385491182309278 0.121532067565899 17
0 1.60032075916878 -0.00340680438257541 17
0 -0.0707277956468931 -0.400563639092005 17
0 -0.617853028991443 -0.372358957357196 17
0 -0.545574860540945 1.21452548777187 17
0 -0.481567962234248 0.66521989882239 17
0 -1.19866751394092 2.28846899210326 17
0 -1.29935925912260 -0.638802133023374 17
1 1.22037260961074 2.43835625950646 17
0 1.50369331821888 -1.39665739283585 17
0 -1.22321450344753 -0.587043370579141 18
0 -0.894318267768654 -0.71542772987144 18
1 1.2458532690666 0.288140324058812 18
1 -0.20191651523307 0.149615803728453 18
0 -0.495997205284986 0.181484805254462 18
0 -0.0871815183225838 -0.41341327049775 18
0 -2.05549895331074 2.7361264518136 18
0 -0.0521027896708521 1.10901391590386 18
1 0.690965759190466 1.00839048543978 18
0 -0.141283852035664 0.0850809476219265 18
0 0.516494433047075 0.174016547338858 18
0 -0.796597471281735 -0.110139119797216 18
0 -1.11911574121814 -0.445105128281765 18
0 -0.164920052062653 0.067023947557097 18
0 -0.493620073950104 -1.45673442629996 18
1 1.11260286764458 -0.866494406212042 18
1 1.40571603813951 -0.061741884798058 18
0 0.519830317779348 1.27642484444701 18
0 -3.56505147261245 0.364417505912863 18
1 0.972249311441788 0.591117399815239 18
0 -0.234929072129907 -0.236619185591319 18
1 0.762434829775483 0.908246741691203 18
0 -0.641057762179862 1.48562208927077 18
1 0.981004802159744 0.328771034632375 18
0 -0.308752619316994 0.590399515928942 18
0 -0.379712912300093 -0.242250153086643 18
0 -1.63850578029125 -0.876789862698695 18
0 -1.42963124916782 0.245931639440858 18
0 0.435333692249852 -0.740977140857627 18
0 0.538787886198466 0.651868354744412 18
1 0.737622709107121 1.87921309468464 19
0 -0.457220662399179 -1.24459861585519 19
1 0.652854613946687 1.6269278994429 19
0 -0.689756881823752 1.23510268500758 19
1 0.118064188461892 0.418906171149208 19
1 0.525988566323315 0.0963224899751922 19
0 -1.76705089688460 1.59338821194678 19
1 0.77509903884034 0.384494753734261 19
0 -0.847770741569922 0.491856983435367 19
1 0.658602035090962 2.24240121580213 19
0 0.692615905774285 -0.395491845732234 19
1 0.650459171784845 -0.661553419724731 19
1 0.170154793677856 1.78571468157438 19
0 -0.415332784938741 0.458987306723669 19
0 0.2958764925582 -1.12632212607051 19
0 0.505429928105289 0.480227118436985 19
0 0.259014472096903 -1.47319818498822 19
0 -0.78614739810647 0.329471262685779 19
0 0.992088533952774 -1.30758505809895 19
1 1.8950681441408 0.997521719663752 19
0 -0.616206305169458 -0.133821166217684 19
0 -1.02787640268475 -0.289957872614792 19
0 0.415403125502618 -0.525324650997477 19
0 -0.0224962921018651 -0.160842178133076 19
1 0.091731078701501 0.277585926758944 19
1 -0.318106244852093 -0.472657387946097 19
0 -1.19143741011813 2.39035805203590 19
0 0.375980170740284 0.606887460896474 19
1 1.38486444503054 1.53030323590866 19
1 0.298802272387582 0.59660159620639 19
0 -0.0736795250795902 0.991263114363719 20
0 0.118468643373590 0.778282877419433 20
1 2.09972826646574 1.20786320016429 20
0 -0.713590657470114 0.71137304623902 20
0 -0.357998810810069 -1.10283201718446 20
0 0.293969391674282 -0.363703662455449 20
1 2.33175089728254 0.407543462266131 20
0 -0.531583078070963 1.83792116585249 20
0 -0.533932479643344 -0.625744124313377 20
0 -1.20473382825124 -0.647108728549238 20
0 -0.675497650828518 1.06847808410627 20
0 -0.9188637626031 1.50455241715333 20
1 0.40700416178753 1.31094542830213 20
0 -1.25045224620201 0.371579654125546 20
0 -1.08375527779938 -0.383874276832662 20
0 0.627160932357288 -1.46762769214617 20
0 -0.485907675551947 0.984579702681566 20
1 0.414924433093339 1.16325991244109 20
0 -0.590592054300904 0.452959150743942 20
0 -1.48447103826185 -0.192968672110827 20
1 1.73799225870842 1.01171568778920 20
0 -0.272063725855103 0.613710258829988 20
0 -0.387519267290261 -0.504044647994148 20
0 0.0734476530726249 -1.06468348951658 20
1 -0.838739768861018 1.88795968278158 20
0 -0.560718673860578 3.10575065226196 20
0 0.0661186404165914 1.07997014020863 20
0 -0.948739956652519 0.336699605183419 20
0 -3.28867294951223 0.211035777327718 20
0 -1.06563210355731 -1.04739483686403 20
0 -0.197957150889235 1.02548020590552 21
0 0.163439242268860 0.990796149898193 21
0 -0.873951192487593 -1.60987838555758 21
0 0.305277955328451 0.284879486938119 21
1 2.17596390363011 0.750389368615824 21
0 -0.8517990015261 1.01161870563604 21
0 -1.05219161387871 -0.462445623650634 21
0 -0.703594411923324 1.01575057291761 21
0 -2.03932628488176 -0.75466498148339 21
0 1.18085596791464 1.04482289058781 21
0 -1.00662122501257 1.05919197955978 21
0 -0.157054103893772 -1.60493348734091 21
0 -0.774858364632744 -0.914704433088452 21
0 -1.13327593841877 -2.35362763919430 21
0 -0.301233480919724 -0.460178474483009 21
0 0.181393557658695 0.721525088422351 21
0 -0.538138167758262 -0.449781606805314 21
0 0.783324936732158 -0.365450871566037 21
0 0.573614198601641 -0.284379920725200 21
0 1.18187700000278 0.295915368774074 21
0 -0.46872787900165 0.332076141377095 21
0 0.0297627035376201 -0.295283829842303 21
1 1.31319436121788 -1.12044842831112 21
1 0.505545708839969 -0.585068527945692 21
1 0.840180761083502 0.158762307679421 21
1 2.15799112326749 -0.529955138175695 21
0 -1.02035636295294 0.82056833515643 21
1 0.0369390468022524 -0.407496418753728 21
0 -0.479994352081338 0.927388652904115 21
0 -0.922993408574858 0.781433427927929 21
1 -0.307512032757239 0.793344164922764 22
0 0.115018655058571 -0.0863814473989291 22
0 -0.171991675603027 1.32766128875728 22
0 -0.0695267218175115 1.89876075600269 22
0 -1.38865085131159 -0.196003082386862 22
0 -2.80624002495911 -1.12524640110414 22
0 1.12967426582775 -0.657734157723724 22
0 -0.976832193838693 0.23716472205266 22
1 1.53261612905124 0.245889243310388 22
0 0.585537305955095 -1.28142178166935 22
0 1.03585754169813 0.148064829863711 22
1 0.0063706625097436 1.12550644841089 22
1 1.91789145930371 -1.06313353226449 22
0 0.199132691194167 -1.48743000683762 22
1 1.84149826101963 -0.130003685073142 22
1 1.47283151535127 -0.348990760043939 22
0 0.383816936869458 1.14748093826432 22
0 -1.38305874359244 -0.80142599188741 22
1 1.85843469747238 -0.121582944447751 22
1 1.73141571297532 -0.790439523027202 22
0 -1.02673089512187 0.091890018179651 22
1 1.11081612459465 -0.264763033203195 22
0 0.558358491209774 -1.03463364848747 22
0 0.243347464052742 -0.60278978837622 22
0 0.534536569352243 0.00156808936825848 22
0 1.00614410038894 -2.52865796668551 22
0 -0.403962317933189 1.66921049953142 22
0 -0.294045175598332 -0.238988423658120 22
0 -0.676362600696124 -0.0392182778786299 22
1 0.446805804244298 0.491308108318127 22
0 -0.326234822088166 0.890954484089404 23
0 -1.82657624403363 0.181809970032308 23
0 -0.47631646683927 0.428662775458846 23
0 1.19881570909744 0.153138416938003 23
1 -1.08552362695782 0.0259121252783369 23
0 0.118531931598539 -1.25446854126668 23
0 -0.411100058007619 0.705200147139931 23
0 0.874961792826811 0.846350000578573 23
0 0.517823050953698 -1.39445488385576 23
0 -1.07643882072521 -1.69303535977243 23
0 -0.878850943234961 0.587365358005596 23
0 -1.40964883115877 -0.519298232170772 23
0 -0.124882913447400 0.279972476556546 23
0 0.0381998568046068 -1.47869218691650 23
0 1.84880456383161 -1.72497823403827 23
0 0.163249264050558 -0.317447803718512 23
0 -0.609445571763695 -0.451246745712258 23
0 -1.22381470033594 0.388796742799244 23
0 0.198080920769644 -1.06015129316363 23
0 -0.37989433973358 0.61128054688163 23
0 0.150078242182152 -0.392619591105715 23
0 -1.04227044668412 0.14024076161724 23
0 -0.415152830475172 -0.551410479601129 23
0 -0.514790694380296 -0.89992387366783 23
1 1.38291511932431 1.02036918615752 23
0 -1.08938863019774 -0.715506614875224 23
0 -0.147844209135426 -0.911345367214518 23
0 -0.546647998841834 1.33087890151799 23
0 -1.01983457025099 -0.27696570657433 23
0 -0.203204984808869 1.65487723514278 23
0 -1.16755681080634 -0.685097465767054 24
0 -0.0997470847399058 -1.15059886020236 24
0 1.53855528018936 -0.97016480855604 24
1 -0.343803621419355 1.76457568614555 24
0 1.00770279961635 0.249770950451444 24
0 2.26641862660301 0.569227300619502 24
0 0.538390922486461 -1.22484336382451 24
0 -1.22564810968063 0.220658826022425 24
0 0.285884373515213 -0.77701148640513 24
0 -1.00245465134810 -0.580694712224929 24
0 -0.0339822330899107 1.83788500659093 24
0 -0.912204272676936 0.543022846285731 24
0 1.26306810861323 -0.602171212396809 24
0 -0.0952850226185517 0.156126522378751 24
0 0.76223016089203 0.0070211994212416 24
0 -1.43891582070196 -1.53070791364994 24
0 0.42464849223795 0.501764804149733 24
0 0.510043984524038 -0.654448103284112 24
0 -0.567703088929011 -0.117894009307708 24
0 -1.42358459989365 -0.568106739102582 24
0 -1.15246547172212 2.28853656624881 24
1 1.43309275128209 0.905193630965345 24
0 0.501184235875358 0.332446397497448 24
0 0.00945057072066702 -1.62711461042993 24
0 2.12937940640769 1.16364234441865 24
0 -0.842186256750118 2.03553465139640 24
0 0.274868827016096 0.542345143596872 24
0 -0.0548892850689096 -1.17819053996697 24
0 0.869796950213005 -0.840996911601461 24
0 -0.0464454543000792 -0.0645163709855169 24
0 -0.23890845802898 2.09577994605619 25
0 0.299557115296246 0.74670412785083 25
1 1.69310480285494 -0.479686472714503 25
0 -0.546558122883724 0.116044548849761 25
0 0.0793956853505407 0.745428344116883 25
0 -1.19010001903975 -1.49246920508452 25
0 -0.969471930926585 0.252615374697239 25
0 -0.424442559208083 2.48635453727665 25
0 -1.40101307220114 1.3915729223426 25
0 -0.534234296970439 -1.72987125048728 25
0 -1.24665136328661 -0.104696434826391 25
0 -0.697010880246971 -0.89189647669366 25
1 3.13266193587642 -0.00536348644011856 25
1 1.44877906333457 -1.65100808143807 25
0 0.335973330152532 -0.252244325993639 25
0 0.190274700098407 -0.526648426094839 25
0 -0.703947380390457 0.295974212105942 25
0 -0.275215254905665 -1.11718756692863 25
0 -1.49337334361323 1.39795772437554 25
0 -0.755629857268438 0.0344454503071277 25
0 -0.151086216907116 1.08497546820611 25
0 -1.28363764888660 1.05453694422232 25
0 -1.25140285926501 1.91316835557301 25
0 0.93720919655649 -1.3956182727864 25
0 0.217662193917063 0.137312227169065 25
0 1.43730441738598 -0.40561417873691 25
0 -1.22496356884969 0.364105969046394 25
0 0.452598911981279 -0.283548627117107 25
0 -0.67080726595856 0.360482053685988 25
0 0.408446070251112 0.338680989723888 25
1 -1.56018893896899 1.37206704679335 26
0 0.0900341516039697 0.447831044116437 26
0 -1.85977785834727 0.748797365972187 26
1 -0.505749091085652 1.88646671228687 26
0 1.52922471558493 2.47278965601599 26
0 1.03053726597795 -0.478550833624283 26
0 -0.0870171720012745 -2.17948526570768 26
1 -0.381544250087392 2.45802332525101 26
1 0.196906128129992 0.0615732155654071 26
1 -0.191520734903644 -0.801865980004574 26
0 1.60017656217781 -0.301621609341602 26
0 -1.02205344758343 -1.21590198351106 26
0 -0.811303213815072 -2.52726659172226 26
0 1.03588624222890 -0.312529268273428 26
1 -0.667218643726994 -0.649281474282759 26
0 -1.59133477679926 -1.97099167807631 26
0 0.0373031658820746 0.931358430928478 26
0 0.102839478294933 -1.16650737323483 26
0 -1.16663952204414 0.495522203493648 26
0 0.725296466868997 1.15874629305505 26
0 0.154970189848987 -1.32517489474126 26
0 -0.424029139977570 -0.633513133003262 26
0 -0.594604460149257 -1.95389696469352 26
0 1.6721530952745 -0.776266717421722 26
0 -0.0176140956440492 -0.296059132457671 26
1 -0.888379521830832 1.97282847644371 26
1 -1.73919124015551 1.10760042281072 26
0 0.0637952293635686 1.90809410793724 26
0 -0.775162643691233 -1.28009375052183 26
0 -1.06642581902587 -0.192929476288895 26
1 1.69252032762783 -0.500620865285653 27
1 0.99234177011424 -0.411617223739249 27
1 0.077542503794092 -0.98880044825396 27
0 -0.608399623223175 -0.970224844076665 27
0 0.576538514390394 -2.90627497495297 27
0 -0.903955391590981 0.325858484224393 27
0 -0.453330278851448 1.11021891461383 27
1 1.81642509093527 -0.00652526888706355 27
0 -1.46406027946530 -1.59684556401661 27
1 -0.430849094477498 -0.84475037025255 27
1 0.573157563804419 0.253686537773903 27
0 -2.60670226870484 0.467748286226971 27
0 -2.04173079361907 -0.713731528493079 27
1 0.251763406136457 1.05751425580302 27
0 -0.726349134844929 -0.874757139392132 27
1 0.143559627442400 2.35783272563535 27
1 1.52894169504688 -0.886468564695659 27
1 1.35363931126917 2.30459496079738 27
1 0.849036046329303 1.87019352822687 27
1 -0.481252243947433 -0.53498150424071 27
0 -1.04549048212546 1.11937902514591 27
0 -1.62866205125608 0.178069075837072 27
1 -0.882283843662979 1.68756791105104 27
1 0.690262901686046 0.0205502574061394 27
0 -0.241536501768276 -0.289479627783977 27
1 0.45104200324038 -0.846736447330162 27
0 0.125095499488477 -3.46269388578144 27
1 0.949399939183653 -0.910349933917473 27
1 0.123193360612241 2.40615766266468 27
1 0.924749304239926 1.77107857545299 27
1 1.18262123169607 -1.0770028750267 28
0 -1.82952586604597 0.989352602163334 28
0 1.55719623696059 -0.659634812368886 28
1 0.522977811137677 -1.37155079007410 28
1 1.03202576161745 -0.806556947897602 28
1 1.11413522855997 0.252215051444443 28
1 0.950428931179031 -0.0960487601324759 28
1 -1.04383291319661 0.839886075743175 28
0 -0.562942303790394 0.838768451128838 28
0 -0.312272317146447 0.861593798128056 28
0 -0.672217099823377 0.0340501309907514 28
0 1.28063918373146 -1.45349767784975 28
0 -0.937314217615877 -0.503216035919759 28
0 0.708687498989957 0.834617813275994 28
0 0.831663236787463 -0.194764130772523 28
0 -0.179625785095183 -0.386830047887248 28
0 -2.01256256811822 -0.69829317190168 28
0 0.381850185099437 -0.474359438461021 28
1 1.08379364206006 1.50175339733174 28
0 -0.0297004135746841 0.438959395862471 28
0 1.45379312768231 0.624955765834754 28
1 -0.370281691227487 1.28436918334358 28
1 2.29341477982987 -0.129601265262925 28
0 0.207193820313301 0.669794679591892 28
0 0.421157594404356 1.091244111144 28
0 -0.322119358597287 0.111070803908535 28
0 0.882427128179176 -1.30699107953675 28
0 1.0476218564878 -1.80852085426895 28
1 0.834419000995315 0.890970920085738 28
1 2.20003195460197 0.149427347669074 28
0 -0.0905101012397135 -1.82993503469764 29
1 1.23325299224616 0.494289604173213 29
0 0.685347317810108 -0.172671051349583 29
0 1.05789120078412 -0.907803201077746 29
0 0.207375907042930 -0.515299355685532 29
0 0.435826374005326 -1.90945330522520 29
1 1.34028432224112 0.539872694292059 29
0 -1.10931473677543 -0.346759347253418 29
0 0.669881938717285 -0.750193430172544 29
0 -1.25360810083972 0.395551897055894 29
1 0.81296056221462 1.17039934814194 29
0 -1.92388094795694 0.64503860186135 29
0 0.0952769501779241 -1.81150954030585 29
0 -1.7813902412391 0.840361128325777 29
0 -0.779551391322467 2.23825169387159 29
0 0.242568659427344 -0.370466578788216 29
1 1.85581433262880 1.11698732675966 29
0 -1.15934979103717 0.609202153046251 29
0 0.795630234871605 -0.233102369265603 29
0 -1.43287008129182 0.490405619647644 29
1 2.26669484278084 0.381338273084701 29
0 -0.462707347064392 1.40602789857989 29
0 0.200593533344002 -0.233286336934925 29
0 -0.855935431794858 1.10180468114254 29
1 1.60786446907384 0.47093067728403 29
0 1.05523297950764 -1.12797171320202 29
0 1.21519522359700 0.299730388154583 29
0 1.00210208468768 -0.357872559407867 29
0 0.702797189716252 -1.62454733660948 29
0 1.60779100830893 -0.105783802790483 29
0 -0.531952894016549 -1.01740310607877 30
1 1.44882529352799 0.681895481124701 30
1 0.548384469764106 -0.0828860757758776 30
0 -0.163226052797180 -0.598783583635226 30
0 -1.14146750718479 0.483801267177237 30
0 -0.0314918589295942 0.0600773217681696 30
0 0.288472059213042 -1.30141049463516 30
0 0.107565672269718 0.176326132916380 30
0 0.66625948822712 2.86388057873676 30
0 0.156835711707007 1.32961816503809 30
0 -1.28694065888994 1.10361909161099 30
0 -1.18846973249183 -0.619487590497797 30
0 -1.51380635469206 -1.12915668675421 30
0 -1.17134706240151 0.0178740312838734 30
0 0.442624267480939 -0.430705264841602 30
0 1.17929499655708 -0.0158567584397257 30
0 -0.100219824494380 0.114554498292876 30
0 -2.48836924326975 -2.15537015476658 30
1 0.684815376861774 0.379599460358890 30
0 -1.11706009304473 -0.213392629937843 30
0 -1.09026979751000 -1.22920434079159 30
0 -1.68150006012392 -0.8584259420876 30
0 1.49090714864855 -1.76603610011475 30
0 -0.264599200949542 0.384494563084869 30
1 -0.516789920323952 1.24868066972612 30
1 0.59379776320524 2.037198756361 30
0 -1.62628715377043 0.0132928194887809 30
1 1.28589250779266 0.378714764086379 30
0 -0.0175253027185293 0.275067267924580 30
1 0.599932241553186 -0.172933752079525 30
1 0.568025967486367 1.51312257992713 31
0 -0.0171667315421332 -0.458234068149784 31
1 0.496419550746439 0.94974248857526 31
0 0.207145055809103 -1.06170378048368 31
0 -2.57899028839591 1.23029185352946 31
0 0.922521271065102 -1.2591205574326 31
0 -0.709615749999797 0.812184820557492 31
0 0.95697939790632 -0.714918331711232 31
0 -1.03482676593565 0.33059866459193 31
0 -1.46406220172074 -0.945078658522272 31
1 1.60305470509200 -0.603468165148643 31
1 2.23095433291494 0.70847940835935 31
0 0.46910287810457 0.339505190958510 31
0 1.38845377244414 0.948367194952934 31
0 -0.841283369779093 -0.543385322184171 31
0 -0.89684204430218 2.02680247912047 31
0 -0.280250908329323 -0.196616787000675 31
1 0.163549725170238 1.89887206663873 31
0 0.45373971518808 1.17837364053434 31
0 -0.314916534306533 0.793062031543245 31
0 0.0367834038026205 0.163175201027631 31
0 -1.57491021884182 0.321077781077429 31
0 -1.64697185127381 0.672760722164046 31
0 -1.01060352210931 -0.831552924463853 31
0 -0.0429909884207766 -1.75214933013751 31
0 -1.24753417553051 -1.67846780305366 31
0 -0.940512157209068 0.157966013390362 31
1 1.80535297232126 -1.32000689065008 31
0 -1.26159989355730 2.29355210189689 31
0 -0.868716291552895 0.844773553819418 31
0 0.216095770140806 -1.44338185242672 32
1 -0.208737023249632 0.0874649263997664 32
0 0.144949102577926 0.560225979487457 32
0 -0.113058929054180 2.00428947307390 32
0 0.433969620809148 -1.35440417465203 32
0 -0.194046035582743 -1.07892612678140 32
0 1.18599763803675 -0.493153763643435 32
0 -0.323571429137576 -0.221771979414027 32
0 0.73694382958152 -0.511173470679874 32
0 0.884195921784544 -0.811836284838324 32
0 1.62861412637237 1.04973263403742 32
0 -0.944125090659356 -1.44446909151807 32
0 -0.0327630774582896 0.882571997755937 32
0 1.54544454072332 0.535650302567806 32
0 0.477435690337875 0.319929741508776 32
0 -0.561511031258259 0.223971067263208 32
0 1.54978924922226 1.19973993458095 32
0 -0.271509051404668 -1.49789421456724 32
0 0.22920171636667 -0.0983848443211296 32
0 -0.549155425364687 0.610015254410945 32
0 1.13664376759897 -0.811755856192278 32
0 1.7209032063569 -0.525592311927352 32
1 0.216398902137129 0.58459574566049 32
0 -1.39851684488706 -0.337153929949655 32
0 -0.401785196726462 -0.58413942679452 32
0 0.712127999671192 -0.487179807400412 32
0 0.275301190329306 -1.27888830869209 32
1 0.892221445562822 0.731765574928885 32
0 -1.33643707767245 -0.0885041514401707 32
0 -1.39249298715095 0.272367552542975 32
0 -0.435044187971827 -0.152599263617162 33
0 0.556137257522227 1.32540819429938 33
0 -0.229841478458439 -1.06152143187682 33
0 -1.34475878668394 -0.252674092418901 33
0 -1.08887450878615 0.908197301725357 33
1 -0.100836663092512 0.506963504576155 33
0 0.211609959878819 -0.334809562492528 33
0 0.406402536561041 0.837740290000123 33
0 0.686215877214938 -0.294685704756879 33
0 0.169578662214155 -0.0338046122722492 33
0 -0.991066937196797 -0.308653756883524 33
0 -0.18161444368793 0.156425259215169 33
0 1.79061087867624 0.607993650094212 33
0 0.536951923443952 0.719050493058519 33
1 1.56976460630797 0.659592134740966 33
0 1.7741926181 0.199758730916888 33
1 1.03866281126407 1.09956368657074 33
0 0.57229202396274 0.127322378420519 33
0 1.09095923089499 1.11677647121042 33
0 -2.05991292248401 0.852935046028815 33
0 0.45233975633226 1.36885467383329 33
0 -0.201681544861177 0.850006510700148 33
1 0.360998572651341 0.595664317377182 33
0 -0.560570614500087 -0.132845977490904 33
0 -0.239446986854265 -0.0293533362907044 33
0 1.95418578404118 -0.109277400272273 33
1 0.14882071604307 0.739054518086478 33
0 -0.89196533302734 0.576479776139958 33
0 0.879236401729685 -0.442091933357997 33
1 1.29782679907877 1.94067868697945 33
1 0.62218117364493 0.752848225950944 34
0 0.0992493045890867 -1.88273438254301 34
0 -0.959464124768981 1.24777867544516 34
0 -0.171283717002419 -0.437810016712795 34
1 0.863518034073181 -0.169796944004820 34
0 -0.114830562219960 -2.03710047683965 34
0 -0.241278855924113 -0.762335396968167 34
0 1.26041783992824 0.0741781292442294 34
0 -0.321007788133158 -1.15943892702739 34
0 -1.98870971123678 0.235117022366558 34
0 -0.142431336627423 0.122979263486284 34
0 -1.74660661245163 0.940052183763837 34
0 0.535743279377669 -0.467537012134354 34
1 2.13935815083658 0.393849854449467 34
0 0.822981450307142 0.765767540559674 34
1 0.65685016668968 1.34297012732689 34
0 0.00445015010368195 1.22077687684860 34
1 0.428624647583222 0.478904184577184 34
0 -0.52168512005374 0.580739893212834 34
1 0.852878198719333 1.02947429539029 34
0 -0.632327987108088 1.88077995607729 34
0 -0.473127166020453 1.03703341340194 34
1 2.08679911783053 -1.17782756802619 34
0 0.694157497565013 -1.89205955280539 34
0 0.202031648196241 -0.996103270598506 34
0 -0.447712853360539 0.373290344356986 34
0 -0.940140509884492 0.455667745940236 34
0 -0.559701479657525 -0.390528043939936 34
0 0.142358337076513 0.272163746184128 34
0 0.722665212020313 -0.264924573858411 34
0 0.171578171358315 -0.84256289792041 35
0 -1.53246346675183 -0.600458708392764 35
0 -0.124619827358097 0.129673587345323 35
1 -1.44400899031355 0.643313258042392 35
1 -0.289838661403134 -0.0108140832393535 35
0 0.798972829258546 0.906302703320038 35
0 0.23303314498281 1.49932217047676 35
0 0.471823157049299 -1.98963320502194 35
0 0.304318073503345 -0.211737487508884 35
0 -1.14641897153576 -0.238496036128775 35
1 -0.978068231097714 2.01367175759270 35
0 -0.370810915796208 1.28905796766258 35
0 -0.371946315257336 -0.246561229178747 35
1 -0.981172550172825 1.26161105878654 35
0 1.01418926519813 0.705756612818995 35
1 -1.57087738321305 1.04626309678320 35
0 0.156331496051774 0.334100970757504 35
1 -0.867002061782054 0.282996655377804 35
0 -0.897380862820812 0.556076895658487 35
0 -2.53999727370706 -1.27928065628090 35
0 -0.059165282343189 2.35305846052828 35
0 -0.439941362075668 -0.388240002527981 35
0 -0.40069407144439 -1.75054201399514 35
0 0.147249146655498 0.489067534220923 35
0 1.43538525565586 0.230381544198631 35
1 0.686524031934441 0.316217195462482 35
0 -0.952689326732826 1.27051192536222 35
0 0.109664948414183 -0.226334673895838 35
0 1.16255167596435 0.874926505872412 35
0 0.242763044572235 1.21418144559431 35
0 -1.62452472475419 -1.25083254949684 36
0 0.681608907564711 1.06396031747701 36
1 1.12552114470179 -1.33352695841901 36
0 0.286254354965174 0.350561037521671 36
1 1.31219376311722 2.50341626606741 36
0 -0.220935235516831 -1.31507767235446 36
1 0.330996139640939 2.02816921896221 36
1 0.697882009667383 1.44789230789063 36
1 0.679592022319808 1.48661289274685 36
0 -0.597749171777849 0.348979914173066 36
0 0.0219548550141899 -0.8089251043759 36
0 1.12735288846299 -0.918808753930448 36
1 1.50545533134515 0.71191619337861 36
1 0.457426410916814 1.07734251760253 36
0 -0.751534512907216 -0.0790568875911928 36
0 -0.534049718350595 -0.499478746912996 36
1 1.43728797727963 0.0339346099264653 36
1 1.25670303140753 -1.11671399820031 36
1 1.01917842199109 -0.453007293345241 36
0 -1.41131483589653 1.19634787775172 36
0 -1.0886372966331 1.04376415307185 36
0 0.260529971989987 1.03875891929935 36
1 0.943000819494761 -0.117737369314224 36
1 0.179794510519171 -0.0158047191526089 36
0 0.746574361538626 0.439715884007684 36
1 1.19274565559680 1.08168911940164 36
0 -0.775720860890516 1.13509740234057 36
0 1.92667784264009 -1.13774566403546 36
0 0.162646896672268 -0.78693352240647 36
0 -0.0237552633641579 -2.08009174539077 36
0 1.72764023505393 -1.52531107294011 37
0 -1.20609548236476 -0.350729935001103 37
0 -0.496474551944354 1.15741861491568 37
0 0.818624656010395 1.3343635197404 37
0 -0.378686126801994 0.320711585603462 37
0 -0.220736353149926 -0.0617951165921405 37
0 -0.532576974510893 0.856095353829673 37
0 -1.08874161733530 -0.572306716117233 37
0 1.66354947355465 -0.655045427261152 37
0 0.923099915602742 1.43210633095733 37
0 -1.44474374359117 -0.572982836603795 37
0 -0.868179219446784 -1.78150575293452 37
0 1.12186277789507 0.719839314331108 37
0 1.52409136537473 -1.60156070331998 37
0 0.0207604270495041 0.900854116182646 37
0 0.0870166937510333 0.188902010342365 37
0 0.189948114128803 -0.879191591413344 37
0 -0.594344539071753 -0.121695083130955 37
0 0.0310695331388422 0.302724206668980 37
0 0.450234650597431 -0.498275918074686 37
0 -0.879029318562994 1.91325027124310 37
0 -1.19387358617754 0.415195589786768 37
0 -0.999996384705095 0.326182986607773 37
0 -1.22203210130583 2.37893693284816 37
0 -0.536791825105618 -0.0529613002133304 37
0 0.772007449612322 0.334701931528597 37
1 0.530740980453826 0.367298286612298 37
0 0.0411140579856267 1.00888498761251 37
1 0.0567886661529801 1.02437829187428 37
0 0.420286460619087 0.623985295230492 37
1 1.66834800941914 -1.10686490888038 38
0 0.47912853490631 -0.146104597480991 38
0 0.231147329299359 -0.587423220808043 38
0 -0.492539346026769 0.0401049770255624 38
0 -1.58479705235987 0.70945382707776 38
1 0.696630483538953 2.72987301776056 38
0 0.167701635593668 -0.711787689887907 38
0 -1.89590889969603 -2.74634013880162 38
0 0.173961416148024 -1.07640458984864 38
1 1.95061183115667 -1.70089195730905 38
0 -0.85070410850467 1.44770877129908 38
0 -1.02589702898522 0.813627590649892 38
0 -1.89694342324999 0.485841417357191 38
0 -0.0485975388147613 -0.334833510355513 38
1 1.53386246671855 0.652226282581049 38
0 -0.853430077526638 0.74580650192301 38
0 -0.42125440699162 0.336240501129863 38
0 -0.346315994398499 -1.63617266227481 38
0 -1.57585182389610 1.59160020737885 38
0 0.343920473740171 -0.397542898456665 38
0 -0.416552081147817 1.20769094437198 38
0 0.114591095308867 -2.23923109001922 38
0 -1.05628217432432 -0.833779840964469 38
0 0.0340174806748752 -0.592897335834245 38
0 -0.362120691857639 -0.162571560636123 38
0 -1.06801694532928 -0.45705568011104 38
0 -0.381467762021923 0.108696085378322 38
0 -0.674943413776984 0.265908564416539 38
1 0.471578268601214 0.456313360180369 38
1 1.77388130128131 0.28460452132468 38
0 -0.990084778542665 -1.24264460210266 39
1 0.417396171275101 0.45555066291817 39
0 -0.700200427313856 -1.03727908131435 39
1 0.182343173744050 -0.486032289451232 39
0 0.209268560290966 0.594715669055158 39
0 0.542370951547831 -0.510328331217877 39
0 -0.964491166345883 1.25401549522843 39
0 -0.588988551068334 1.02060422568312 39
0 0.979268635317629 1.10393087400257 39
0 -1.06953137518773 1.63499547451183 39
0 -0.404524959652447 -0.495297776633022 39
1 -0.0927247424198279 0.0669356061614732 39
1 0.60440477335908 -0.491661807200449 39
1 2.27523039984576 -2.33079361386238 39
1 0.603861262510237 2.05214553642473 39
1 1.63091282623697 -2.05371886137414 39
0 -0.204862872401721 -0.900565825454433 39
0 -0.186842288477909 -0.313668853373807 39
0 0.0700769451695344 -0.469010692777151 39
1 0.234234136996913 1.51713150373395 39
0 0.801563166695383 0.436498504259955 39
1 1.28015251069158 0.486762900213529 39
1 1.02159253285159 -1.41811169643953 39
0 -1.38345159498863 0.732422770177986 39
1 0.963505794870392 0.236726192415211 39
0 -0.530845067498996 -0.848416622165054 39
1 0.94007379927763 -0.503188616559407 39
0 0.435638952523949 0.787611025386835 39
1 0.440701540543045 -1.67724594069816 39
1 0.606980283130353 0.0553151888953395 39
0 -0.156687845353137 -1.73608237082408 40
0 -0.338869269241685 -0.538933429594183 40
0 -1.61202309905023 -1.51526755892797 40
0 -1.6857988460799 -0.840741904787467 40
0 -3.58316567391177 0.353921494211976 40
0 -0.624886984704605 0.465445910904814 40
0 0.494715180266173 -0.765464804666817 40
1 1.54961698344287 0.095539761252929 40
0 -0.71724914200417 -0.839336368096103 40
1 1.83173972382229 3.23420153541580 40
0 -0.881026763972126 0.203044852352187 40
0 -0.993283458743837 -1.48444045250744 40
0 -0.823625565420861 -0.154644474480169 40
1 0.373034958124586 1.44624819578258 40
0 -1.71894019820474 -0.292392073263137 40
0 -0.570782056757786 -0.587885498763908 40
0 -0.698184905477773 0.400334734498972 40
0 0.071336759198262 -0.135264917825602 40
1 1.67387948793201 -0.431050916544799 40
1 1.43668796054224 -1.12721211943633 40
0 0.0723269239414386 -0.458448001153458 40
0 -0.787520384502937 0.584094182554619 40
0 -1.10915325518602 0.834125902031889 40
0 -0.338635586450247 0.281559601883557 40
0 -0.384905124123718 1.54579997403642 40
0 0.325432769334364 -1.37091429447479 40
0 -2.1372613693834 -0.74786817026355 40
1 1.88516296477786 1.28200385042219 40
0 -0.979775143987174 0.0911843815159694 40
0 0.314287975814384 0.363736522370921 40
1 0.504897700819593 0.911306826746873 41
1 0.735636309053802 -0.642662722461345 41
1 0.770328664272931 0.663152627963951 41
0 0.78501582981929 2.01384098356543 41
0 -1.10196267327791 -1.43172535964848 41
0 -0.624445013910017 -0.324973725567104 41
0 0.193471127020558 0.688050402435609 41
0 0.739300608622684 0.554854398686915 41
0 0.484077308324122 0.591045585938717 41
0 -0.315156329192753 1.47491871751687 41
0 -1.07639258606322 0.0844636756059656 41
0 -0.85466763981071 0.941475978471808 41
0 -0.209437549637753 0.962503082686234 41
0 -0.836645546421282 -1.08620243533946 41
1 1.98309737117392 1.21177919330621 41
0 -2.07987758846258 -1.16475871749237 41
0 -0.63359813947065 -0.386459343291285 41
0 -0.486846462696719 -1.00238061037842 41
1 -0.501595573653452 2.53179726714301 41
1 1.56091305434077 1.69067856183085 41
0 -1.56629370435627 1.15729676411271 41
0 1.26185205681162 0.123921731409261 41
0 -0.279245852779316 -0.972618206618266 41
0 -0.0540033204790167 1.37782502977372 41
0 -1.22167619537809 -0.203146500180569 41
0 2.0235288373097 0.746404649850615 41
0 0.413458885528054 0.81799639822508 41
0 1.05028592898343 0.515858662280838 41
0 0.809001752032183 -0.55500797659237 41
0 -0.718849253294117 -1.26568192189986 41
0 -2.19727845601966 1.20589591796630 42
0 -2.08076690085422 -1.57131876361738 42
1 0.515356152031725 0.757777362444648 42
1 1.55892322488508 -1.09712562584618 42
0 0.721099074728238 -0.294040106758496 42
0 0.381519580484401 1.07940995367245 42
0 -0.214317467030172 -0.0743703822423824 42
0 -1.46908289493717 -0.5548997331201 42
0 -0.538354110117335 -0.794763013283968 42
1 0.22554458126164 -0.0172744000916591 42
0 0.170873364627911 0.599611258400306 42
0 0.255645871467294 -0.345567917198399 42
1 0.634205327968053 -0.60485865825505 42
0 -0.384661947970979 -0.0314711657138624 42
1 0.848617537849377 -0.579155163747934 42
0 0.366320041812594 -0.607145460323003 42
0 0.318974032831185 -1.91352651635572 42
1 1.32711613601321 -0.7233481

Re: glmmADMB: Generalized Linear Mixed Models using AD Model Builder

Roel de Jong
One thing I forgot to mention: the data were generated and fitted with
the binomial probit (not logit) link.
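For readers comparing the two links mentioned here: the probit inverse link is the standard normal CDF, while the logit inverse link is the logistic function. A minimal stdlib-only sketch (Python for illustration; the thread itself works in R):

```python
import math

def probit_inv(x):
    """Standard normal CDF: the inverse link of the probit model."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def logit_inv(x):
    """Logistic function: the inverse link of the logit model."""
    return 1.0 / (1.0 + math.exp(-x))

# Both map the linear predictor to a probability in (0, 1);
# the probit curve has lighter tails than the logistic one.
for lp in (-2.0, 0.0, 2.0):
    print(lp, round(probit_inv(lp), 4), round(logit_inv(lp), 4))
```

Both links agree at 0 (probability 0.5) but diverge in the tails, which is why coefficient estimates under the two links are not directly comparable.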

Roel de Jong

Roel de Jong wrote:

> Dear professor Bates,
>
> thank you for your reaction. To make sure that no errors occur in the
> data generation process I used the elegant function you so neatly
> provided to generate a couple of datasets under the model specification
> specified earlier. Running lmer with a Laplace approximation to the
> high-dimensional integral in the likelihood gives me a warning and then
> this show-stopper:
>
> Warning: IRLS iterations for PQL did not converge
> Error in objective(.par, ...) : Unable to invert singular factor of
> downdated X'X
>
> Fitting the dataset with glmmADMB gives no apparent problems and
> reasonable estimates. I attached the particular dataset to the email.
>
> The difference in computation time can be attributed to the fact that
> glmmadmb uses a generic technique called automatic differentiation with
> the Laplace approximation. The same technique can be employed to fit
> much more complex nonlinear models, but I'm sure Hans & Dave can tell
> more about it.
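The Laplace approximation referred to above replaces an intractable integral of exp(h(u)) by expanding h around its mode u*: the integral is approximated by exp(h(u*)) * sqrt(2*pi / -h''(u*)). A minimal numerical sketch of the idea (Python for illustration; a hypothetical one-dimensional example, not glmmADMB's implementation):

```python
import math

def laplace_approx(h, u0=0.0, tol=1e-10):
    """Approximate the integral of exp(h(u)) du over the real line:
    locate the mode of h by Newton's method with numeric derivatives,
    then return exp(h(mode)) * sqrt(2*pi / -h''(mode))."""
    eps = 1e-5
    u = u0
    for _ in range(100):
        d1 = (h(u + eps) - h(u - eps)) / (2 * eps)        # first derivative
        d2 = (h(u + eps) - 2 * h(u) + h(u - eps)) / eps**2  # second derivative
        step = d1 / d2
        u -= step
        if abs(step) < tol:
            break
    d2 = (h(u + eps) - 2 * h(u) + h(u - eps)) / eps**2
    return math.exp(h(u)) * math.sqrt(2 * math.pi / -d2)

# For a Gaussian log-integrand the approximation is exact:
# the integral of exp(-(u-1)^2/2) du equals sqrt(2*pi).
approx = laplace_approx(lambda u: -0.5 * (u - 1.0) ** 2)
print(approx, math.sqrt(2 * math.pi))
```

For non-Gaussian integrands (e.g. the Bernoulli likelihood contributions in these models) the approximation is no longer exact, which is the source of the small differences in log-likelihoods reported in this thread.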
>
> Best regards,
>     Roel de Jong
>
>
> Douglas Bates wrote:
>
>> On 12/15/05, Roel de Jong <[hidden email]> wrote:
>>
>>> Dear R-users,
>>>
>>> because lme(r) & glmmpql, which are based on Penalized Quasi Likelihood,
>>> are not very robust with Bernoulli responses,
>>
>>
>>
>> The current version of lmer takes method = "PQL" (the default),
>> "Laplace", or "AGQ", although AGQ is not available for vector-valued
>> random effects in that version, so one must be content with "PQL" or
>> "Laplace".
>>
>>
>>> I wanted to test glmmADMB.  I run the following simulation study:
>>
>>
>>
>>> 500 samples are drawn with the model specification:
>>> y = (intercept*f1+pred2*f2+pred3*f3)+(intercept*ri+pred2*rs)
>>>     where pred2 and pred3 are predictors distributed N(0,1)
>>>     f1..f3 are fixed effects, f1=-1, f2=1.5, f3=0.5
>>>     ri is random intercept with associated variance var_ri=0.2
>>>     rs is random slope with associated variance var_rs=0.4
>>>     the covariance between ri and rs "covr"=0.1
>>>
>>> 1500 units/dataset, class size=30
>>
>>
>>
>> Could you make the datasets, or the code that generates them,
>> available?  My code for such a simulation would be
>>
>> genGLMM <- function(nobs, gsiz, fxd, Sigma, linkinv = binomial()$linkinv)
>> {
>>     ngrp <- nobs/gsiz
>>     ranef <- matrix(rnorm(ngrp * ncol(Sigma)), nr = ngrp) %*% chol(Sigma)
>>     pred2 <- rnorm(nobs)
>>     pred3 <- rnorm(nobs)
>>     mm <- model.matrix(~pred2 + pred3)
>>     rmm <- model.matrix(~pred2)
>>     grp <- gl(n = ngrp, k = gsiz, len = nobs)
>>                                         # linear predictor
>>     lp <- as.vector(mm %*% fxd + rowSums(rmm * ranef[grp,]))
>>     resp <- as.integer(runif(nobs) < linkinv(lp))
>>     data.frame(resp = resp, pred2 = pred2, pred3 = pred3, grp = grp)
>> }
>>
>> Running this function gives
>>
>>> nobs <- 1500
>>> gsiz <- 30
>>> fxd <- c(-1, 1.5, 0.5)
>>> Sigma <- matrix(c(0.2, 0.1, 0.1, 0.4), nc = 2)
>>> set.seed(123454321)
>>> sim1 <- genGLMM(nobs, gsiz, fxd, Sigma)
>>> (fm1 <- lmer(resp ~ pred2 + pred3 + (pred2|grp), sim1, binomial))
>>
>>
>> Generalized linear mixed model fit using PQL
>> Formula: resp ~ pred2 + pred3 + (pred2 | grp)
>>    Data: sim1
>>  Family: binomial(logit link)
>>       AIC      BIC    logLik deviance
>>  1403.522 1440.714 -694.7609 1389.522
>> Random effects:
>>  Groups Name        Variance Std.Dev. Corr
>>  grp    (Intercept) 0.44672  0.66837
>>         pred2       0.55629  0.74585  0.070
>> # of obs: 1500, groups: grp, 50
>>
>> Estimated scale (compare to 1)  0.9032712
>>
>> Fixed effects:
>>              Estimate Std. Error z value  Pr(>|z|)
>> (Intercept) -1.081710   0.121640 -8.8927 < 2.2e-16
>> pred2        1.607273   0.141697 11.3430 < 2.2e-16
>> pred3        0.531071   0.072643  7.3107 2.657e-13
>>
>>> system.time(fm1 <- lmer(resp ~ pred2 + pred3 + (pred2|grp), sim1,
>>> binomial))
>>
>>
>> [1] 0.33 0.00 0.33 0.00 0.00
>>
>>> (fm2 <- lmer(resp ~ pred2 + pred3 + (pred2|grp), sim1, binomial,
>>> method = "Laplace"))
>>
>>
>> Generalized linear mixed model fit using Laplace
>> Formula: resp ~ pred2 + pred3 + (pred2 | grp)
>>    Data: sim1
>>  Family: binomial(logit link)
>>       AIC      BIC    logLik deviance
>>  1401.396 1438.588 -693.6979 1387.396
>> Random effects:
>>  Groups Name        Variance Std.Dev. Corr
>>  grp    (Intercept) 0.35248  0.59370
>>         pred2       0.46641  0.68294  0.077
>> # of obs: 1500, groups: grp, 50
>>
>> Estimated scale (compare to 1)  0.9854841
>>
>> Fixed effects:
>>              Estimate Std. Error z value  Pr(>|z|)
>> (Intercept) -1.119008   0.121640 -9.1993 < 2.2e-16
>> pred2        1.680916   0.141697 11.8627 < 2.2e-16
>> pred3        0.543548   0.072643  7.4825 7.293e-14
>>
>>> system.time(fm2 <- lmer(resp ~ pred2 + pred3 + (pred2|grp), sim1,
>>> binomial, method = "Laplace"))
>>
>>
>> [1] 4.62 0.01 4.65 0.00 0.00
>>
>> Fitting that model using glmmADMB gives
>>
>>> (fm3 <- glmm.admb(resp ~ pred2 + pred3, ~ pred2, "grp", sim1,
>>> "binomial", "logit", "full"))
>>
>>
>> ...
>> iteration output omitted
>> ...
>>
>> GLMM's in R powered by AD Model Builder:
>>
>>   Family: binomial
>>
>> Fixed effects:
>>   Log-likelihood: -602.035
>>   Formula: resp ~ pred2 + pred3
>> (Intercept)       pred2       pred3
>>    -1.11990     1.69030     0.54619
>>
>> Random effects:
>>   Grouping factor: grp
>>   Formula: ~pred2
>> Structure: General positive-definite
>>                StdDev      Corr
>> (Intercept) 0.5890755
>> pred2       0.6712377 0.1023698
>>
>> Number of Observations: 1500
>> Number of Groups: 50
>>
>> The "Laplace" method in lmer and the default method in glmm.admb,
>> which according to the documentation is the Laplace approximation,
>> produce essentially the same model fit.  One difference is the
>> reported value of the log-likelihood, which we should cross-check, and
>> another difference is in the execution time.
>>
>>
>>> system.time(fm3 <- glmm.admb(resp ~ pred2 + pred3, ~ pred2, "grp",
>>> sim1, "binomial", "logit", "full"))
>>
>>
>> ...
>> Iteration output omitted
>> ...
>> [1]  0.23  0.02 21.44 19.45  0.24
>>
>> Fitting this model takes about 4.7 seconds with the Laplace
>> approximation in lmer (and only 0.33 seconds for PQL, which is not
>> that far off) and about 20 seconds in glmm.admb
>>
>>
>>
>>
>>> convergence:
>>> ~~~~~~~~~~~~
>>> No crashes.
>>> 5/500 datasets had, on exit, a gradient of the log-likelihood > 0.001,
>>> though. Removing the datasets with questionable convergence doesn't seem
>>> to affect the simulation analysis.
>>>
>>> bias:
>>> ~~~~~~
>>> f1=-1.00531376
>>> f2= 1.49891060
>>> f3= 0.50211520
>>> ri= 0.20075947
>>> covr=0.09886267
>>> rs= 0.38948382
>>>
>>> Only the random slope "rs" is somewhat low, but I don't think the
>>> difference is significant.
>>>
>>> coverage alpha=.95: (using asymmetric confidence intervals)
>>> ~~~~~~~~~~~~~~~~~~~~~~~~
>>> f1=0.950
>>> f2=0.950
>>> f3=0.966
>>> ri=0.974
>>> covr=0.970
>>> rs=0.970
>>>
>>> While some coverages are somewhat high, confidence intervals based on
>>> asymptotic theory will not have exactly the nominal coverage level, but
>>> with simulations (parametric bootstrap) that can be corrected for.
>>>
>>> I can highly recommend this excellent package to anyone fitting these
>>> kinds of models, and want to thank Hans Skaug & Dave Fournier for their
>>> hard work!
>>
>>
>>
>> I agree.  I am particularly pleased that Otter Research allows access
>> to a Linux executable of their code (although I would, naturally,
>> prefer the code to be Open Source).
>>
>>
>>> Roel de Jong.
>>>
>>>
>>> Hans Julius Skaug wrote:
>>>
>>>> Dear R-users,
>>>>
>>>> Half a year ago we put out the R package "glmmADMB" for fitting
>>>> overdispersed count data.
>>>>
>>>> http://otter-rsch.com/admbre/examples/glmmadmb/glmmADMB.html
>>>>
>>>> Several people who used this package have requested
>>>> additional features. We now have a new version ready.
>>>> The major new feature is that glmmADMB allows Bernoulli responses
>>>> with logistic and probit links. In addition there is
>>>> a "ranef.glmm.admb()" function for getting the random effects.
>>>>
>>>> The download site is still:
>>>>
>>>> http://otter-rsch.com/admbre/examples/glmmadmb/glmmADMB.html
>>>>
>>>> The package is based on the software ADMB-RE, but the full
>>>> unrestricted R-package is made freely available by Otter Research Ltd
>>>> and does not require ADMB-RE to run. Versions for Linux and Windows
>>>> are available.
>>>>
>>>> We are still happy to get feedback from users, and to get suggestions
>>>> for improvement.
>>>>
>>>> We have set up a forum at http://www.otter-rsch.ca/phpbb/ for
>>>> discussions
>>>> about the software.
>>>>
>>>> Regards,
>>>>
>>>> Hans
>>>>
>>>> _____________________________
>>>> Hans Julius Skaug
>>>>
>>>> Department of Mathematics
>>>> University of Bergen
>>>> Johannes Brunsgate 12
>>>> 5008 Bergen
>>>> Norway
>>>> ph. (+47) 55 58 48 61
>>>>
>>>> ______________________________________________
>>>> [hidden email] mailing list
>>>> https://stat.ethz.ch/mailman/listinfo/r-help
>>>> PLEASE do read the posting guide!
>>>> http://www.R-project.org/posting-guide.html
>>>>
>>>
>>> ______________________________________________
>>> [hidden email] mailing list
>>> https://stat.ethz.ch/mailman/listinfo/r-help
>>> PLEASE do read the posting guide!
>>> http://www.R-project.org/posting-guide.html
>>>
>>
>>
>
> ------------------------------------------------------------------------
>
> dep pred2 pred3 grp
> 0 -0.832299769325581 -1.67880317415923 1
> 0 -0.308299055464121 -0.364611545217099 1
> 1 0.630598386837566 -0.830616438489193 1
> 1 3.10913858356640 -0.521767356153283 1
> 1 0.903471812881371 1.91484272352174 1
> 0 0.90426769897266 -0.513292891585524 1
> 1 1.14495484917281 0.78473139460995 1
> 0 -0.234525952851387 -0.50697276499942 1
> 1 0.631041223657181 0.0885911624846033 1
> 0 -1.28176856712309 0.381144997887118 1
> 0 -0.199575490516357 -0.816864500080217 1
> 0 1.1185927379226 -1.89658827063818 1
> 0 -0.166954429106639 -1.72768529536314 1
> 0 0.410078885348753 -2.37013110454109 1
> 0 0.37016590773711 -0.320034871756389 1
> 0 -1.29319553111599 -0.243436474810949 1
> 1 1.44188541761604 0.46171661294066 1
> 1 1.03439142621889 -0.0934696125911877 1
> 1 0.941957044041486 1.46031495794676 1
> 0 0.63243411925538 -0.376900200288554 1
> 0 0.782797578938882 -2.17296669618680 1
> 0 -0.0434466627421846 -0.897181210612278 1
> 0 -0.918950768280257 1.40593647574051 1
> 0 -0.266797680590651 1.32852283384323 1
> 0 -1.97795808769759 -0.340096532983713 1
> 0 0.338662524103419 0.317005354534584 1
> 1 -0.778330737845841 1.83714022902796 1
> 1 0.167661592741467 -0.898198193914343 1
> 0 0.108266386948271 -1.31922947083775 1
> 1 0.799587591780234 1.20654772520622 1
> 0 -1.39792357224312 -1.17382560290430 2
> 1 1.41625128975700 0.0321732274019762 2
> 1 -0.0781989965585817 1.66218225184847 2
> 0 -1.57620982406135 0.122918359084620 2
> 0 -0.169615551813902 -1.32992468631561 2
> 0 0.0744278240167874 -0.793642608076887 2
> 1 1.33756270180946 1.43265155843536 2
> 0 0.171687202226864 -0.197474664007768 2
> 0 -0.253842074055872 0.422229019624458 2
> 1 1.51037367919741 0.708493875055813 2
> 0 0.126257617278352 -0.698370297940141 2
> 0 -1.41052021563966 -0.580830471211872 2
> 0 -0.740991848275336 -0.831723737484574 2
> 0 -1.38752149645338 -0.0645813982998873 2
> 0 -1.48202337888827 -0.810824176493156 2
> 0 0.434323217159074 -0.235959885057200 2
> 0 -0.3660190454198 -0.207497034351587 2
> 0 -1.55757318106926 -0.649666741994972 2
> 0 0.200298815968865 -1.25786648414456 2
> 1 0.60068195996731 1.39546043864496 2
> 1 0.92181552285589 0.169900540315928 2
> 1 0.963639582191724 -0.96652696802166 2
> 0 -1.10894528405182 -0.826447468782818 2
> 1 -0.143705873984295 0.85696358848945 2
> 1 -0.227197073907219 0.163108840069261 2
> 0 0.260778722736436 0.702350481108482 2
> 1 0.0079839656810117 0.640974302851168 2
> 1 0.520335955317318 0.82948476190548 2
> 0 -0.470053854425421 -1.41586935932929 2
> 0 0.318269807834773 0.221705522455281 2
> 0 -0.490245326301728 0.553620570804301 3
> 0 -0.276149978720162 -0.88110452374587 3
> 0 0.220721777999604 -0.653256457257882 3
> 0 -0.703075262079484 0.329683608971329 3
> 0 -0.22696243906353 -2.61950724697562 3
> 0 -1.09431005668124 -0.0232909455712828 3
> 1 1.28711954899881 0.344320685687299 3
> 0 -0.151754090955171 -0.0744843243935522 3
> 0 0.341919519960378 -0.0964112795642981 3
> 0 -1.12551027730654 -0.84933031360325 3
> 0 -0.83446590590987 -0.158563062623380 3
> 0 -0.519454023512489 -1.24574306199025 3
> 0 0.273951811480161 -0.738838977465222 3
> 1 0.644417981459438 0.48825723724319 3
> 0 0.80733294439454 0.38447235881228 3
> 0 -1.30801233701682 1.32830707446622 3
> 0 -0.691273996772228 -1.03695064187955 3
> 0 -0.55275315135725 0.712670673357253 3
> 1 1.18912706586104 -0.797779680340233 3
> 0 0.173813919484423 -0.212804192064869 3
> 0 0.510051214308946 0.407985297642093 3
> 0 0.0460404954363198 0.0781827813302751 3
> 0 -0.812726427004377 -0.961698454122676 3
> 0 0.316803703625498 0.436795213783597 3
> 0 1.59372338969642 -0.646972799801143 3
> 0 -0.264479048468343 0.436583419258436 3
> 1 1.79785791935398 -2.19022854850479 3
> 0 -1.25542914249998 0.759841939131259 3
> 0 -0.556703265843941 -0.596140993896742 3
> 0 -0.138859132587877 0.353420559076652 3
> 0 -0.54936814739631 0.5651403571148 4
> 0 0.0865897190005032 -1.28701512066411 4
> 0 -0.59359121771434 0.137688285767041 4
> 0 1.29103265636505 -1.20664947663906 4
> 0 0.593107076808349 0.398984411079726 4
> 0 -0.226418280356368 0.252338568472061 4
> 0 -0.622343771507656 -0.150521153702500 4
> 0 -0.430960330395192 -0.61365230036041 4
> 0 -0.897917002717756 1.24600197335951 4
> 0 -0.131652665221839 -1.14902442133866 4
> 1 1.05932918053364 -0.208043968099735 4
> 1 2.21374179592236 -2.11459091350310 4
> 0 -0.0984023158102187 0.620618589465858 4
> 0 -0.301419555524483 -1.51777513767374 4
> 0 0.705447946370506 0.535510970861683 4
> 0 -0.740935396882992 -0.554365880210576 4
> 1 1.05185254417717 1.44454045163646 4
> 0 -0.976862868950618 -0.28295698647597 4
> 0 -1.81719506639831 -0.214322369708696 4
> 0 -0.249925818554222 0.183451295261106 4
> 0 -2.21228328895419 0.123877231369233 4
> 1 1.38439754243109 0.505675670636377 4
> 0 -0.837221588634301 -1.50135483614086 4
> 0 -1.17154448913943 -0.708764217205316 4
> 1 1.50155644902162 0.43999620318975 4
> 0 -0.516075651409885 1.10156000900777 4
> 0 -0.583328580107054 -0.136624407275739 4
> 0 -0.805969095827047 1.23242239143692 4
> 0 0.520037308192361 -0.0532028414333528 4
> 0 -1.07675357578869 -0.246422992512653 4
> 1 0.84674606454176 1.24706822832629 5
> 1 0.50225429700895 -0.753501502939139 5
> 0 -0.110696991251422 -1.28963013849050 5
> 1 1.45669586789922 -0.0266212621300831 5
> 0 1.19164829320594 -1.6384147982708 5
> 1 1.2497582567896 -0.366144380048512 5
> 0 -1.59200279904562 -0.605612412386137 5
> 1 0.701642320563013 -0.692590957893450 5
> 0 0.826729750207121 0.104040242709383 5
> 0 -0.251589578605641 -1.20008533845167 5
> 0 1.18981892198469 -1.03838218637413 5
> 0 -0.48226798541373 -0.103516021510253 5
> 1 0.902430923145418 -1.25114657112499 5
> 0 0.480518605880176 0.223857697801260 5
> 0 -2.25438268634199 1.21126580568773 5
> 0 -0.383234731665990 -0.766293999344111 5
> 0 0.581319800323777 0.817394753080253 5
> 0 -3.3165427929598 -0.121062111785958 5
> 0 -0.1322580996873 -0.150578017302552 5
> 1 1.53193807897104 0.733939342102887 5
> 1 1.11908937603262 0.196818601090142 5
> 0 0.437875954273944 -0.0197136588508132 5
> 0 -0.0331688915453021 0.33400847926927 5
> 1 1.34543708556248 0.00907648260071153 5
> 0 0.213075347139783 -0.646249613641689 5
> 0 -0.69524684216634 -2.33922277678765 5
> 0 -1.57279905654483 -1.87538951834327 5
> 1 1.07936855487838 1.18446965022099 5
> 1 0.306985931587578 0.724405953653822 5
> 1 1.23181672693378 -0.0701925304204026 5
> 0 -0.919888911082209 0.477276404654286 6
> 0 -0.663571170286572 -1.46630908083576 6
> 0 -0.724039231476495 -0.061019673195811 6
> 0 0.0634460402565889 -1.08461362641975 6
> 1 -0.269323637008873 0.731446475758064 6
> 1 -0.146752286719258 1.51369083876776 6
> 0 -0.35095531255735 -0.234620941343001 6
> 0 0.273206507398862 -0.393903382508312 6
> 1 -0.182666577034182 1.83249929313715 6
> 0 0.681588488289945 -0.375859982611083 6
> 0 0.242434133831772 -1.64320325331244 6
> 1 0.188017184214761 -0.228076562308703 6
> 1 0.166649765922797 0.511793542958201 6
> 1 2.16402504068409 1.66375788446841 6
> 1 0.586097780802391 -0.289750400216476 6
> 0 -0.757716234538208 -0.356355379036605 6
> 0 -1.11703624150270 -1.02103411104842 6
> 1 1.32065912451240 -0.529344207548823 6
> 0 -0.563323776359257 -0.575116970325914 6
> 0 0.0761081554417085 0.466755576888024 6
> 0 -0.0393307073575922 -2.09949197028072 6
> 0 -1.45391429237201 -1.78607291521398 6
> 0 0.180700026079273 0.785140819957987 6
> 1 0.842204310528479 -0.078203655202921 6
> 0 0.957483010083191 0.0933486661821926 6
> 0 -1.34841123138060 -1.25281641699431 6
> 0 0.460944146864481 -0.516097804035054 6
> 1 1.67680834988551 -0.7539974457697 6
> 1 0.204256257528345 -0.389312192486669 6
> 1 0.546750712218541 3.06217530791803 6
> 0 0.0172078969431791 -0.101707833219385 7
> 0 -1.37010127340567 -1.02691608953111 7
> 0 2.56322137123985 -0.773240110395935 7
> 0 -0.507174946504427 0.040499359830619 7
> 1 1.47453663326518 0.394613501208432 7
> 1 0.566343961852793 -0.358372221352661 7
> 0 1.07987945451752 -1.26905542641279 7
> 0 0.48707001768777 -0.832979997875164 7
> 0 -2.55931246086968 -0.0387626614737304 7
> 0 0.6047256807427 -1.27952621228527 7
> 0 -0.800975639650827 -1.19892253215496 7
> 0 0.716445890243245 1.86048748902398 7
> 0 2.57842613918261 1.11040509258097 7
> 0 0.161390247713138 -1.25216129083268 7
> 0 0.692254615374798 -0.273053701220116 7
> 0 0.580742663805351 -0.440022666876242 7
> 0 0.0333740430940438 -1.29808134689932 7
> 0 0.429181268475771 0.891622761392008 7
> 0 -0.957211975299114 1.74875584749294 7
> 0 -0.320315263904051 1.38442720841526 7
> 0 -0.518641939927687 -0.617700091693557 7
> 0 -1.20846935175166 -1.08696366448480 7
> 0 0.409554570285672 0.144434888179345 7
> 0 -0.515552017873684 -0.070428657566001 7
> 0 -0.462212101043838 -1.46343847177255 7
> 0 -0.00462842110860983 0.86885032383084 7
> 0 1.11215966563818 -2.19016842842290 7
> 0 -0.095943103004817 -0.265688534909634 7
> 0 -0.813739915796608 1.04977208376393 7
> 1 -0.975574120042433 -0.478434046891119 7
> 0 -1.12185075962718 -0.462242205693225 8
> 1 -0.334687794852943 -1.01829805236460 8
> 1 1.38611021083734 0.503281001944003 8
> 1 0.920372355987816 0.305833067019153 8
> 0 -0.884970870283891 1.960905790486 8
> 1 1.28985617901913 -0.00149956005647031 8
> 0 0.0778815214946348 -0.517956556797686 8
> 0 -2.06184563024002 0.0280155538025018 8
> 0 -0.317141416850045 0.314623807401671 8
> 0 -1.49653749498563 1.06981480191435 8
> 0 -2.43843441106820 -0.239240224091172 8
> 0 -0.0151669309041355 -2.14961221758577 8
> 0 -0.87343341371374 0.0380076279097493 8
> 0 -1.5467719747319 -0.491996493369805 8
> 1 0.902122956206809 -0.892149687265836 8
> 0 -0.376632917446526 1.42669823620577 8
> 0 0.486194125509503 0.807454372286888 8
> 0 0.916156290278285 1.54950897817856 8
> 1 2.13593485939650 0.0531118603524952 8
> 1 1.42158974742330 3.83829659950465 8
> 0 -0.122253419813331 -0.0745502942393836 8
> 0 0.157383928573792 -1.14170627323008 8
> 0 -0.978439574811127 1.12166052946801 8
> 0 -0.985687025250555 0.3774721170096 8
> 1 0.713852538831987 1.43188231151794 8
> 0 -0.649829588740366 0.0584520738567654 8
> 0 0.193563052726598 0.512951309429883 8
> 0 -0.538005837648731 -0.112951562688504 8
> 0 -0.788831478128212 0.462358950603167 8
> 0 0.374094832725565 -1.32991485627294 8
> 1 0.352241229145249 0.562250011286888 9
> 0 -1.05215631778539 -1.74114055399709 9
> 0 -0.522589687963368 0.328414193078884 9
> 1 1.91980455298025 1.28117152841264 9
> 0 -1.19698337175958 0.137231274598480 9
> 0 -0.342957737575631 0.550318780019834 9
> 0 -1.00877030558756 0.742813587739608 9
> 0 -0.609052194629098 0.0570850107105122 9
> 0 0.94981286699914 -0.268572238949552 9
> 0 -0.159282637164353 0.75336912846542 9
> 0 2.00752437715434 -0.74125296307598 9
> 1 0.979699978359859 1.30618729994550 9
> 0 0.380482418123161 0.490924714141324 9
> 0 -0.601233831152873 0.543315677225777 9
> 0 -0.7013515108379 -0.664568420969133 9
> 0 -0.652526671596726 0.0763605255068936 9
> 0 -0.822704879841393 -0.647393531285262 9
> 0 -1.49663242578037 -0.0679862440883276 9
> 0 -1.58358195089004 -0.60109179531163 9
> 0 1.04029732398240 0.440759719452625 9
> [... 834 simulated data rows omitted: each row holds a binary response (0/1), two continuous predictors (apparently pred2 and pred3 from the model specification above), and a group index, covering groups 9 through 37 in this stretch ...]
> 0 0.0207604270495041 0.900854116182646 37
> 0 0.0870166937510333 0.188902010342365 37
> 0 0.189948114128803 -0.879191591413344 37
> 0 -0.594344539071753 -0.121695083130955 37
> 0 0.0310695331388422 0.302724206668980 37
> 0 0.450234650597431 -0.498275918074686 37
> 0 -0.879029318562994 1.91325027124310 37
> 0 -1.19387358617754 0.415195589786768 37
> 0 -0.999996384705095 0.326182986607773 37
> 0 -1.22203210130583 2.37893693284816 37
> 0 -0.536791825105618 -0.0529613002133304 37
> 0 0.772007449612322 0.334701931528597 37
> 1 0.530740980453826 0.367298286612298 37
> 0 0.0411140579856267 1.00888498761251 37
> 1 0.0567886661529801 1.02437829187428 37
> 0 0.420286460619087 0.623985295230492 37
> 1 1.66834800941914 -1.10686490888038 38
> 0 0.47912853490631 -0.146104597480991 38
> 0 0.231147329299359 -0.587423220808043 38
> 0 -0.492539346026769 0.0401049770255624 38
> 0 -1.58479705235987 0.70945382707776 38
> 1 0.696630483538953 2.72987301776056 38
> 0 0.167701635593668 -0.711787689887907 38
> 0 -1.89590889969603 -2.74634013880162 38
> 0 0.173961416148024 -1.07640458984864 38
> 1 1.95061183115667 -1.70089195730905 38
> 0 -0.85070410850467 1.44770877129908 38
> 0 -1.02589702898522 0.813627590649892 38
> 0 -1.89694342324999 0.485841417357191 38
> 0 -0.0485975388147613 -0.334833510355513 38
> 1 1.53386246671855 0.652226282581049 38
> 0 -0.853430077526638 0.74580650192301 38
> 0 -0.42125440699162 0.336240501129863 38
> 0 -0.346315994398499 -1.63617266227481 38
> 0 -1.57585182389610 1.59160020737885 38
> 0 0.343920473740171 -0.397542898456665 38
> 0 -0.416552081147817 1.20769094437198 38
> 0 0.114591095308867 -2.23923109001922 38
> 0 -1.05628217432432 -0.833779840964469 38
> 0 0.0340174806748752 -0.592897335834245 38
> 0 -0.362120691857639 -0.162571560636123 38
> 0 -1.06801694532928 -0.45705568011104 38
> 0 -0.381467762021923 0.108696085378322 38
> 0 -0.674943413776984 0.265908564416539 38
> 1 0.471578268601214 0.456313360180369 38
> 1 1.77388130128131 0.28460452132468 38
> 0 -0.990084778542665 -1.24264460210266 39
> 1 0.417396171275101 0.45555066291817 39
> 0 -0.700200427313856 -1.03727908131435 39
> 1 0.182343173744050 -0.486032289451232 39
> 0 0.209268560290966 0.594715669055158 39
> 0 0.542370951547831 -0.510328331217877 39
> 0 -0.964491166345883 1.25401549522843 39
> 0 -0.588988551068334 1.02060422568312 39
> 0 0.979268635317629 1.10393087400257 39
> 0 -1.06953137518773 1.63499547451183 39
> 0 -0.404524959652447 -0.495297776633022 39
> 1 -0.0927247424198279 0.0669356061614732 39
> 1 0.60440477335908 -0.491661807200449 39
> 1 2.27523039984576 -2.33079361386238 39
> 1 0.603861262510237 2.05214553642473 39
> 1 1.63091282623697 -2.05371886137414 39
> 0 -0.204862872401721 -0.900565825454433 39
> 0 -0.186842288477909 -0.313668853373807 39
> 0 0.0700769451695344 -0.469010692777151 39
> 1 0.234234136996913 1.51713150373395 39
> 0 0.801563166695383 0.436498504259955 39
> 1 1.28015251069158 0.486762900213529 39
> 1 1.02159253285159 -1.41811169643953 39
> 0 -1.38345159498863 0.732422770177986 39
> 1 0.963505794870392 0.236726192415211 39
> 0 -0.530845067498996 -0.848416622165054 39
> 1 0.94007379927763 -0.503188616559407 39
> 0 0.435638952523949 0.787611025386835 39
> 1 0.440701540543045 -1.67724594069816 39
> 1 0.606980283130353 0.0553151888953395 39
> 0 -0.156687845353137 -1.73608237082408 40
> 0 -0.338869269241685 -0.538933429594183 40
> 0 -1.61202309905023 -1.51526755892797 40
> 0 -1.6857988460799 -0.840741904787467 40
> 0 -3.58316567391177 0.353921494211976 40
> 0 -0.624886984704605 0.465445910904814 40
> 0 0.494715180266173 -0.765464804666817 40
> 1 1.54961698344287 0.095539761252929 40
> 0 -0.71724914200417 -0.839336368096103 40
> 1 1.83173972382229 3.23420153541580 40
> 0 -0.881026763972126 0.203044852352187

Re: glmmADMB: Generalized Linear Mixed Models using AD Model Builder

Hans Skaug
In reply to this post by Hans Skaug
Douglas Bates wrote:

>
>The "Laplace" method in lmer and the default method in glmm.admb,
>which according to the documentation is the Laplace approximation,
>produce essentially the same model fit.  One difference is the
>reported value of the log-likelihood, which we should cross-check, and
>another difference is in the execution time
>

Yes, glmmADMB is missing the sqrt(2*pi) constants in the reported log-likelihood. Thanks for pointing that out.

Execution time: As pointed out by Roel de Jong, the underlying software AD Model Builder
does not use hand-coded derivatives for the Hessian involved in the Laplace approximation,
but calculates them by automatic differentiation. There is a cost in execution
speed, but on the other hand it is very quick to develop new models, since you do not
have to worry about derivatives. I hope to exploit this beyond standard GLMMs and provide
other R packages.

Comparison of glmmADMB with lmer: I find that the two packages do not give the same
result on one of the standard datasets in the literature (Lesaffre et al., Appl. Statist. (2001) 50, Part 3, pp. 325-335).
The full set of R commands used to download the data and fit the model is given at the end of this email.

> fit_lmer_lapl <- lmer(y~ treat + time  + (1|subject),data=lesaffre,family=binomial,method="Laplace")
Warning message:
optim or nlminb returned message ERROR: ABNORMAL_TERMINATION_IN_LNSRCH
 in: LMEopt(x = mer, value = cv)

PART OF OUTPUT:

Fixed effects:
             Estimate Std. Error  z value Pr(>|z|)    
(Intercept) -0.626214   0.264996  -2.3631  0.01812 *  
treat       -0.304660   0.360866  -0.8442  0.39853    
time        -0.346605   0.026666 -12.9979  < 2e-16 ***

The corresponding estimates with glmmADMB is:

> fit_glmmADMB <- glmm.admb(y~ treat + time,random=~1,group="subject",data=lesaffre,family="binomial",link="logit")

PART OF OUTPUT:

Fixed effects:
  Log-likelihood: -359.649
  Formula: y ~ treat + time
(Intercept)       treat        time
   -2.33210    -0.68795    -0.46134


So, the estimates of the fixed effects differ. lmer() does in fact produce a warning, and it appears that
method="Laplace" and method="PQL" produce the same results.


Best regards,

hans


# Load data
source("http://www.mi.uib.no/~skaug/cash/lesaffre_dat.s")

# Run lmer
library(lme4)
fit_lmer <- lmer(y~ treat + time + (1|subject),data=lesaffre,family=binomial)
fit_lmer_lapl <- lmer(y~ treat + time  + (1|subject),data=lesaffre,family=binomial,method="Laplace")


# Run glmmADMB
library(glmmADMB)
example(glmm.admb) # Must be run once in each new directory (this feature will be removed in a future version of glmmADMB).
fit_glmmADMB <- glmm.admb(y~ treat + time,random=~1,group="subject",data=lesaffre,family="binomial",link="logit")

______________________________________________
[hidden email] mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html

Re: glmmADMB: Generalized Linear Mixed Models using AD Model Builder

Douglas Bates
On 12/19/05, Hans Julius Skaug <[hidden email]> wrote:

> Douglas Bates wrote:
>
> >
> >The "Laplace" method in lmer and the default method in glmm.admb,
> >which according to the documentation is the Laplace approximation,
> >produce essentially the same model fit.  One difference is the
> >reported value of the log-likelihood, which we should cross-check, and
> >another difference is in the execution time
> >
>
> Yes, glmmADMB has sqrt(2*pi) constants missing. Thanks for pointing that out.
>
> Execution time: As pointed out by Roel de Jong, the underlying software AD Model Builder
> does not use hand-coded derivatives for the Hessian involved in the Laplace approximation,
> but calculates these by automatic differentiation. There is a cost in terms of execution
> speed, but on the other hand it is very quick to develop new models, as you do not
> have to worry about derivatives. I hope to exploit this beyond standard GLMMs, and provide
> other R packages.
>
> Comparison of glmmADMB with lmer: I find that the two packages do not give the same
> result on one of the standard datasets in the literature (Lesaffre et al., Appl. Statist. (2001) 50, Part 3, pp. 325-335).
Ah yes, that example.  It is also given as the 'toenail' data set in
the 'mlmus' package of data sets from the book "Multilevel and
Longitudinal Modeling Using Stata" by Sophia Rabe-Hesketh and Anders
Skrondal (Stata Press, 2005).

It is not surprising that it is difficult to fit such a model to these
data because the data do not look like they come from such a model.
You did not include the estimates of the variance of the random
effects in your output.  It is very large and very poorly determined.
If you check the distribution of the posterior modes of the random
effects (for linear mixed models these are called the BLUPs - Best
Linear Unbiased Predictors - and you could call them BLUPs here too
except for the fact that they are not linear and they are not unbiased
and there isn't a clear sense in which they are "best") it is clearly
not a Gaussian distribution with mean zero.  I enclose a density plot.
 You can see that it is bimodal and the larger of the two peaks is for
a negative value.  These are the random effects for those subjects
that had no positive responses - 163 out of the 294 subjects.

> sum(with(lesaffre, tapply(y, subject, mean)) == 0)
[1] 163

There is no information to estimate the random effects for these
subjects other than "make it as large and negative as possible".  It
is pointless to estimate the fixed effects for such a clearly
inappropriate model.
The density plot was produced with the lattice package.
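In R, that check might look roughly like this (a sketch against the lme4/lattice of the time; assume ranef() returns the posterior modes of the random effects, named by grouping factor):

```r
library(lme4)
library(lattice)

fit <- lmer(y ~ treat + time + (1 | subject),
            data = lesaffre, family = binomial)
modes <- ranef(fit)$subject[[1]]      # posterior modes, one per subject
densityplot(~ modes,                  # bimodal: a large negative peak for
            xlab = "posterior mode")  # subjects with no positive response
```

The large negative peak corresponds to the 163 subjects whose responses are all zero.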

> The full set of R commands used to download data and fit the model is given at the end of this email.
>
> > fit_lmer_lapl <- lmer(y~ treat + time  + (1|subject),data=lesaffre,family=binomial,method="Laplace")
> Warning message:
> optim or nlminb returned message ERROR: ABNORMAL_TERMINATION_IN_LNSRCH
>  in: LMEopt(x = mer, value = cv)
>
> PART OF OUTPUT:
>
> Fixed effects:
>              Estimate Std. Error  z value Pr(>|z|)
> (Intercept) -0.626214   0.264996  -2.3631  0.01812 *
> treat       -0.304660   0.360866  -0.8442  0.39853
> time        -0.346605   0.026666 -12.9979  < 2e-16 ***
>
> The corresponding estimates with glmmADMB is:
>
> > fit_glmmADMB <- glmm.admb(y~ treat + time,random=~1,group="subject",data=lesaffre,family="binomial",link="logit")
>
> PART OF OUTPUT:
>
> Fixed effects:
>   Log-likelihood: -359.649
>   Formula: y ~ treat + time
> (Intercept)       treat        time
>    -2.33210    -0.68795    -0.46134
>
>
> So, the estimates of the fixed effects differ. lmer() does in fact produce a warning, and it appears that
> method="Laplace" and method="PQL" produce the same results.
>
>
> Best regards,
>
> hans
>
>
> # Load data
> source("http://www.mi.uib.no/~skaug/cash/lesaffre_dat.s")
>
> # Run lmer
> library(lme4)
> fit_lmer <- lmer(y~ treat + time + (1|subject),data=lesaffre,family=binomial)
> fit_lmer_lapl <- lmer(y~ treat + time  + (1|subject),data=lesaffre,family=binomial,method="Laplace")
>
>
> # Run glmmADMB
> library(glmmADMB)
> example(glmm.admb)      # Must be run once in each new directory (this feature will be removed in future version of glmmADMB).
> fit_glmmADMB <- glmm.admb(y~ treat + time,random=~1,group="subject",data=lesaffre,family="binomial",link="logit")
>
>
>
>
>


ranef.pdf (28K) Download Attachment

Re: glmmADMB: Generalized Linear Mixed Models using AD Model Builder

Roel de Jong
Well, the dataset which I sent in my previous message did without any
doubt come from a multilevel model (generated and fitted under the
binomial probit link), and gave the earlier posted error message while
fitting it with the latest version of lmer:

Warning: IRLS iterations for PQL did not converge
Error in objective(.par, ...) : Unable to invert singular factor of
downdated X'X

When data are generated from a specified model with reasonable parameter
values, it should be possible to fit such a model successfully, or is this
me being stupid? We could try other parameter values, link functions
and/or more cases per class if the ones given below are somehow implausible.

500 samples are drawn with the model specification (binomial probit):
y = (intercept*f1+pred2*f2+pred3*f3)+(intercept*ri+pred2*rs)
     where pred2 and pred3 are predictors distributed N(0,1)
     f1..f3 are fixed effects, f1=-1, f2=1.5, f3=0.5
     ri is random intercept with associated variance var_ri=0.2
     rs is random slope with associated variance var_rs=0.4
     the covariance between ri and rs "covr"=0.1

1500 units/dataset, class size=30
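In R, this data-generating process could be sketched as follows (names and the use of MASS::mvrnorm are mine; the original simulation code was not posted):

```r
library(MASS)  # for mvrnorm()

set.seed(1)
n.class <- 50; class.size <- 30            # 1500 units, class size 30
f <- c(-1, 1.5, 0.5)                       # fixed effects f1..f3
Sigma <- matrix(c(0.2, 0.1,                # var_ri = 0.2, covr = 0.1
                  0.1, 0.4), 2)            # var_rs = 0.4

class <- rep(1:n.class, each = class.size)
pred2 <- rnorm(n.class * class.size)
pred3 <- rnorm(n.class * class.size)
b <- mvrnorm(n.class, mu = c(0, 0), Sigma = Sigma)  # (ri, rs) per class

eta <- f[1] + f[2] * pred2 + f[3] * pred3 +
       b[class, 1] + b[class, 2] * pred2
y <- rbinom(length(eta), size = 1, prob = pnorm(eta))  # probit link
```

The corresponding fit would then be something like glmm.admb(y ~ pred2 + pred3, random = ~pred2, group = "class", family = "binomial", link = "probit").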

Best regards,
        Roel de Jong

Douglas Bates wrote:

> On 12/19/05, Hans Julius Skaug <[hidden email]> wrote:
>
>>Douglas Bates wrote:
>>
>>
>>>The "Laplace" method in lmer and the default method in glmm.admb,
>>>which according to the documentation is the Laplace approximation,
>>>produce essentially the same model fit.  One difference is the
>>>reported value of the log-likelihood, which we should cross-check, and
>>>another difference is in the execution time
>>>
>>
>>Yes, glmmADMB has sqrt(2*pi) constants missing. Thanks for pointing that out.
>>
>>Execution time: As pointed out by Roel de Jong, the underlying software AD Model Builder
>>does not use hand-coded derivatives for the Hessian involved in the Laplace approximation,
>>but calculates these by automatic differentiation. There is a cost in terms of execution
>>speed, but on the other hand it is very quick to develop new models, as you do not
>>have to worry about derivatives. I hope to exploit this beyond standard GLMMs, and provide
>>other R packages.
>>
>>Comparison of glmmADMB with lmer: I find that the two packages do not give the same
>>result on one of the standard datasets in the literature (Lesaffre et al., Appl. Statist. (2001) 50, Part 3, pp. 325-335).
>
>
> Ah yes, that example.  It is also given as the 'toenail' data set in
> the 'mlmus' package of data sets from the book "Multilevel and
> Longitudinal Modeling Using Stata" by Sophia Rabe-Hesketh and Anders
> Skrondal (Stata Press, 2005).
>
> It is not surprising that it is difficult to fit such a model to these
> data because the data do not look like they come from such a model.
> You did not include the estimates of the variance of the random
> effects in your output.  It is very large and very poorly determined.
> If you check the distribution of the posterior modes of the random
> effects (for linear mixed models these are called the BLUPs - Best
> Linear Unbiased Predictors - and you could call them BLUPs here too
> except for the fact that they are not linear and they are not unbiased
> and there isn't a clear sense in which they are "best") it is clearly
> not a Gaussian distribution with mean zero.  I enclose a density plot.
>  You can see that it is bimodal and the larger of the two peaks is for
> a negative value.  These are the random effects for those subjects
> that had no positive responses - 163 out of the 294 subjects.
>
>
>>sum(with(lesaffre, tapply(y, subject, mean)) == 0)
>
> [1] 163
>
> There is no information to estimate the random effects for these
> subjects other than "make it as large and negative as possible".  It
> is pointless to estimate the fixed effects for such a clearly
> inappropriate model.
> The density plot was produced with the lattice package.
>
>
>>The full set of R commands used to download data and fit the model is given at the end of this email.
>>
>>
>>>fit_lmer_lapl <- lmer(y~ treat + time  + (1|subject),data=lesaffre,family=binomial,method="Laplace")
>>
>>Warning message:
>>optim or nlminb returned message ERROR: ABNORMAL_TERMINATION_IN_LNSRCH
>> in: LMEopt(x = mer, value = cv)
>>
>>PART OF OUTPUT:
>>
>>Fixed effects:
>>             Estimate Std. Error  z value Pr(>|z|)
>>(Intercept) -0.626214   0.264996  -2.3631  0.01812 *
>>treat       -0.304660   0.360866  -0.8442  0.39853
>>time        -0.346605   0.026666 -12.9979  < 2e-16 ***
>>
>>The corresponding estimates with glmmADMB is:
>>
>>
>>>fit_glmmADMB <- glmm.admb(y~ treat + time,random=~1,group="subject",data=lesaffre,family="binomial",link="logit")
>>
>>PART OF OUTPUT:
>>
>>Fixed effects:
>>  Log-likelihood: -359.649
>>  Formula: y ~ treat + time
>>(Intercept)       treat        time
>>   -2.33210    -0.68795    -0.46134
>>
>>
>>So, the estimates of the fixed effects differ. lmer() does in fact produce a warning, and it appears that
>>method="Laplace" and method="PQL" produce the same results.
>>
>>
>>Best regards,
>>
>>hans
>>
>>
>># Load data
>>source("http://www.mi.uib.no/~skaug/cash/lesaffre_dat.s")
>>
>># Run lmer
>>library(lme4)
>>fit_lmer <- lmer(y~ treat + time + (1|subject),data=lesaffre,family=binomial)
>>fit_lmer_lapl <- lmer(y~ treat + time  + (1|subject),data=lesaffre,family=binomial,method="Laplace")
>>
>>
>># Run glmmADMB
>>library(glmmADMB)
>>example(glmm.admb)      # Must be run once in each new directory (this feature will be removed in future version of glmmADMB).
>>fit_glmmADMB <- glmm.admb(y~ treat + time,random=~1,group="subject",data=lesaffre,family="binomial",link="logit")
>>
>>
>>
>>
>>


Re: glmmADMB: Generalized Linear Mixed Models using AD Model Builder

Bert Gunter
May I interject a comment?

>
> When data is generated from a specified model with reasonable
> parameter
> values, it should be possible to fit such a model successfully,
> or is this
> me being stupid?

Let me take a turn at being stupid. Why should this be true? That is, why
should it be possible to easily fit a model that is generated (i.e. using a
pseudo-random number generator) from a perfectly well-defined model? For
example, I can easily generate simple linear models contaminated with
outliers that are quite difficult to fit (e.g. via resistant fitting
methods). In nonlinear fitting, it is quite easy to generate data from
overparameterized models that are quite difficult to fit or whose fit is
very sensitive to initial conditions. Remember: the design (for the
covariates) at which you fit the data must support the parameterization.

The most dramatic examples are probably of simple nonlinear model systems
with no noise which produce chaotic results when parameters are in certain
ranges. These would be totally impossible to recover from the "data."
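Bert's chaos point is easy to illustrate in R (example mine): with the logistic map at r = 4, two noise-free trajectories started 1e-10 apart agree early on but are essentially unrelated after a few dozen steps, so no fitting procedure could recover the initial condition from late observations.

```r
# Logistic map: x[i+1] = r * x[i] * (1 - x[i]); chaotic for r = 4
logistic.map <- function(x0, n, r = 4) {
  x <- numeric(n); x[1] <- x0
  for (i in 2:n) x[i] <- r * x[i - 1] * (1 - x[i - 1])
  x
}
a <- logistic.map(0.3,         60)
b <- logistic.map(0.3 + 1e-10, 60)
max(abs(a - b)[1:20])    # still tiny: the trajectories agree early on
max(abs(a - b)[40:60])   # order 1: all memory of x0 has been lost
```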

So I repeat: just because you can generate data from a simple model, why
should it be easy to fit the data and recover the model?

Cheers,

Bert Gunter
Genentech


Re: glmmADMB: Generalized Linear Mixed Models using AD Model Builder

Hans Skaug
In reply to this post by Hans Skaug
I agree that the model does not fit the Lesaffre data well, but my point was
to show that glmmADMB is numerically stable. Numerical
stability is obviously a nice property, but becomes particularly important
when one wants to do parametric bootstrapping, which I think is needed
for these kinds of models to assess bias in parameter estimates.

glmmADMB produces the exact parameter values that maximize the Laplace approximation
for this dataset. Another story is that the Laplace approximation
is inaccurate here, as can be shown by using other likelihood approximations.
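A parametric bootstrap along these lines could be sketched as follows (the simulator `simulate.from.fit` is a placeholder for however one draws new responses from the fitted model, and `$b` stands for the fitted fixed-effect vector; both names are mine):

```r
# Parametric bootstrap sketch: simulate from the fitted model, refit,
# and compare bootstrap means with the original estimates to assess bias.
boot.fixef <- replicate(200, {
  d <- lesaffre
  d$y <- simulate.from.fit(fit_glmmADMB)     # hypothetical simulator
  refit <- glmm.admb(y ~ treat + time, random = ~1, group = "subject",
                     data = d, family = "binomial", link = "logit")
  refit$b                                    # fixed-effect estimates
})
rowMeans(boot.fixef) - fit_glmmADMB$b        # bootstrap estimate of bias
```

Numerical stability matters here because a single failed refit out of 200 replicates breaks the whole loop.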


hans

Douglas Bates wrote:

>Ah yes, that example.  It is also given as the 'toenail' data set in
>the 'mlmus' package of data sets from the book "Multilevel and
>Longitudinal Modeling Using Stata" by Sophia Rabe-Hesketh and Anders
>Skrondal (Stata Press, 2005).
>
>It is not surprising that it is difficult to fit such a model to these
>data because the data do not look like they come from such a model.
>You did not include the estimates of the variance of the random
>effects in your output.  It is very large and very poorly determined.
>If you check the distribution of the posterior modes of the random
>effects (for linear mixed models these are called the BLUPs - Best
>Linear Unbiased Predictors - and you could call them BLUPs here too
>except for the fact that they are not linear and they are not unbiased
>and there isn't a clear sense in which they are "best") it is clearly
>not a Gaussian distribution with mean zero.  I enclose a density plot.
> You can see that it is bimodal and the larger of the two peaks is for
>a negative value.  These are the random effects for those subjects
>that had no positive responses - 163 out of the 294 subjects.
>
>> sum(with(lesaffre, tapply(y, subject, mean)) == 0)
>[1] 163
>
>There is no information to estimate the random effects for these
>subjects other than "make it as large and negative as possible".  It
>is pointless to estimate the fixed effects for such a clearly
>inappropriate model.
>The density plot was produced with the lattice package.
>


Re: glmmADMB: Generalized Linear Mixed Models using AD Model Builder

Roel de Jong
In reply to this post by Bert Gunter
Of course it is generally possible to generate datasets for a perfectly
well-defined model that are hard to fit, but in this particular case I
feel it should be possible. In my experience, glmm.admb is far more
numerically stable in fitting GLMMs than other software I've seen.
Further, I don't think the data I generated come from a model that is
overparameterized, severely contaminated with outliers, has no noise, or
is nonlinear. But I encourage anyone to run a simulation study with
generated data they think are acceptable and compare the robustness of
several methods. I leave it at this.

Best regards,
        Roel de Jong

Berton Gunter wrote:

> May I interject a comment?
>
>
>>When data is generated from a specified model with reasonable
>>parameter
>>values, it should be possible to fit such a model successfully,
>>or is this
>>me being stupid?
>
>
> Let me take a turn at being stupid. Why should this be true? That is, why
> should it be possible to easily fit a model that is generated ( i.e. using a
> pseudo-random number generator) from a perfectly well-defined model? For
> example, I can easily generate simple linear models contaminated with
> outliers that are quite difficult to fit (e.g. via resistant fitting
> methods). In nonlinear fitting, it is quite easy to generate data from
> overparameterized models that are quite difficult to fit or whose fit is
> very sensitive to initial conditions. Remember: the design (for the
> covariates) at which you fit the data must support the parameterization.
>
> The most dramatic examples are probably of simple nonlinear model systems
> with no noise which produce chaotic results when parameters are in certain
> ranges. These would be totally impossible to recover from the "data."
>
> So I repeat: just because you can generate data from a simple model, why
> should it be easy to fit the data and recover the model?
>
> Cheers,
>
> Bert Gunter
> Genentech
>
>


Re: glmmADMB: Generalized Linear Mixed Models using AD Model Builder

Spencer Graves
I get upset when software dies and refuses to give me an answer. I'd
much rather have a routine give me a wrong answer -- with an error
message -- than just an error message. Maybe refuse to print standard
errors when the Hessian is singular, but at least give me a progress
report along with the singular Hessian. Without that, I have to program
"optim" or something else separately to get the answers and the Hessian
in order to do my own diagnosis -- if I know enough to do that.
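That do-it-yourself diagnosis might look like this (a sketch only; `nll` and `start.values` stand for a user-supplied negative log-likelihood function and starting vector):

```r
# Maximize the likelihood via optim() and inspect the Hessian before
# trusting standard errors.
fit <- optim(start.values, nll, method = "BFGS", hessian = TRUE)
ev <- eigen(fit$hessian, symmetric = TRUE, only.values = TRUE)$values
if (min(ev) < 1e-8 * max(ev)) {
  warning("Hessian is (near-)singular; standard errors not reported")
  print(fit$par)                        # still report the point estimates
} else {
  se <- sqrt(diag(solve(fit$hessian)))  # asymptotic standard errors
  print(cbind(estimate = fit$par, se = se))
}
```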

          Just my 0.02 Euros.
          spencer graves

Roel de Jong wrote:

> Of course it is generally possible to generate datasets for a perfectly
> well-defined model that are hard to fit, but in this particular case I
> feel it should be possible. In my observations, glmm.admb is far more
> numerically stable in fitting GLMMs than other software I've seen. Further,
> I don't think the data I generated come from a model that is
> overparameterized, severely contaminated with outliers, has no noise, or
> is nonlinear. But I encourage anyone to run a simulation study with
> generated data they think are acceptable and compare the robustness of
> several methods. I leave it at this.
>
> Best regards,
> Roel de Jong
>
> Berton Gunter wrote:
>
>>May I interject a comment?
>>
>>
>>
>>>When data is generated from a specified model with reasonable
>>>parameter
>>>values, it should be possible to fit such a model successfully,
>>>or is this
>>>me being stupid?
>>
>>
>>Let me take a turn at being stupid. Why should this be true? That is, why
>>should it be possible to easily fit a model that is generated ( i.e. using a
>>pseudo-random number generator) from a perfectly well-defined model? For
>>example, I can easily generate simple linear models contaminated with
>>outliers that are quite difficult to fit (e.g. via resistant fitting
>>methods). In nonlinear fitting, it is quite easy to generate data from
>>overparameterized models that are quite difficult to fit or whose fit is
>>very sensitive to initial conditions. Remember: the design (for the
>>covariates) at which you fit the data must support the parameterization.
>>
>>The most dramatic examples are probably of simple nonlinear model systems
>>with no noise which produce chaotic results when parameters are in certain
>>ranges. These would be totally impossible to recover from the "data."
>>
>>So I repeat: just because you can generate data from a simple model, why
>>should it be easy to fit the data and recover the model?
>>
>>Cheers,
>>
>>Bert Gunter
>>Genentech
>>
>>
>
>
> ______________________________________________
> [hidden email] mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html

--
Spencer Graves, PhD
Senior Development Engineer
PDF Solutions, Inc.
333 West San Carlos Street Suite 700
San Jose, CA 95110, USA

[hidden email]
www.pdf.com
Tel:  408-938-4420
Fax: 408-280-7915
