M1 and M2 are extreme in that all or none of the variables have
parallel lines on the logit scale. One can try fitting a partial
POM, which remains fraught (but not as much as M2) because if
the lines intersect for a particular variable where the data lie
then there will be numerical problems.
e.g., via family = cumulative(parallel = FALSE ~ x3), which allows
nonparallelism only with respect to the x3 variable.
Another idea, if the fit crashes, is to try transforming each x
variable; that might help.
BTW, you can use lrtest(), e.g., lrtest(M3, M2).
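To make the comparison concrete, here is a minimal self-contained sketch on simulated data (the data-generating step and the predictor names are illustrative, not from the original post; it assumes the VGAM package is installed, and the partial-POM constraint syntax follows VGAM's formula interface for the parallel argument):

```r
library(VGAM)

## Simulate ordinal data with three predictors (illustrative only)
set.seed(1)
n  <- 500
x1 <- rbinom(n, 1, 0.5)   # dichotomous predictor
x2 <- rnorm(n)            # z-standardized metric predictor
x3 <- rnorm(n)
eta <- 0.8 * x1 + 0.5 * x2 - 0.3 * x3
y   <- cut(eta + rlogis(n), breaks = c(-Inf, -1, 0, 1, Inf), labels = FALSE)
dat <- data.frame(y = ordered(y), x1, x2, x3)

## M1: proportional odds model (all slopes parallel across cutpoints)
M1 <- vglm(y ~ x1 + x2 + x3, family = cumulative(parallel = TRUE), data = dat)

## M2: fully non-parallel model (a separate slope per cutpoint)
M2 <- vglm(y ~ x1 + x2 + x3, family = cumulative, data = dat, maxit = 100)

## M3: partial POM -- nonparallel only with respect to x3
M3 <- vglm(y ~ x1 + x2 + x3,
           family = cumulative(parallel = FALSE ~ x3), data = dat)

## Likelihood-ratio tests of the parallelism assumption
lrtest(M2, M1)   # global test of proportional odds
lrtest(M2, M3)   # does nonparallelism beyond x3 add anything?
```

Since M1 is nested in M3, which is nested in M2, the log-likelihoods must be ordered accordingly whenever all three fits converge; a significant lrtest() result then suggests the more restrictive parallelism constraint is too strong.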
>I want to test whether the proportional odds assumption for an ordered
>regression is met.
>The UCLA website points out that there is no mathematical way to test the
>proportional odds assumption and suggests graphical inspection ("We were
>unable to locate a facility in R to perform any of the tests commonly used
>to test the parallel slopes assumption").
>However, I found a pdf by Agresti suggesting a method using the vglm()
>function; the pdf is called "Examples of Using R for Modeling Ordinal Data".
>> M1 <- vglm(y ~ x1 + x2 + x3, data=data, family=cumulative(parallel=TRUE))
>> M2 <- vglm(y ~ x1 + x2 + x3, data=data, family=cumulative, maxit=100)
>If the test is significant, the proportional odds assumption is (might be)
>violated.
>However, running this procedure in my dataset with 5 predictors (3
>dichotomous, 2 z-standardized metric) and an ordinal dependent variable
>(0,1,2,3) in a sample of N=2500 leads to various problems, maybe you can
>help me out how to solve these.
>(1) Error in dotC(name = "tapplymat1", mat = as.double(mat), :
>    NA/NaN/Inf in foreign function call (arg 1)
>(2) In matrix.power(wz, M = M, power = 0.5, fast = TRUE) :
> Some weight matrices have negative eigenvalues. They
>will be assigned NAs
>(3) In Deviance.categorical.data.vgam(mu = mu, y = y, w = w, residuals = :
>    fitted values close to 0 or 1
>(4) In log(prob) : NaNs produced
>The last two don't seem to be as critical as the first two, seeing that
>models with those errors at least provide a p-value (the first two errors
>lead to a p-value of 1).