Difference between Traditional Regression and Partial Least Square


Wong Chun Kit
Dear R-Helpers,

I have a data set and have fitted both a traditional (ordinary least squares)
regression and a partial least squares regression to it, as shown below:

> summary(lm(y ~ x10 + x11 + x12 + x13 + x14 + x15 + x16, data = X))

Coefficients:
              Estimate Std. Error t value Pr(>|t|)    
(Intercept) -0.4762538  0.0252618 -18.853  < 2e-16 ***
x10          0.2825081  0.0962377   2.936  0.00348 **
x11          0.0487763  0.1222990   0.399  0.69019    
x12          0.0189079  0.1200368   0.158  0.87490    
x13          0.0957643  0.1236650   0.774  0.43907    
x14         -0.2028041  0.1243989  -1.630  0.10367    
x15         -0.0005613  0.1255884  -0.004  0.99644    
x16          0.0815347  0.0837342   0.974  0.33066  


> library(pls)
> coef(plsr(y ~ x10 + x11 + x12 + x13 + x14 + x15 + x16, ncomp = 7, data = X))

                y
x10  0.2825080818
x11  0.0487762894
x12  0.0189078718
x13  0.0957643290
x14 -0.2028040503
x15 -0.0005613228
x16  0.0815347421

I checked, and the estimated coefficients are identical. What, then, is the
difference between lm and plsr?
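For context, the coefficients coincide here because plsr was fitted with ncomp equal to the number of predictors: full-rank PLS spans the same column space as OLS, so the fits agree; the methods only diverge when fewer components are used. A minimal sketch of that comparison, using simulated data (a hypothetical stand-in for the real X) and the pls package:

```r
# Sketch: full-rank PLS reproduces OLS; fewer components give different
# (shrunken) coefficients. Data below are simulated, not the original X.
library(pls)

set.seed(1)
X <- data.frame(matrix(rnorm(100 * 7), ncol = 7))
names(X) <- paste0("x", 10:16)
X$y <- rnorm(100)

fit_lm  <- lm(y ~ x10 + x11 + x12 + x13 + x14 + x15 + x16, data = X)
fit_pls <- plsr(y ~ x10 + x11 + x12 + x13 + x14 + x15 + x16,
                ncomp = 7, data = X)

# With all 7 components, PLS matches OLS up to numerical precision:
all.equal(unname(coef(fit_lm)[-1]),
          unname(drop(coef(fit_pls, ncomp = 7))))

# With fewer components the coefficients differ from the OLS estimates:
drop(coef(fit_pls, ncomp = 2))
```

The practical difference, then, shows up when ncomp is chosen smaller than the number of predictors (e.g. by cross-validation), which is the usual way plsr is used.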

Thanks in advance.

______________________________________________
[hidden email] mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html