problem with nls....

problem with nls....

akshay kulkarni
dear members,
                            I have the following nls call:

> HF53nl <- nls(HF1 ~ ((m/HF6) + 1),data = data.frame(HF6,HF1),start = list(m = 0.1))
> overview(HF53nl)

------
Formula: HF1 ~ ((m/HF6) + 1)

Parameters:
   Estimate Std. Error t value Pr(>|t|)
m 2.147e-07  1.852e-06   0.116    0.908

Residual standard error: 0.03596 on 799 degrees of freedom

Number of iterations to convergence: 1
Achieved convergence tolerance: 1.246e-06

------
Residual sum of squares: 1.03

------
t-based confidence interval:
           2.5%        97.5%
1 -3.420983e-06 3.850292e-06

------
Correlation matrix:
  m
m 1

The scatter plot of HF6 against HF1, together with the fitted line from the above nls output, is attached (HF53nl). The fitted line is almost straight, but it should be a curve of the form y ~ 1/x. I think the very small value of m is flattening the curve into a straight line.

But the fitted curve from the following call makes sense (attached: HF43nl):

> HF43nl <- nls(HF1 ~ ((k/HF5) + 1),data = data.frame(HF5,HF1),start = list(k = 0.1))
> overview(HF43nl)

------
Formula: HF1 ~ ((k/HF5) + 1)

Parameters:
    Estimate Std. Error t value Pr(>|t|)
k -5.367e-04  5.076e-05  -10.57   <2e-16 ***
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 0.03368 on 799 degrees of freedom

Number of iterations to convergence: 1
Achieved convergence tolerance: 3.076e-07

------
Residual sum of squares: 0.906

------
t-based confidence interval:
           2.5%         97.5%
1 -0.0006363717 -0.0004370954

------
Correlation matrix:
  k
k 1

The queer thing is that the RSS for HF53nl and HF43nl is almost the same, which points to the purported validity of HF53nl. How is this possible? Can I go with the above estimate for the coefficient m of HF6, i.e. 2.147 * 10^(-7)? How do I make an nls call so that there is a better fit of HF1 to HF6?

NB: If you can't access the attached graphs, how do I send them to you otherwise? I can also give you HF1, HF6, and HF5 if needed....

very many thanks for your time and effort....
yours sincerely,

AKSHAY M KULKARNI

Attachments: HF53nl.png (14K), HF43nl.png (15K)

Fw: problem with nls....

akshay kulkarni
dear members,
                            On closer inspection, I can see that the scatterplot of HF1 against HF5 is of the form y ~ -(1/x), while that of HF1 against HF6 is of the form y ~ (1/x). Is it possible that HF43nl is converging almost by chance? I mean, for HF53nl, a straight line minimizes the RSS rather than a curve like y ~ (1/x). Is that possible? If so, should I model it linearly rather than nonlinearly? That is unsettling (a linear model would always give the wrong prediction for a given predictor!). Or should I use piecewise nonlinear regression (for HF6 < 0 and HF6 > 0), something like the sketch below?
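A minimal sketch of that piecewise idea (assuming HF1 and HF6 are in
the workspace; the split at HF6 = 0, the starting values, and the
object names are illustrative assumptions):

neg <- HF6 < 0
# fit each branch of the hyperbola separately
HF53neg <- nls(HF1 ~ (m/HF6) + 1, data = data.frame(HF6, HF1)[neg, ],
               start = list(m = 0.1))
HF53pos <- nls(HF1 ~ (m/HF6) + 1, data = data.frame(HF6, HF1)[!neg, ],
               start = list(m = 0.1))
coef(HF53neg); coef(HF53pos)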

very many thanks for your time and effort....
yours sincerely,
AKSHAY M KULKARNI


Re: problem with nls....

Ivan Krylov
In reply to this post by akshay kulkarni
One of the assumptions made by the least squares method is that the
residuals are independent and normally distributed with the same
parameters (or, in the case of weighted regression, that the standard
deviation of the residual is known for every point). If this is the
case, the parameters that minimize the sum of squared residuals are the
maximum likelihood estimates of the true parameter values.
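In symbols, a sketch of the standard argument (writing r_i(\theta) for
the i-th residual and \sigma for the common standard deviation):

\log L(\theta) = -\frac{n}{2}\log(2\pi\sigma^2)
                 - \frac{1}{2\sigma^2}\sum_{i=1}^{n} r_i(\theta)^2

so maximizing the likelihood over \theta is the same as minimizing the
sum of squared residuals.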

The problem is, your data doesn't seem to adhere well to your formula.
Have you tried plotting HF1 - ((m/HF6) + 1) against HF6 (i.e. the
residuals themselves)? With large residual values (outliers?), the loss
function (i.e. the sum of squared residuals) is disturbed and doesn't
reflect the values you would expect to get otherwise. Try computing
sum((HF1 - ((m/HF6) + 1))^2) for different values of m and see if
changing m makes any difference.
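For instance, a minimal sketch of that check (assuming HF1 and HF6 are
in your workspace; the grid of m values is an assumption, centred near
the nls estimate):

rss <- function(m) sum((HF1 - ((m/HF6) + 1))^2)
ms <- seq(-1e-05, 1e-05, length.out = 201)
plot(ms, sapply(ms, rss), type = "l", xlab = "m", ylab = "RSS(m)")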

Try looking up "robust regression" (e.g. minimizing the sum of absolute
residuals instead of squared residuals; a unique solution is not
guaranteed, but it will be less disturbed by outliers).

--
Best regards,
Ivan

Re: problem with nls....

akshay kulkarni
dear Ivan,
                   I've not gone into residual analysis, but my observation is simple: I've checked the histograms of both HF5 and HF6, and there is not much difference. Also, I've replaced all outliers.
HF1 ~ (m/HF5) + 1 is being fitted properly, but not HF1 ~ (m/HF6) + 1.
                    The following are the actual values:

> HF1
Time Series:
Start = 1
End = 800
Frequency = 1
  [1] 1.0319256 0.9842066 1.0098243 1.0446384 0.9177308 1.0060822 0.9609599 1.0374124 1.0139675 0.9973329 0.9559346 0.9848896
 [13] 0.9749513 1.0511627 0.9789968 1.0964832 0.9879833 0.9549759 0.9787043 1.0203225 0.9947078 0.9813439 1.0138056 0.9670097
 [25] 0.9711946 0.9873085 1.0858024 1.0394149 0.9766102 0.9689002 1.0097453 1.0235376 0.9873976 0.9705998 1.0356838 1.0165155
 [37] 0.9855907 1.0757638 1.0072182 1.0280799 0.9281543 0.9587241 1.1086856 1.0446199 1.0158398 0.9529567 1.0610853 0.9976204
 [49] 0.9575143 0.9803208 1.1238821 1.0118991 1.0112989 0.9415333 1.0424331 0.9912462 1.0106361 0.9802978 1.0108935 1.0159902
 [61] 0.9892313 0.9438749 1.0118004 0.9953912 0.9175923 0.9479009 1.0235502 1.0060517 0.9890903 0.9885812 0.9900430 1.0350717
 [73] 1.0108698 1.0468498 1.0656555 1.0436655 0.9908752 0.9751098 1.0163194 0.9851445 0.9710072 0.9885114 1.0109649 1.0490736
 [85] 0.9795251 1.0108749 1.0029784 1.0149087 0.9965277 0.9893746 0.9917926 1.0115123 1.0472170 1.0437206 1.0139089 1.0372349
 [97] 1.0038352 0.9586151 1.0085806 1.0119048 1.0118624 0.9896469 1.0272961 1.0172400 1.0134005 0.9757968 0.9717420 1.0269058
[109] 1.0114416 0.9512890 1.0181753 1.0565599 1.0376291 0.9865798 1.0212159 1.0701965 1.0324734 0.9899814 0.9973403 1.0172419
[121] 1.0020050 0.9889063 1.0129236 1.0277797 0.9826509 0.9922282 1.0988522 1.0275115 1.0183555 0.9774303 1.0172997 1.0150803
[133] 0.9685015 0.9924186 0.9937192 1.0072210 0.9673327 1.0473338 1.0562761 0.9707440 0.9771936 0.9883559 1.0208805 0.9894798
[145] 1.0694593 0.9754638 1.0383527 1.0013232 0.9863309 0.8778824 1.0157532 1.0438316 1.0000022 0.9740199 1.0305441 1.0275372
[157] 0.9723386 0.9954525 1.0046082 0.9531964 0.9768512 0.9899314 1.0496263 1.0546074 0.9616430 1.0210772 0.9901334 1.0689765
[169] 1.0154938 0.8765444 0.9919604 1.0082690 0.9860675 0.9823378 0.9897682 1.0363582 0.9805102 0.9723787 1.0741545 1.0290322
[181] 0.9760903 0.9850951 1.0500385 0.9774908 0.9861186 0.9898369 0.9941887 1.0097938 1.0187774 1.0591694 1.0270933 1.0466363
[193] 1.0000043 0.9815685 1.0238718 0.9740055 0.9717232 1.0251001 0.9946316 1.0075567 0.9751129 0.9871612 1.0643235 1.0075491
[205] 0.9888058 0.9396797 1.0068366 0.9962325 1.0455487 1.0442334 1.0103938 1.0236919 0.9852552 0.9767037 1.0063593 1.0518584
[217] 0.9705860 0.9718808 1.0178662 1.0414515 0.9883699 0.9860597 1.0394941 1.0103630 0.9082023 0.9889798 0.9646139 1.0052705
[229] 0.9688456 1.0559528 1.0401153 0.9785603 1.0169463 0.9929363 0.9812825 0.9302532 1.0272447 1.0644704 1.0201468 1.0248872
[241] 0.9587034 0.9884793 1.0065787 1.0568458 1.0167972 0.9702934 1.0233577 1.0052691 0.9690838 0.9900543 1.0171212 1.0093782
[253] 0.9518359 0.8953816 1.1180924 1.0126421 0.9847542 0.9731075 0.9906067 1.0191311 0.9757062 0.9819144 1.0392988 1.0358210
[265] 0.9842700 1.0057314 1.0206313 1.0088607 0.9779384 0.9860996 0.9894232 1.0180867 1.0060215 0.9419578 1.0604701 1.0186874
[277] 0.9824626 0.9303484 1.0491317 1.0204767 0.9892820 0.9971268 1.0322837 1.0435960 1.0123649 0.9791956 0.9880841 1.0203823
[289] 0.9696436 0.9769832 1.0704628 1.0230000 0.9665417 0.8624573 1.0152342 1.0538081 0.9885551 0.9605257 1.0196322 1.0135050
[301] 1.0420189 0.9875982 1.0228686 1.0224319 0.9778704 0.9912653 1.0116106 1.0226598 0.9387455 0.9717815 1.0122788 0.9889690
[313] 1.0232488 1.0276606 1.0173681 1.0159885 0.9877074 0.9838069 1.0374707 1.0152624 0.9789677 0.9612178 1.0192874 1.0644549
[325] 0.9715407 0.9787567 0.9925342 0.9790322 0.9777879 0.9680505 1.0224064 1.0348370 0.9875051 0.9457753 0.9914921 0.9591109
[337] 0.9629202 0.9995519 1.0136481 1.0221348 1.0148608 0.9912785 1.0439862 1.0330749 0.9762325 0.9983923 0.9348918 1.0227065
[349] 0.9794121 0.9733227 1.0082373 1.0421889 0.9767361 0.9726911 1.0100370 0.9921361 0.9861159 0.9749961 1.0594331 1.0806732
[361] 1.0276992 1.0329190 1.0686383 1.0466639 0.9740776 0.9672371 1.0128714 0.9934691 0.9582222 0.9332858 1.0029784 1.0250300
[373] 1.0059249 0.9999445 1.0082015 1.0252359 0.9760324 0.9493543 0.9996351 1.0116540 0.9675301 0.9470141 1.0127507 1.0112527
[385] 0.9766712 0.9703953 1.0592567 1.0360448 0.9790881 0.9680051 0.9711350 1.0049626 0.9738689 0.9819661 1.0835125 0.9765333
[397] 0.9138484 1.0220322 1.0465788 1.0065803 1.0273082 0.9838126 1.0151329 1.0146824 0.9452442 0.9489901 0.9921946 1.0101152
[409] 0.9730738 0.9354592 0.9542558 0.9681532 0.9792620 1.0352246 1.0426173 1.0180344 0.9576323 0.9533448 0.9846387 1.0261479
[421] 0.9453757 0.9455791 1.0691109 1.0084141 0.9844405 0.9537970 1.0118840 1.0094733 1.1493009 0.9922558 0.9941628 1.0290179
[433] 1.0020050 0.9971342 1.0436267 1.0726863 1.0925811 1.1072580 1.0390200 1.0376942 1.0302470 0.9838505 1.0420336 0.9793092
[445] 0.9850191 1.0196805 1.0065491 1.0158645 1.0117730 0.9406381 1.0097070 0.9870108 0.9818856 1.0040046 0.9712323 0.9951345
[457] 1.0199816 1.0551752 1.0112867 1.0763534 1.0253155 1.0029784 1.0251464 1.0814414 0.9987183 0.9771628 0.9726044 1.0482059
[469] 1.0020050 0.8931139 1.0367775 1.0260033 0.9728766 1.0225689 0.9908196 1.0068729 0.9912127 0.9931128 1.0158280 1.0433496
[481] 1.0203120 1.0085496 0.9812741 1.0615742 1.0119223 0.9849236 0.9992032 0.9879929 0.9000571 0.9891419 1.0345521 1.0381184
[493] 0.9886766 0.9574869 1.0149106 1.0294410 0.9882982 1.0244778 0.9812230 1.0082813 0.9664091 1.0283733 1.0124268 0.9992115
[505] 0.9872004 0.9884649 1.0386713 0.9763343 0.9597727 0.9567414 1.0086152 1.0165768 0.9848861 0.9620526 1.0123326 1.0447678
[517] 0.9934084 0.9669690 1.0360421 0.9829837 0.9761610 0.9708850 1.0014170 1.0195497 0.9806560 0.9757284 1.0251931 1.0116233
[529] 0.9868054 0.9756085 1.0303624 1.0077517 1.0505017 0.9414114 1.0124536 1.0131595 0.9638660 0.9887363 1.0132553 1.0052792
[541] 0.9820370 0.9460134 1.0125483 1.0426700 0.9818528 0.9762532 0.9582658 0.9814603 0.9618717 0.9615659 0.9496436 0.9877108
[553] 0.9999971 1.0284677 1.0106125 1.0031898 0.9793703 0.9486161 1.0226473 1.0236002 0.9538295 0.9689285 1.0313897 1.0212912
[565] 0.9505638 0.9921170 1.0130086 1.0419494 1.0000323 0.9607922 1.0211809 1.0424671 0.9795343 0.9497697 1.0231071 1.0142700
[577] 0.9765539 0.9492815 1.0267628 1.0135138 0.9885966 0.9529603 1.0264062 1.0249176 0.9872525 0.9849608 0.9986306 1.0437033
[589] 1.0041780 0.9931204 1.0329029 0.9939742 0.9459785 0.9629758 0.9456565 0.9836949 0.9754926 0.9976241 1.0232742 1.0050830
[601] 0.9481952 0.9854969 1.0352188 1.0337062 0.9892019 0.9554122 1.0189333 0.9793607 0.9899167 0.9503345 1.0117583 1.0371750
[613] 1.0070349 0.9804208 1.0500940 1.0107281 1.0698735 0.9881469 1.0565684 1.0179031 0.9856278 1.0314952 1.0720689 1.0011222
[625] 0.9743944 1.0034468 0.9824861 1.0192735 0.9991494 0.9842630 1.0060971 1.0294506 0.9695057 0.9725408 1.0227924 1.0088150
[637] 0.9765886 0.9889828 1.0108903 1.0068109 0.9905286 0.9517037 1.0527706 1.0257783 0.9932039 1.0121870 1.0506565 0.9816386
[649] 0.9843450 0.9552800 1.0124886 1.0332463 1.0021401 0.9885442 1.0136001 1.0381933 0.9594773 1.0679251 0.9653448 0.9997715
[661] 0.9890589 0.9658054 1.0079124 1.1292276 0.9873225 0.9730770 1.0699042 1.0174021 1.0041981 1.0232245 1.0389181 0.9720513
[673] 0.8686271 0.9915428 0.9606290 1.0482094 0.9898013 0.9510998 0.9602020 0.9976802 1.1427011 0.9917742 0.9770992 0.8638270
[685] 0.9991782 1.0455336 1.1043633 1.0489159 1.0029784 0.9906192 1.0307161 1.0182152 0.9677313 1.0090984 0.9851279 0.9596324
[697] 0.9743092 0.9748568 1.0206321 1.0517142 0.9876535 0.9732838 1.0656093 1.0603864 0.9980164 0.9795437 0.9746766 0.9784871
[709] 0.9746066 1.0484975 1.0228157 1.0165735 0.9785301 1.0322862 1.0303562 1.0203352 0.9606113 1.0674109 1.0051598 1.0095761
[721] 1.0138837 0.9862772 1.0173451 0.9879873 0.9761662 0.9828150 0.9839169 0.9887962 0.9474475 0.9786754 1.0405266 1.0246702
[733] 0.9764242 0.9782060 1.0004626 1.0653315 1.1480925 0.9567859 1.0410088 1.0246378 1.0025964 0.9894414 1.0146759 1.0449204
[745] 0.9917509 0.9706269 1.0199806 1.0044524 0.9942750 1.0145927 0.9917488 1.0314604 0.9495737 1.0005564 0.9972033 0.9849848
[757] 0.9741118 0.9693319 1.0061280 0.9892915 0.9944768 1.0101943 1.0545997 1.0044063 1.0020050 1.0127975 1.0164313 1.0285558
[769] 1.0043574 0.9854983 1.0122655 1.0123857 0.9879603 0.9734764 0.9995228 1.0315182 0.9564373 1.0543879 1.0099970 0.9987432
[781] 0.9580883 0.9724853 1.0167722 1.0102822 0.9629902 0.9908875 0.9838395 0.9733901 1.0207349 0.9848377 1.0633785 1.0312998
[793] 1.0316422 1.0335433 0.9890110 1.0334082 0.9915590 0.9909167 1.0208474 0.9899497

> HF6
Time Series:
Start = 1
End = 800
Frequency = 1
  [1]  9.703261e-02 -3.302060e-01  5.100922e+00  1.932550e+00 -1.386912e-01  1.482268e-02 -1.137384e+00  3.732522e-01  2.506729e-01
 [10] -2.919045e-01 -6.675508e-02 -1.267444e+00 -4.271286e-01  1.539651e-01 -1.424168e-01  2.632788e-01 -6.013491e-02 -5.743224e-02
 [19] -1.955379e-01  7.423308e-01 -3.041726e-03 -2.667225e-02  2.409421e-01 -4.339732e-02 -2.372542e-01 -2.194143e-01  2.712374e-01
 [28]  1.764577e+00 -1.583502e-01 -1.558412e-01  4.859185e+00  6.595212e-02 -6.227563e-02 -3.663468e-02  9.338089e-01  1.165410e+00
 [37] -3.776054e-02  1.015936e+01  4.269841e+00  8.659153e-01 -1.045996e+00 -8.061952e-01  2.627137e-01  1.023131e-01  2.757644e-01
 [46] -6.199723e-02  1.466399e-01 -3.353696e-01 -2.881873e-01 -1.560865e-01  2.946743e-01  1.825263e-01  6.075510e-01 -7.659018e-02
 [55]  9.332004e-02 -7.924914e-01 -2.995696e+00 -2.625424e-01  6.959834e+00  2.882190e-01 -5.555718e-02 -3.191530e+00 -2.894247e+00
 [64] -7.495410e-01 -5.698178e-01 -2.920025e-01  7.262345e-02  6.955618e-01 -7.509777e-01 -3.111461e-02 -1.757717e+00  9.583333e-02
 [73]  2.022944e-01  1.481875e-01  3.709509e-01  3.297667e+00 -1.679928e-02 -6.633111e-01  3.081464e-01 -1.522342e-01 -2.697393e-01
 [82] -2.474069e-01  1.267182e+00  2.990766e-01 -1.483910e-01  1.851073e-02 -3.320246e+00  5.365467e-01  3.685251e-02 -5.869044e-02
 [91] -5.304953e-01  8.510204e-02 -1.943394e+00  6.796528e-01  8.707915e+00  5.339946e-01  3.334323e-01 -5.567989e-01  1.741750e-01
[100]  3.974109e-01 -1.180250e-01 -3.248193e-01  2.839601e-01  8.396776e-01  1.587400e+00 -1.052848e-01 -5.427561e-02  1.308345e+00
[109] -4.321102e+00 -2.114642e+00  2.545551e-01  2.608206e-01  2.468002e-01 -3.503397e-01  8.657229e-02  4.993098e-01  2.432785e+00
[118] -1.896142e-02 -2.014234e+00  2.029458e+00  6.079714e+00 -2.764164e-01  2.669853e-01  3.423891e+00 -5.324067e-01 -1.615363e-02
[127]  2.728479e+00  2.063365e+00  3.873700e-01 -9.717373e-01  2.802471e-01  3.221953e+00 -1.380415e+00 -2.251014e-01 -9.367013e-01
[136]  1.453974e-01 -9.212878e-01  6.660146e-01  2.698844e-01 -2.378487e-01 -1.841615e-01 -7.505472e-01  2.545551e-01 -1.904946e-02
[145]  2.825536e-01 -1.849939e-01  3.591260e-01  3.743418e-01 -2.778478e+00 -1.329060e+00  3.160122e-01  4.643313e-01  5.750524e-05
[154] -6.072878e-01  2.644429e-01  1.874244e+00 -2.695451e-01 -9.715273e-03  3.494761e-01 -9.281908e-02 -1.818026e-01 -3.065760e+00
[163]  1.745485e-01  4.058502e-01 -7.937648e-02  4.082885e-01 -5.328007e-02  5.173842e-01  3.014029e-01 -1.332769e+00 -1.525841e+00
[172]  1.278083e+00 -2.592115e+00 -5.447981e-02 -5.511966e-02  1.499697e-01 -7.537936e-01 -6.736513e-01  2.502264e-01  1.421474e+00
[181] -1.908278e-01 -2.629398e+00  3.030101e-01 -5.162059e-01 -2.154668e-01  1.774540e-03 -8.088480e-01  5.501430e-01  5.268684e-02
[190]  2.180616e-01  5.120812e-01  2.823400e-01  1.174173e-04 -3.871419e-02  3.158028e+00 -1.044852e+00 -2.686278e-01  7.454716e-01
[199]  7.658868e+00  1.125989e+00 -1.923856e-01 -2.441550e+00  5.024290e-01  2.290590e+00 -5.988608e-02 -7.947542e-01  4.383090e-01
[208] -6.176262e-03  2.480572e-01  6.266179e-01  3.552698e+00  7.503722e-01 -2.675535e-02 -5.389840e-01  8.622592e-01  1.991035e-01
[217] -2.189162e-01 -1.161234e+00  6.972145e-01  2.780796e-01 -6.312992e-02 -2.608414e+00  2.618422e+00  5.462640e+00 -1.142624e+00
[226] -8.490826e-01 -1.288512e-01  4.011994e-02 -2.048321e-01  3.272460e+00  6.866489e-01 -5.466328e-02  3.070392e-01 -9.886495e-01
[235] -1.569525e+00 -1.306696e-01  3.623149e+00  1.978161e-01  2.545551e-01  2.002372e+00 -5.400229e-01 -1.656640e-02  1.206290e-01
[244]  2.054049e-01  3.155210e-01 -1.744452e-01  2.545551e-01  7.412855e-01 -1.234508e-01 -1.718832e-02  7.214410e-01  6.076084e-01
[253] -1.024878e+01 -1.287062e+00  3.080408e+00  5.876054e-01 -2.777261e-01 -1.009898e+00 -5.856174e-01  6.067372e-02 -8.840570e-02
[262] -3.594392e-01  1.813739e+00  1.405564e-01 -3.372484e-02  8.371325e-01  3.424895e-01  1.493120e+00 -1.433819e+00 -4.222862e-01
[271] -6.719543e-02  3.063784e+00  1.282949e-01 -1.294272e-01  3.830982e-01  2.144883e+00 -8.491788e-02 -2.231544e+00  3.286810e-01
[280]  7.426660e-01 -6.825884e-02 -3.435546e-02  1.707858e+00  2.714363e-01  2.583534e-01 -1.856543e-01 -2.983154e-02  1.084806e+00
[289] -2.070454e+00 -1.513304e-01  2.514891e-01  1.531216e+00 -1.656472e+00 -1.337082e+00  3.243848e-01  1.715378e-01 -7.437379e-02
[298] -6.363805e-01  8.216420e-01  2.398677e+00  2.481079e-01 -1.493501e+00  1.788803e+00  4.053156e-01 -1.932145e-01 -1.698920e-01
[307]  3.829476e+00  1.608798e+00 -1.129716e-01 -1.141733e+00  1.621158e+00 -3.343241e-02  4.700237e+00  2.001713e+00  4.618576e+00
[316]  2.501457e+00 -7.836369e-02 -1.908722e-01  1.028320e+01  5.647405e-02 -1.710083e-01 -1.065351e+00  4.019061e-01  2.086564e-01
[325] -3.025440e-01 -9.047322e-02 -6.460067e-01 -6.844164e-01 -1.881830e-01 -8.318599e-02  3.826786e+00  9.766119e-01 -7.635718e-02
[334] -2.220902e-01 -3.856785e+00 -1.094635e+00 -7.540428e-02  3.058904e-02  9.137646e-01  3.775832e+00  2.989209e-01 -1.054544e-01
[343]  1.553016e-01  2.089815e+00 -1.232086e+00 -6.334727e-03 -7.148192e-01  1.619745e+00 -1.637780e-01 -2.134504e-01  1.515387e-01
[352]  2.434295e-01 -1.831071e-01 -9.457498e-01  2.459490e+00 -4.414135e-02 -8.114919e-02 -4.016494e-01  3.734457e-01  4.896307e-01
[361]  4.049245e+00  2.017318e+00  2.488363e-01  2.780003e-01 -1.550511e+00 -3.131615e+00  3.587101e+00 -8.323622e-02 -9.310946e-01
[370] -1.594693e-01 -6.449885e+00  8.850238e-01  2.298414e-01  2.548747e-02  3.259190e-01  3.200593e+00 -4.624236e-02 -3.353935e+00
[379] -9.192395e-03  1.659781e+00 -3.100714e-01 -2.432555e+00  2.493722e+00  6.676577e+00 -1.780416e-01 -8.183622e-01  2.873045e-01
[388]  1.873401e+00 -3.159656e-01 -1.063461e+00 -2.275333e-01  2.545551e-01 -1.147574e-01 -2.423924e-01  5.087483e-01 -4.946087e+00
[397] -1.404096e+00  5.942892e-01  1.850712e-01  2.305337e+00  1.519311e+00 -1.659701e-01  1.099995e+00  8.581784e-01 -1.180922e+00
[406] -1.422570e-01 -1.017652e-01  6.907890e-01 -2.043544e-01 -1.268577e-01 -7.520662e-01 -3.144183e-01 -3.326276e-01  2.571825e+00
[415]  2.758053e-01  5.519530e-01 -4.779812e+00 -8.598340e-01 -2.733269e-01  1.746146e+00 -1.152102e-01 -1.696746e-01  8.731155e+00
[424]  4.202525e-01 -8.863015e-02 -1.811685e+00  1.030848e+00  6.446866e+00  4.923567e+00 -2.394358e-02 -1.384412e-01  1.970634e-01
[433]  5.853398e+00 -5.313277e-01  2.451911e-01  5.068814e-01  1.001283e+01  2.545551e-01  2.991683e-01  2.978852e-01  1.920352e+00
[442] -6.436663e-01  2.865144e-01 -2.985307e+00 -7.993755e-02  6.913159e+00  1.263291e-01  4.946523e-01  2.384704e-01 -2.053579e-01
[451]  1.041425e+00 -3.840293e-02 -2.728177e-01  1.088364e-01 -2.959904e+00 -3.448277e-01  5.773138e-01  8.970300e-01  4.063197e-01
[460]  5.289729e-01  9.353198e-02 -5.891475e+00  2.352159e-01  1.773673e+00 -3.609789e-02 -5.292783e-02 -3.005821e+00  5.409213e-01
[469]  5.832009e+00 -1.372102e+00  3.681388e-01  2.022951e-01 -1.088937e+00  1.301919e+00 -1.351298e-01  1.666197e+00 -4.925816e-01
[478] -2.756112e-01  4.107335e-01  2.707616e-01  2.778384e-01  2.869210e-01 -2.518847e+00  2.731433e-01  3.057766e-01 -6.508426e-02
[487] -6.508053e-02 -5.386767e-02 -1.319746e+00 -5.511262e-01  2.439812e-01  3.581400e-01 -1.816593e-02 -2.177485e+00  1.568457e+00
[496]  2.416807e+00 -6.700124e-01  1.588973e-01 -8.862605e-02  2.121985e-01 -1.143138e+00  1.076068e+00  3.220819e-01 -6.488642e-02
[505] -9.119583e-02 -1.900989e-01  3.154147e-01 -5.464995e-01 -2.187182e-01 -2.180317e+00  2.234639e-01  2.134121e+00 -9.894065e-01
[514] -6.472309e-01  3.239172e-01  2.761212e-01 -1.826730e-01 -1.782663e-01  3.954665e-01 -1.855077e+00 -2.951616e-01 -1.235170e-01
[523]  2.251821e-01  5.553531e-01 -2.023192e-01 -9.014496e-01  6.563737e-01  4.016597e-01 -9.749756e-02 -6.402956e-02  2.750496e-01
[532]  2.754876e-01  3.293551e-01 -5.005642e+00  3.231129e-01  2.545811e-01 -1.022154e+00 -8.060808e-01  3.521192e-01  2.119989e-01
[541] -2.381150e-01 -3.288550e+00  1.421280e+00  2.847853e-01 -2.001002e-01 -1.646902e-01 -1.850079e-01 -7.654026e-02 -1.188583e-01
[550] -1.742599e-01 -1.984759e+00 -8.682619e-02  2.071753e-02  1.087379e+00  2.844600e-01  7.515339e-01 -2.224236e-01 -3.075256e+00
[559]  8.950929e-02  2.394501e+00 -2.165059e+00 -1.233829e-01  2.701741e+00  2.282062e+00 -1.924431e+00 -2.082254e-01  3.412529e-01
[568]  2.607147e-01  1.124918e-03 -1.203852e-01  2.657898e+00  1.957524e-01 -2.931182e-01 -1.971033e+00  1.807135e+00  8.706652e-01
[577] -2.326451e-01 -8.473731e-01  6.189609e-01  5.287915e-01 -7.740822e-01 -1.821481e+00  1.237988e-01  2.233247e+00 -3.362100e-01
[586] -2.170038e-01 -2.635905e-02  2.472973e-01  3.868325e-02 -1.159021e-01  2.500380e-01 -6.085198e-01 -1.899817e-01 -1.431927e-01
[595] -3.052454e+00 -3.469223e-01 -2.444967e-01  1.652156e-02  2.119758e-01  4.240047e-01 -1.911114e+00 -9.038186e-02  2.528617e-01
[604]  2.756286e+00 -3.906295e-01 -1.484883e-01  6.719161e-01 -3.866083e-01 -6.435435e-02 -1.902576e+00  5.826201e-01  2.644784e-01
[613]  9.538629e-02 -2.011244e-01  2.984296e-01  2.771165e+00  8.282686e+00 -1.932610e+00  4.163817e-01  1.595534e-01 -1.118635e+00
[622]  2.470832e-01  5.270764e-01  3.958194e-01 -6.165985e-02  5.335154e-01 -3.571421e-01  2.505762e+00 -2.753137e-02 -5.469856e-01
[631]  9.252352e-01  6.461865e-01 -4.461937e-01 -7.364934e-01  2.285105e+00  7.138954e-01 -2.720298e-01 -8.693912e-01  2.993691e+00
[640]  8.028559e-01 -6.743009e-02 -5.307888e-01  3.119620e-01  3.434619e+00 -1.452196e+00  1.684125e+00  3.311307e-01 -5.669708e-01
[649] -3.553324e-01 -1.896185e+00  3.446842e+00  7.830310e-01 -1.711669e+00 -1.277983e+00  4.621170e-01  2.421051e-01 -1.701033e-01
[658]  8.163954e-01 -7.261633e-01 -1.299687e-02 -8.016282e-02 -1.140658e-01  1.323860e+00  2.545551e-01 -2.675428e-01 -8.433672e-01
[667]  5.228999e-01  1.632592e-01 -2.957998e-01 -5.553013e-01  3.011379e-01 -6.305225e-01 -8.612630e+00 -3.065235e-01 -8.305565e-01
[676]  3.059103e-01 -7.797204e-02 -1.786650e+00 -1.488405e-01 -5.183906e-02  5.218233e+00  1.728557e-01 -4.481405e-01 -5.693220e+00
[685]  2.999499e-01  3.040171e-01  7.464381e+00  6.972166e+00 -8.403415e+00 -6.402848e-02  6.964933e-01  4.432842e-01 -1.786165e-01
[694]  5.528044e-01 -3.616451e-01 -1.591170e-01 -5.733156e-01 -3.732071e-01  1.612954e+00  4.521562e-01 -2.379350e-01 -7.817489e-01
[703]  5.615589e-01  7.984968e+00 -1.419967e-03 -3.413697e-01 -5.719286e-01 -8.776290e-02 -3.640012e-01  3.454289e-01  1.116740e-01
[712]  1.435803e-01 -2.474860e-01  1.093524e+00  2.347177e-01  1.730654e-01 -1.228630e-01  5.803183e-01  1.384456e+00  5.756865e-01
[721]  4.993324e-01 -3.569028e-01  1.399057e+00 -1.404911e+00 -2.982227e-01 -1.440777e-01 -3.008329e-01 -1.480333e-01 -2.034551e-01
[730] -6.386024e-02  5.534418e+00  1.260242e+00 -8.166069e-01 -2.222623e-01  7.545084e-02  5.541968e-01  5.556761e+00 -8.453119e-01
[739]  3.526444e-01  2.922084e+00  2.172985e-01 -1.469902e+00  1.632111e-01  3.493874e-01 -1.966893e-01 -2.924236e-01  1.102409e-01
[748]  6.462530e-02 -1.704171e-02  4.125223e-01 -2.080039e-01  2.607025e-01 -2.169992e-01  8.481024e-02  1.636632e-01 -5.962679e-01
[757] -3.093319e-01 -1.073459e-01  1.363464e+00 -1.401023e+00 -2.107503e-02  5.597599e-01  5.385795e-01  1.863389e+00  6.397418e+00
[766]  1.861240e+00  1.194237e+00  2.997435e-01  1.611484e-01 -5.484613e-01  1.189842e+00  3.837644e+00 -2.338096e-01 -5.737762e-01
[775]  2.413753e-03  3.455508e+00 -6.577547e-01  3.693522e-01  1.219843e-01 -2.947803e-02 -2.569075e-01 -1.664628e-01  2.123460e+00
[784]  4.798212e+00 -1.675385e-01 -7.847523e-02 -3.216411e-01 -2.386470e-01  2.017492e+00 -2.721515e-01  2.467115e+00  5.415200e-01
[793]  1.157133e+00  4.056428e-01 -2.823108e-01  4.366070e-01 -9.937483e-01 -6.383019e-02  2.599436e-01 -2.154435e-01


> HF5
Time Series:
Start = 1
End = 800
Frequency = 1
  [1]  -0.053649858   0.473045129  -1.855791134  -0.218807875   0.014571536   0.159596481   0.081564240   0.155658287  -0.106533556
 [10]   1.349296039   0.043722595   1.184981768   2.936701948  -0.020890782   0.287209659  -0.343506862   0.182125744   0.321655319
 [19]   0.152265596  -0.130765767   0.301479728   0.237639510  -0.113640154   0.116768441   0.245115225   2.653399743  -0.014934997
 [28]  -0.376484031   0.098036596   0.078589972 -10.566796506  -0.033271764   0.007003626   0.336061740  -0.290931130  -1.003161849
 [37]   2.126410832  -0.339519118  -1.984023750  -0.188090224   8.562631062   1.515090731  -0.089723677  -0.035854499  -0.156259607
 [46]   0.087586561  -0.099550560  -0.386082232   1.231825926   0.665419202  -0.018845938  -0.273064427  -0.441205599   0.058367119
 [55]  -0.035881404   0.231425905  -1.731548372   0.511813720  -2.652292762  -0.163362295   0.112082704   2.127449901  -1.984516819
 [64]  -1.681618131  -0.291551146   1.301356201  -0.231139629  -1.761747111   2.198414002   0.031431206   4.817803230  -0.053835827
 [73]  -0.241400896  -0.132737682  -0.494221853  -3.427015165   0.158200929   6.618433733  -0.255699644   0.399170676   0.469394277
 [82]   1.403215236  -0.539113590  -0.074840353   0.315268293   0.017757466  -0.069162940  -0.591611166   0.952482170   0.040122193
 [91]  -0.224477562  -0.035925567  -0.213900914  -2.517367299  -6.371690768  -0.379127361   1.742901426   0.115588677  -0.070774432
[100]  -0.042019349  -0.203094162   6.187633937  -0.030255588 -14.135776862  -0.292919023   0.210526422   0.015809975  -0.191603291
[109]  -0.028013930   1.196397848  -3.136609657  -0.016623343  -0.047866621   0.235623521  -0.033250628  -0.029950585  -0.152733306
[118]   0.283585876   0.725438532  -5.321553815  -0.262818166   0.333358093  -0.127887678  -0.825616353   0.729752381   0.161256587
[127]  -0.127870499  -3.504392573  -0.056818942   1.051455503  -0.610581967  -1.772628876   7.763088022   6.377255877   0.982331777
[136]  -0.049720167   1.668057546  -0.081526679  -0.016646304   0.027828635   0.021019397   0.638606658  -2.639610493   0.252214128
[145]  -0.021177179   0.309961739  -0.159709264  -0.791889902   0.221659320   0.071520603  -0.191792830  -0.095849781   0.147118019
[154]   0.423879172  -0.015966532  -0.814997357   0.182181232   0.740886403  -0.425878632   0.063399111   0.084086002   6.242629054
[163]  -0.166059640  -0.164944798   0.047540050   0.077867048   0.210206725  -0.059959501  -0.134972375   0.724032252   1.208053371
[172]  -1.450416248   0.767002941   0.567248602   0.203201419  -0.017947959   0.723492119   2.315582396  -0.003250712  -0.214061467
[181]   0.140103172   1.485875229  -0.063942314   0.504794752   1.203681332   2.916254637  -0.599141209  -0.504639390  -0.035917702
[190]  -0.066508032  -0.774334925  -0.044962400   0.539379226   0.035552430  -0.852486162   0.582649427   0.511438446  -0.054846737
[199]  -1.363899566  -0.925264908   0.378290631   8.418752812  -0.329694296  -0.646217899   0.014012816   0.229314359  -0.822570550
[208]   0.035566960  -0.048042626  -0.230902834  -0.983361624  -0.524604472   0.142112334   3.178630141  -0.679181149  -0.021673949
[217]   0.455549743   2.884246546   0.995534936  -0.074915836   0.294327209   8.707540120  -0.891039780  -5.116943345   1.807300268
[226]   4.653522444   0.176254186   0.096643906   0.097676112  -0.442654494  -0.176779103   0.047338893  -0.206053506  24.897248697
[235]  -0.558294814   0.015836695  -0.754272940  -0.071928292  -4.065491270  -1.044690836   0.376283222   0.588343352  -0.355257156
[244]  -0.083364082  -0.007105364   0.529359954  -8.021221450  -0.546179595   0.646916952   0.178089665  -0.051864657   0.253082405
[253]  20.713299311   0.759143878  -0.539982015  -0.368425364   3.538669564   0.567510757   0.191253562  -0.053957999   0.047407002
[262]   1.072823567  -2.268748418  -0.033168432   0.252976417   4.010509758  -0.085581899  -1.500184616   2.623411867   0.129878721
[271]   0.196248634  -0.481771250  -0.056828978   0.031878859  -0.144508992  -1.702092094   0.450120295   2.201565243  -0.208548110
[280]  -0.607892432   0.210270224  -0.053396495  -1.098250818  -0.497645770  -0.019981605   0.026436961   0.269281987  -0.203943129
[289]   5.803557957   0.898143581  -0.016638145  -2.870511105  12.535882950   0.089523798  -0.071056442  -0.083304296   0.371506360
[298]   2.549915204  -1.388314210  -5.815539352  -0.210410307  -0.046665442  -1.398770615  -0.420604388   0.007011299   7.295412551
[307]  -3.567433713  -2.259342153   0.111306625   9.424201044  -0.099990007   0.058851186  -0.319394368  -1.188994369 -13.473532466
[316]  -4.123907909   0.007008249   0.026816923 -11.866142680  -0.017989270   0.277994379   3.218420951  -0.312668928  -0.149944044
[325]   0.056081944   0.507135605   0.016331503   1.002122867   0.715103715   0.127326780  -5.870328468  -0.530081308   0.336482105
[334]   0.162019469   0.136228079   0.109176188   0.265903072   1.954660692  -1.262932946   0.190943304  -0.114749082   3.999535346
[343]  -0.022649183  -2.320142948   4.864378799   0.107269719  -0.147020500  -0.880102075   0.896221148   0.383806031  -0.014315332
[352]  -0.064334680   0.126186237   1.836083215  -1.195093675   0.247399700   0.210305271   1.205910512  -0.128797914  -0.015077554
[361]  -0.076260331  -1.984423551  -0.050001641  -0.060265425   2.076702500  -1.560055417 -13.194087713  -0.433552321   0.259611369
[370]   0.015964183   0.105154985  -0.359399469   0.056095783   0.468687701   0.237191558  -0.180564873   0.071450142   1.633298516
[379]   0.098140453  -0.865187577   0.126199486   3.933684419  -2.439808356  -5.299750403   0.659122872   5.485314982  -0.060356968
[388]  -0.273469732   0.881925411   1.401311316   0.357596451   0.190943304   0.225427150   1.918599734  -0.135809594  -1.539832826
[397]   1.422332786  -0.216520004  -0.116653447  -0.945873057  -1.076258982   0.530886207 -14.858856834 -10.332358918   3.337007186
[406]   0.029548722  -0.177707477  -0.465327158   0.126175184   0.079632842   1.568379101   0.039669024   3.115629690  -0.035586517
[415]  -0.195985895  -0.577867640  -0.677797610   1.094892700  -0.376687825  -4.102964091   0.143566503   0.088692628  -1.453786558
[424]  -0.299393058   1.107887381   0.674420993  -2.646522732  -1.389318035  -0.211813487   0.390659815  -0.355424658  -0.228305953
[433]  -0.390932354   0.797608035  -0.400864878  -0.664606830  -2.128206234  -1.332743740  -0.048330806  -0.302019469  -1.364677631
[442]   0.850007608  -0.498655463  -1.120737861   0.044377074  -4.067918353  -0.177685144  -0.238197352  -0.099527123   0.063876860
[451]  -0.109607175   0.286745627   3.228227858   1.155269654  -1.714722099   2.044204651  -0.376951275  -0.256646314  -0.462593438
[460]  -0.150378251  -0.036127900  -0.168221859  -0.250161833  -0.132990944  -0.092393191   0.017930849  -0.821142000  -0.045328381
[469]  -0.156346755   0.251034593  -0.015115175  -0.049923585   9.808254454 -10.923406279   0.218855389  -3.481913209   2.269242277
[478]   1.618053635  -0.568450760  -0.100185900  -0.218372487  -0.533170368  -3.035492341  -0.018080448  -0.270147632   0.064100947
[487]  -1.656390523   0.044406230   0.403815427   4.463785965  -0.242225219  -0.060562600   0.111633931   1.952214234  -1.702724573
[496]  -2.321447242   7.091924992  -0.100624956   0.014795836  -0.127924982   2.603975673  -0.238145700  -0.206169571  -0.280573145
[505]   0.133267731   1.334596328  -0.080741903   1.136959798   0.103590046   0.382617191  -0.120816198 -10.255711630   4.959744726
[514]   0.203784749  -0.241718413  -0.418653153   7.263519760   0.175385478  -0.121027999  -0.921554800   0.014025795   0.812840365
[523]  -0.033736823  -0.881201126   0.014024605   1.413383175  -0.084845152  -1.501412214   0.014026681   0.351134701  -0.105825219
[532]  -0.190871159  -0.016198856   1.582199545  -0.355298930  -0.119154942   1.119952829  15.442702714  -0.526105476   0.254349129
[541]   0.182400471   0.290090107  -0.544720386  -0.704947881   0.182413507   0.415254298   0.478675043   0.337429554   0.015951895
[550]   0.579123775   0.276916517   0.336730820   0.166459998  -0.165019677  -0.170639297  -2.156172309   0.007015360   0.679413452
[559]  -0.180776170  -0.412514852   1.919485711   0.564221261  -1.814078186  -0.523469370   0.850946856   0.654857991  -0.433715765
[568]  -0.064963825   0.315600383   0.208368575  -1.362624306  -0.036163560   1.522171004   0.501262098  -2.957851936   0.041325375
[577]   0.792613097   0.461937482  -0.549182039  -0.201497868   0.559521219   2.330550367  -0.072344287  -0.696108042   0.255478720
[586]   1.414073413  -0.568775940  -0.324689599   0.489813834  -0.149345898  -0.083926323  12.669192340   0.395227403   0.575646109
[595]   0.969961408   0.139445941   0.028068230   0.961621369  -0.747248768  -1.600810860   1.075547238   0.420965386  -0.113631655
[604]  -0.903460198   0.524895605   0.044527148  -0.461720956  -0.547639601   0.084186467   1.174865283  -1.097194604  -0.032498414
[613]   2.392118505   0.986643396  -0.162424173  -3.164451200  -0.769422900  -0.184600192  -0.197917436  -0.352329082   1.290558257
[622]  -0.184791050  -0.076097183  -1.798061452   0.080099125   2.062484105  -0.348501142  -2.106705631   0.007017747   2.305538391
[631]  -0.915838960  -0.220861746   0.077200047   0.610466280  -0.688401758  -0.207153770   0.049139553  22.362186197  -5.441551857
[640]  -1.708605711   0.926299855   0.207035751  -0.106446657  -2.675294607  -0.404511023  -0.788943233  -0.048807464  -0.227614326
[649]  -0.085333035   0.977385829 -16.826537503  -0.200423157  -0.051340705   6.499090143  -0.304213082  -0.065082852   0.048070630
[658]   0.666539778  -0.064016381  -0.109602571   0.533325153   0.528565621  -0.248317213  -3.473373955   1.272400022   1.711836935
[667]  -0.228344960  -0.252753461  -0.488373752  -0.401594723  -0.030427542  -0.455079097   3.252051577   0.960391227  -0.256075733
[676]  -0.136915862   0.098237444   1.674612416   0.044609980  -0.248469202  -0.298662830   1.173803660  -0.208363252   1.850645023
[685]   1.036270876  -0.152137097  -0.048105658  -1.277109207   0.059431246   0.064698690  -0.246801765  -0.196775000   0.533961473
[694]   0.220699810  -0.312873635   0.014867680  -0.241851486   0.032156787  -1.816522484  -0.167733410   4.158794025   2.099466739
[703]  -0.030488507   0.112566051   0.075675048   0.302820106  -0.469476310   0.210115120   0.056139143  -0.030484607  -0.072570524
[712]  -0.033886465   0.091219120  -0.248023454  -0.081455556  -0.203068185   0.054095543  -0.015255905  -3.701371648  -0.623879061
[721]   0.853410776   6.103753013  -1.341198580   1.247921308   0.751060465   0.781642884   0.379135477   1.145320110   0.313305428
[730]   0.112618403  -1.412256823  -0.103142715  -0.007113346   0.659901598  -2.519498558  -0.365995410  -0.184937991   0.897416670
[739]  -0.518805259  -3.353209940   1.867572217   9.205127781  -0.187969046  -0.778383177  -0.042669664   0.806807477  -0.090799820
[748]  -0.021826161   0.448223805  -0.164371146  -0.618774302  -0.244839681   0.194235170   1.570125546   1.754972837   0.500679719
[757]   0.870366653   0.433784961  -1.002863246   2.101960944   0.697030522   7.950881827  -0.061270167  -2.371332122  -0.142291873
[766]  -1.729969712  -1.941166110  -0.245036824  -0.106730528   5.057757700  -1.038846526  -0.858866602   3.386084663   1.395573786
[775]  -0.291650577  -2.212035645   0.856991031  -1.532383568  -0.185747818  -0.711396025   1.062315644   0.241829929  -1.838103065
[784] -12.577074634   1.735801542   0.484405184   0.013854970   0.416285923  -1.975226723   0.938110382  -0.647308291  -0.706063547
[793]  -0.082810695  -0.054601369  -0.014973073   0.127614348  18.906618087   0.502810107  -0.152371107  -0.036187828


very many thanks for your time and effort....
Yours sincerely,
AKSHAY M KULKARNI

Re: problem with nls....

Martin Maechler
In reply to this post by Ivan Krylov

Very good point, Ivan (as were your previous ones on this thread -
thank you! It's great to have a couple of smart and patient
R-helpers such as you!!).

CRAN package robustbase
     https://cran.r-project.org/package=robustbase

has the function nlrob() to do non-linear regression *robustly*, using
any of several methods; the default uses robustly re-weighted nls().
I'm the maintainer of the package and have been the "moderator" of the
current nlrob() function, but as you can read on its help page, I'm not
the author of that function and its submethods:
https://www.rdocumentation.org/packages/robustbase/versions/0.93-4/topics/nlrob
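A minimal sketch of what a call might look like here (the formula,
data frame, and start value simply mirror the earlier nls() call and
are assumptions, not a tested fit):

library(robustbase)
HF53rob <- nlrob(HF1 ~ (m/HF6) + 1, data = data.frame(HF6, HF1),
                 start = list(m = 0.1))  # default method: robustly re-weighted nls()
summary(HF53rob)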

Martin Maechler
ETH Zurich

Re: problem with nls....

akshay kulkarni
In reply to this post by Ivan Krylov
dear Ivan,
                    I think my nls call is not converging to the proper value. I've gone through the Gauss-Newton algorithm implemented by nls. How do I get the gradient, Hessian, and Jacobian of the objective function created by the call to nls? Perhaps I can compare all of them between my successful nls call and the one that didn't converge. I've gone through debug(nls), but to no avail.

Also, I've checked the residuals... they are approximately normally distributed... I am still wondering why the nls call is not converging!

Also, is it possible that if I give you the vectors HF1, HF5, and HF6, it will help members of the mailing list get to the bottom of the problem? (I am sorry to have given the printed values of the vectors in my previous response to your mail... the dput values were very large. However, I will give the dput values this time around.)

very many thanks for your time and effort....
yours sincerely,
AKSHAY M KULKARNI

Re: problem with nls....

akshay kulkarni
In reply to this post by Martin Maechler
dear Martin,
                         I've not yet tried robust regression; I want to get to the root of the problem. I've replied to Ivan asking how to get the gradient, Hessian, and Jacobian of the objective function in an nls call. My question to you is: does robust regression use an entirely different algorithm from Gauss-Newton? Will that help in my case? My point is, I have two calls to nls with only slightly different variables, but one converges and the other does not. If Gauss-Newton fails, what is the guarantee that the other succeeds? (Also, there is no problem with the residuals; they are approximately normally distributed, so whether it's robust regression or not, I am beset with the same problem!)

very many thanks for your time and effort...
yours sincerely,
AKSHAY M KULKARNI

Re: problem with nls....

akshay kulkarni
In reply to this post by Ivan Krylov
dear Ivan and members,
                                              I was able to solve my problem. After going through the Gauss-Newton method, I tried to extract the Hessian, gradient, and Jacobian from the nls call, but I could not succeed. However, I observed that my formula contains only one parameter, in which the model is linear, so the objective function is just a quadratic in that parameter. I applied the Newton-Raphson method directly and got the value of the parameter. To my surprise, it was the same as the output of the nls call!
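A minimal sketch of that calculation (assuming HF1 and HF6 are in the
workspace): since the model is linear in m, the RSS is an exact
quadratic in m, and one Newton-Raphson step from any starting point
lands on the closed-form minimizer.

x <- 1/HF6
y <- HF1 - 1
# RSS(m) = sum((y - m*x)^2) is quadratic in m; its exact minimizer is
m_hat <- sum(x * y) / sum(x^2)
m_hat  # should match coef(HF53nl), as observed above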

I think I have to accept the value of the parameter, even though it is not a good fit. The world is very harsh (sometimes only?)!

I should thank Ivan for pointing me in the right direction...

very many thanks for your time and effort...
Yours sincerely,
AKSHAY M KULKARNI

Re: problem with nls....

Ivan Krylov
In reply to this post by akshay kulkarni
On Fri, 22 Mar 2019 12:29:14 +0000
akshay kulkarni <[hidden email]> wrote:

> How do I get the gradient, Hessian, and Jacobian of the
> objective function created by the call to nls?

The nls() return value is a list containing an entry named `m`, which is
an object of class "nlsModel". It doesn't seem to be documented in modern
versions of R[*], so what I am describing might be an implementation
detail subject to change. Still, model$m$gradient() should return the
Jacobian; the Hessian is usually estimated as the crossprod() of the
Jacobian; and the gradient of the objective function is computed as
-2*colSums(model$m$resid() * model$m$gradient()).
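For example, a minimal sketch (`model` stands for the object returned
by nls(), e.g. HF53nl; the factor of 2 follows the gradient convention
above, and crossprod(J) is only the Gauss-Newton approximation):

J <- model$m$gradient()                 # Jacobian of the fitted model
g <- -2 * colSums(model$m$resid() * J)  # gradient of the residual sum of squares
H <- 2 * crossprod(J)                   # Gauss-Newton approximation to the Hessian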

> Also, I've checked the residuals... they are approximately normally
> distributed... I am still wondering why the nls call is not
> converging!

The more important question is, how does the objective function (sum of
squared residuals) depend on the parameter `m` you are trying to find?
Try computing it for various values of `m` and looking at the result:

plot(
        Vectorize(
                function(m) {
                        model$m$setPars(m);
                        model$m$deviance()
                }
        ),
        from = ..., to = ... # fill as needed
)

--
Best regards,
Ivan

[*] But used to be:
http://unixlab.stat.ubc.ca/R/library/stats/html/nlsModel.html
