optimization problem


optimization problem

Ted Zeng
Hi, all

I am facing an optimization problem. I am using the function optim(par,fun), but I find that every time I give a different initial guess for the parameters, I get a different result. For example,
I have a data frame named data:
head(data)
   price     s     x         t
1 1678.0 12817 11200 0.1495902
2 1675.5 12817 11200 0.1495902
3 1678.0 12817 11200 0.1495902
4 1688.0 12817 11200 0.1495902
5 1677.0 12817 11200 0.1495902
6 1678.5 12817 11200 0.1495902
…….

f <- function(p, ...) {
        # volatility modelled as the exponential of a quadratic in moneyness x/s
        v  <- exp(p[1] + p[2]*(x/s) + p[3]*(x/s)^2)
        d1 <- (log(s/x) + (v^2)*t/2) / (v*sqrt(t))
        d2 <- (log(s/x) - (v^2)*t/2) / (v*sqrt(t))
        # sum of squared pricing errors; price, s, x and t are taken from
        # the columns of the data frame (e.g. after attach(data))
        sum((price - (s*pnorm(d1) - x*pnorm(d2)))^2)
}
p <- c(-0.1, -0.1, 0.01)
optim(par=p, f)  # uses the default algorithm (Nelder-Mead)

$par
[1] -1.2669459  0.4840307 -0.6607008

$value
[1] 14534.56

$counts
function gradient
     154       NA

$convergence
[1] 0

If I try a different initial guess, I get a different answer:
> p=c(-1,-0.1,0.5)
> optim(par=p,f)
$par
[1] -0.7784273 -0.4776970 -0.1877029

$value
[1] 14292.19

$counts
function gradient
      76       NA

$convergence
[1] 0

$message
NULL

I have tried other initial estimates as well, and they also give different results.

Why?

Thanks.

Reply | Threaded
Open this post in threaded view
|

Re: optimization problem

Armin Meier
Hi,
I guess your function has several local minima, and depending on where
you start (i.e. your initial values) you end up in a different minimum.
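
As a toy illustration (my own addition, not part of the reply above): a one-dimensional function with two minima, where optim() converges to whichever basin the starting value lies in.

g <- function(x) (x^2 - 1)^2 + 0.2*x              # two minima, near x = -1 and x = +1
optim(par = -1.5, fn = g, method = "BFGS")$par     # ends up near x = -1 (the global minimum)
optim(par =  1.5, fn = g, method = "BFGS")$par     # ends up near x = +1 (a local minimum)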

HTH
Armin


Re: optimization problem

Ted Zeng
If I want to find the global minimum, how should I change my code?
Thanks a lot.
Armin Meier wrote
Hi,
I guess your function has several local minima, and depending on where
you start (i.e. your initial values) you end up in a different minimum.

HTH
Armin


Re: optimization problem

Prof J C Nash (U30A)
In reply to this post by Ted Zeng
tedzzx asked about apparent multiple optima. See below.

Users should be aware that optim() does local optimization. The default Nelder-Mead approach is fairly robust at finding such a local minimum, though it may halt if it is on a flat area of the loss function surface. I would recommend trying one of the BFGS codes (they use somewhat different approaches) and looking at the gradient information. With only 3 parameters, these should work fine. There is also another package (I forget the name -- someone?) that does full Newton with the Hessian computed. That may be worth using to get more complete information about your problem.
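
A minimal sketch of this suggestion (my own illustration, not part of the reply), assuming the objective f and the columns price, s, x and t from the original post are already in the workspace; the numDeriv call is just one way to inspect the gradient at the reported solution:

# gradient-based local optimization instead of the Nelder-Mead default
p0  <- c(-0.1, -0.1, 0.01)
fit <- optim(par = p0, fn = f, method = "BFGS", control = list(maxit = 500))
fit$par          # parameter estimates
fit$convergence  # 0 indicates the algorithm reported convergence

# at a genuine local minimum the gradient should be close to zero
library(numDeriv)
grad(f, fit$par)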

tedzzx: If you send me the data off-list (maybe also include the function, to save me digging it up again), I'll try to provide more information.

John Nash





>Date: Thu, 27 Nov 2008 23:30:56 -0800 (PST)
>From: tedzzx <[hidden email]>
>Subject: [R]  optimization problem
>
> [original post, data excerpt and code quoted in full; see the first message above]


Re: optimization problem

Ben Bolker
In reply to this post by Ted Zeng
tedzzx <zengzhenxing <at> gmail.com> writes:

> I am facing an optimization problem. I am using the function optim(par,fun),
> but I find that every time I give a different initial guess for the
> parameters, I get a different result.
> [head(data) output snipped]
   
  Can you post the whole data set somewhere (on
the web if it's big, here if it's, say, < 100 lines)?
That way your problem would be reproducible ...

  cheers
    Ben Bolker


Re: optimization problem

Mike Prager
In reply to this post by Ted Zeng
tedzzx <[hidden email]> wrote:

>
> If I want to find the global minimum, how should I change my code?

I sometimes use optim() within a loop, with random starting
values for each iteration of the loop. You can save the
objective function value each time and pick the best solution.
Last time I did that, I ran it 100 times.

That procedure does not guarantee finding the global minimum.
However, it does make it *more likely* to find the global minimum
*within the range of your starting values*.

Often, I make a boxplot of the various results. If they don't
show a strong mode, there is a data or model problem that needs
to be addressed. For example, the solution may be poorly defined
by the data, or the model may be specified with confounded
parameters.
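
A minimal sketch of this multi-start procedure (an illustration, not the original code), assuming the objective f from the first post; the search box for the random starting values is an arbitrary placeholder:

# restart optim() from many random starting points and keep every fit
set.seed(1)
n.starts <- 100
fits <- lapply(seq_len(n.starts), function(i) {
    p0 <- runif(3, min = -2, max = 2)   # placeholder range for starting values
    optim(par = p0, fn = f)
})
values <- sapply(fits, `[[`, "value")
best   <- fits[[which.min(values)]]
best$par                                # best solution found over all starts
boxplot(values, main = "Objective values over 100 restarts")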

--
Mike Prager, NOAA, Beaufort, NC
* Opinions expressed are personal and not represented otherwise.
* Any use of tradenames does not constitute a NOAA endorsement.


Re: optimization problem

Hans W Borchers
Why not use one of the global optimizers in R, for instance 'DEoptim', and then apply optim() to find the last six decimals? I am relatively sure that the Differential Evolution operator has a better chance of coming near a global optimum than a loop over optim(), though 'DEoptim' may be a bit slow (but only for quite large numbers of parameters).
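
A rough sketch of that two-stage idea (an illustration, not Hans Werner's code), assuming the objective f from the first post and the 'DEoptim' package; the parameter bounds are placeholders that would have to be chosen for the real problem:

# global search with differential evolution, then local polishing with optim()
library(DEoptim)
lower <- c(-5, -5, -5)   # placeholder bounds on the three parameters
upper <- c( 5,  5,  5)
de  <- DEoptim(f, lower, upper,
               control = DEoptim.control(itermax = 200, trace = FALSE))
fit <- optim(par = de$optim$bestmem, fn = f)   # refine the DE solution locally
fit$par
fit$value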

Regards,  Hans Werner


Mike Prager wrote
I sometimes use optim() within a loop, with random starting
values for each iteration of the loop. You can save the
objective function value each time and pick the best solution.
[rest of message snipped]


Re: optimization problem

Mike Prager
"Hans W. Borchers" <[hidden email]> wrote:

> Why not use one of the global optimizers in R, for instance 'DEoptim', and
> then apply optim() to find the last six decimals? I am relatively sure that
> the Differential Evolution operator has a better chance to come near a
> global optimum than a loop over optim(), though 'DEoptim' may be a bit slow
> (only for quite large numbers of parameters).
 
Thanks for the reference. I will see if 'DEoptim' might be
useful in future problems.

HWB asked, why not use 'DEoptim' rather than a loop? Perhaps
that's a rhetorical question, but I'll answer it anyway, in the
context of the specific problem I am solving. (1) I did not know
that 'DEoptim' existed. (2) After starting a problem with 'nls',
I changed its structure slightly, which meant a change to
'optim'. Because the two functions have totally different
syntaxes, it was necessary to rewrite the entire script and its
supporting functions. Adding a loop was much simpler than
looking for yet *another* optimizer in R. (3) In the current
problem, perhaps 97 of 100 runs of 'optim' come to the same
solution (the best one found). That suggests that this is not a
terribly difficult problem and that there is little to be gained
by employing a different approach.

SOMEONE once posted about an R function that masked the syntax
differences among (at least some) R optimizers. That surely
would lower the barrier to switching among them. I've lost that
post, and my search has not turned it up. If that poster is
reading this, would you please respond with the information?

ALSO, is anyone aware of any document comparing the various
optimizers available in R (even in core R)?  What are the
different intended applications, and when would each be
preferred? There is some helpful material in MASS 4, but I am
hoping for something more recent and detailed.


--
Mike Prager, NOAA, Beaufort, NC
* Opinions expressed are personal and not represented otherwise.
* Any use of tradenames does not constitute a NOAA endorsement.


Re: optimization problem (optim vs. nlminb)

Mike Prager
In case anyone is still reading this thread, I want to add this:
In a current problem (a data-shy five-parameter nonlinear
optimization), I found "nlminb" markedly more reliable than
"optim" with method "L-BFGS-B". In reviewing the fit I made, I
found that "optim" only came close to its own minimum in about
13 of 120 trials (same data, different starting values). I
previously said 97, but I was clearly looking at the wrong data!
In contrast, "nlminb" came to that best answer in about 92
trials out of 120.

The original poster might consider "nlminb" instead of "optim".
Because nonlinear optimization is sensitive to starting values,
I would still advise solving the problem a number of times to
see if a clear minimum solution emerges.
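
A minimal sketch of such an nlminb() call on the original objective (an illustration only), assuming f and the data columns are defined as in the first post; nlminb() minimizes by default, so the same sum-of-squares function can be passed directly:

# local minimization with nlminb() instead of optim()
p0  <- c(-0.1, -0.1, 0.01)
fit <- nlminb(start = p0, objective = f)
fit$par          # parameter estimates
fit$objective    # minimized sum of squares
fit$convergence  # 0 indicates successful convergence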

--
Mike Prager, NOAA, Beaufort, NC
* Opinions expressed are personal and not represented otherwise.
* Any use of tradenames does not constitute a NOAA endorsement.
