Optimization with Parallel Processing Functions/Packages

Optimization with Parallel Processing Functions/Packages

Doran, Harold
More of a general query, but looking to see if others have successfully used something like the foreach package (or other parallel style functions) with certain functions that minimize likelihood or objective functions (e.g., optim/nlminb).

I have had great success with embarrassingly parallel problems, and my R packages have benefited greatly. However, optimization doesn't fit as nicely into that context, since the values at iteration t depend on the values found at iteration t-1. So I'm assuming the cost of splitting and combining might be more expensive in this context than simply doing the minimization on a single core.

If others have experience with this, or know of R-specific resources that implement it and that I could study, I would appreciate seeing how it might be done.
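(A minimal sketch of one pattern that does stay embarrassingly parallel even with a serial optimizer: multistart, where independent `optim()` runs from different starting values are farmed out and the best result kept. The objective, starting-value scheme, and core count below are purely illustrative.)

```r
## Multistart: each optim() run is serial, but runs from different
## starting values are independent, so they parallelize cleanly.
## The Rosenbrock objective here is just an illustrative stand-in.
library(parallel)

rosen <- function(p) (1 - p[1])^2 + 100 * (p[2] - p[1]^2)^2

set.seed(1)
starts <- replicate(8, runif(2, -2, 2), simplify = FALSE)

## mclapply forks on unix-alikes; on Windows fall back to one core
## (or use a parLapply() cluster instead)
ncores <- if (.Platform$OS.type == "windows") 1L else 2L

fits <- mclapply(starts,
                 function(s) optim(s, rosen, method = "BFGS"),
                 mc.cores = ncores)

## keep the run with the smallest objective value
best <- fits[[which.min(vapply(fits, `[[`, numeric(1), "value"))]]
best$par  # the global minimum is at c(1, 1)
```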

Regards
Harold

______________________________________________
[hidden email] mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.

Re: Optimization with Parallel Processing Functions/Packages

Jeff Newmiller
This is highly problem-dependent... and you appear to already know the answer. Note that some differential-evolution solution approaches may benefit from parallelizing the evaluation of each generation, since within that sub-problem the optimization's sequential dependencies don't apply.
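For example, the candidate evaluations within a single generation are independent and can be farmed out with `parallel::mclapply`, while only the generation-to-generation loop stays serial. This is a hand-rolled toy (objective, population size, and core count are all illustrative); packages such as DEoptim bundle this machinery properly.

```r
## Within one generation of an evolutionary optimizer, the candidate
## evaluations are independent; only the generation loop is serial.
library(parallel)

obj <- function(p) sum((p - 3)^2)  # toy objective, minimum at rep(3, 4)

set.seed(1)
pop <- replicate(20, rnorm(4, sd = 2), simplify = FALSE)  # one generation

ncores <- if (.Platform$OS.type == "windows") 1L else 2L
fitness <- unlist(mclapply(pop, obj, mc.cores = ncores))  # parallel step

## Selection/mutation (the serial, generation-to-generation part) would
## then use `fitness` to build the next population; here we just take
## the five fittest candidates.
elite <- pop[order(fitness)][seq_len(5)]
```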

A theoretical discussion forum such as stats.stackexchange.com might be better for this... though even they may complain that the question is too vague.

On November 7, 2018 10:24:33 AM PST, "Doran, Harold" <[hidden email]> wrote:

>More of a general query, but looking to see if others have successfully
>used something like the foreach package (or other parallel style
>functions) with certain functions that minimize likelihood or objective
>functions (e.g., optim/nlminb).
>
>I have had great success with embarrassingly parallel problems and my R
>packages have benefited greatly. However, optimization doesn't fit as
>nicely within that context as the values at iteration t depend on the
>values found at iteration t-1 and such. So, I'm assuming the cost of
>splitting and combining might be more expensive in this context that
>simply doing minimization on a single core.
>
>If others have experiences or even possibly R-specific resources that
>implement this that I would be able to study, I would appreciate seeing
>how this might be implemented.
>
>Regards
>Harold

--
Sent from my phone. Please excuse my brevity.
