Package inclusion in R core implementation


Package inclusion in R core implementation

Morgan Morgan
Hi,

It sometimes happens that packages get included in R, like, for example,
the parallel package.

I was wondering: is there a process to decide whether or not to include a
package in the core implementation of R?

For example, why not include the Rcpp package, which has become for a lot of
users the main tool for extending R?

And what is your view on the (not so well known) dotCall64 package, which is an
interesting alternative for extending R?

Thank you
Best regards,
Morgan


______________________________________________
[hidden email] mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel

Re: Package inclusion in R core implementation

Jim Hester
Conversely, what is the process to remove a package from core R? It seems
to me some (many?) of the packages included are there more out of
historical accident than any technical need to be in the core
distribution. Having them as a core (or recommended) package makes them
harder to update independently of R and makes testing, development, and
contribution more cumbersome.

On Fri, Mar 1, 2019 at 4:35 AM Morgan Morgan <[hidden email]>
wrote:



Re: Package inclusion in R core implementation

J C Nash
As the original coder (in the mid 1970s) of the BFGS, CG and Nelder-Mead methods
in optim(), I've been pushing for some time for their deprecation. They aren't
"bad", but we have better tools, and they are in CRAN packages. Similarly, I
believe the other optimization tools in the core (optim's L-BFGS-B, nlm, nlminb)
can and should be moved to packages (there are already at least two versions of
L-BFGS that Matt Fidler and I are merging). And optim's SANN method does not
match the usual expectations of users.

I'm sure there are other tools for other tasks that can and should move to
packages to streamline the work of our core team. However, I understand there is
the awkward issue of actually doing this. I'm willing to help with preparing
"Transition Guide" documentation and scripts, and would be surprised if there
were not others. R already has a policy of fully supporting only the current
version, so hanging on to antique tools (the three codes at the top are based on
papers that are all now more than 50 years old) seems inconsistent with its
other messages.

For information: I'm coordinating a project to build understanding of what
older algorithms are in R as the histoRicalg project. See
https://gitlab.com/nashjc/histoRicalg. We welcome participation.

Best, JN

On 2019-03-04 7:59 a.m., Jim Hester wrote:



Re: Package inclusion in R core implementation

Avraham Adler
On Mon, Mar 4, 2019 at 5:01 PM J C Nash <[hidden email]> wrote:


I have no argument with updating code to more correct or modern versions,
but I think that, as a design decision, base R should have optimization
routines, rather than relying on an external package which could
conceptually be orphaned. Or at least some package should be made
recommended and adopted by R core.

Thank you,

Avi
--
Sent from Gmail Mobile



Re: Package inclusion in R core implementation

J C Nash
I concur with Avraham that capabilities need to be ensured, e.g., in recommended
packages. I should have mentioned that. My concern is that the core should be
focused on the programming-language aspects. The computational math and some of
the more intricate data management could be better handled by folk outside the core.

JN

On 2019-03-04 9:12 a.m., Avraham Adler wrote:



Re: Package inclusion in R core implementation

Duncan Murdoch
In reply to this post by Jim Hester
On 04/03/2019 7:59 a.m., Jim Hester wrote:

You are conflating base and recommended packages.  Base packages can't
be updated independently of R because they provide or make use of R
internals, so they couldn't be distributed separately.  The list of base
packages is

 [1] "base"      "compiler"  "datasets"  "graphics"  "grDevices" "grid"      "methods"   "parallel"  "splines"   "stats"     "stats4"
[12] "tcltk"     "tools"     "utils"

The other packages distributed with R are recommended packages:

 [1] "boot"       "class"      "cluster"    "codetools"  "foreign"    "KernSmooth" "lattice"    "MASS"       "Matrix"     "mgcv"
[11] "nlme"       "nnet"       "rpart"      "spatial"    "survival"

Those have no particular connection to the internals, but they are
distributed with R, and undergo somewhat more rigorous testing than most
contributed packages.  Some of them are used in R's own tests.  They can be
updated at any time, but their authors are asked not to update them near R
releases.  In many cases (but not all) their current maintainers are R Core
members.

In answer to your question and Morgan's:  the process is completely
opaque.  R Core will add or remove a package if they think it makes
sense from their point of view.  Generally that happens very rarely,
because it can be a lot of work, and usually there's not much to be gained.




Fwd: Package inclusion in R core implementation

J C Nash
In reply to this post by J C Nash
Rereading my post below, I realize there is scope for misinterpretation. As I have
said earlier, I recognize the workload involved in any streamlining, and also the
immense service done to us all by R Core. The issue is how to manage the workload
efficiently while maintaining and modernizing the capability. That is likely as
challenging as doing the work itself.

JN


> I concur with Avraham that capabilities need to be ensured e.g., in recommended
> packages. I should have mentioned that. My concern is that the core should be
> focused on the programming language aspects. The computational math and some of the more
> intricate data management could better be handled by folk outside the core.
>
> JN
>
