Master's project to coerce linux nvidia drivers to run generalised linear models


Master's project to coerce linux nvidia drivers to run generalised linear models

Oliver LYTTELTON


Hi,

I am working with a friend on a master's project. Our laboratory does a
lot of statistical analysis using the R stats package and we also have a
lot of under-utilised nvidia cards sitting in the back of our networked
linux machines. Our idea is to coerce the linux nvidia driver to run
some of our statistical analysis for us. Our first thought was to
specifically code up a version of glm() to run on the nvidia cards...
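For concreteness, glm() fits by iteratively reweighted least squares (IRLS), and the expensive step in each iteration is a dense weighted cross-product, exactly the kind of matrix arithmetic a graphics card is built for. Below is a minimal sketch of that loop for the logistic case, written in NumPy rather than R purely for illustration (the function name irls_logistic is ours, not an existing API):

```python
# Minimal IRLS sketch for logistic regression. The heavy step is the
# weighted cross-product X^T W X, a dense matrix-matrix multiply:
# that is the part a GPU could plausibly take over.
import numpy as np

def irls_logistic(X, y, n_iter=25):
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta
        mu = 1.0 / (1.0 + np.exp(-eta))    # logistic mean function
        w = mu * (1.0 - mu)                # IRLS weights
        z = eta + (y - mu) / w             # working response
        XtWX = X.T @ (w[:, None] * X)      # the dominant matrix product
        XtWz = X.T @ (w * z)
        beta = np.linalg.solve(XtWX, XtWz)
    return beta
```

For tall X the X.T @ (w[:, None] * X) product dominates the cost, not the link-specific arithmetic, so that product is what would need to move to the card.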

Thinking that this might be of use to the broader community, we thought
we would ask for feedback before starting.

Any ideas...

Thanks,

Olly

______________________________________________
[hidden email] mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel

Re: Master's project to coerce linux nvidia drivers to run generalised linear models

Marc Schwartz (via MN)
On Mon, 2006-01-23 at 15:24 -0500, Oliver LYTTELTON wrote:

>
> Hi,
>
> I am working with a friend on a master's project. Our laboratory does a
> lot of statistical analysis using the R stats package and we also have a
> lot of under-utilised nvidia cards sitting in the back of our networked
> linux machines. Our idea is to coerce the linux nvidia driver to run
> some of our statistical analysis for us. Our first thought was to
> specifically code up a version of glm() to run on the nvidia cards...
>
> Thinking that this might be of use to the broader community we thought
> we might ask for feedback before starting?
>
> Any ideas...
>
> Thanks,
>
> Olly


Well, I'll bite.

My first reaction to this was, why?


Then I did some Googling and found the following article:

http://www.apcmag.com/apc/v3.nsf/0/5F125BA4653309A3CA25705A0005AD27


And also noted the GPU Gems 2 site here:

http://developer.nvidia.com/object/gpu_gems_2_home.html


So, my newfound perspective is, why not?


Best wishes for success, especially since I have a certain affinity for
McGill...

HTH,

Marc Schwartz


Re: Master's project to coerce linux nvidia drivers to run generalised linear models

Gabor Grothendieck
In reply to this post by Oliver LYTTELTON
I wonder if it would make more sense to get a relatively
low level package to run on it so that all packages that
used that low level package would benefit.  The Matrix
package and the functions runmean and sum.exact in
package caTools are some things that come to mind.
Others may have other ideas along these lines.
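As a concrete example of the shared-primitive point: a running mean reduces to differences of a cumulative sum, so a single fast kernel for cumulative sums speeds up every caller. A sketch in NumPy, purely illustrative and not caTools' actual implementation:

```python
# The "accelerate a shared primitive" idea: many higher-level routines
# reduce to a handful of kernels. A trailing-window running mean is one
# such primitive; speed up the cumulative sum once and every routine
# built on it benefits.
import numpy as np

def runmean(x, k):
    """Trailing-window running mean of width k over a 1-D array."""
    c = np.cumsum(np.concatenate(([0.0], x)))
    return (c[k:] - c[:-k]) / k
```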



Re: Master's project to coerce linux nvidia drivers to run generalised linear models

Douglas Bates
On 1/23/06, Gabor Grothendieck <[hidden email]> wrote:
> I wonder if it would make more sense to get a relatively
> low level package to run on it so that all packages that
> used that low level package would benefit.  The Matrix
> package and the functions runmean and sum.exact in
> package caTools are some things that come to mind.
> Others may have other ideas along these lines.

Martin and I are delighted to learn that the Matrix package is a
"relatively low-level" package :-)

We were of the opinion that the amount of code and design work that
went into it made it a little more sophisticated than that.

More seriously, the approach to speeding up model fitting that has
been most successful to date is to speed up the BLAS (Basic Linear
Algebra Subroutines), especially the Level-3 BLAS.  The bulk of the
computation in the Matrix package takes place in either LAPACK (for
dense matrices) or CHOLMOD (for sparse matrices) code and those are
based on calls to the Levels 1, 2 and 3 BLAS.  The ATLAS package and
K. Goto's BLAS are designed to obtain the highest level of performance
possible from the CPU on these routines.  I think the easiest way of
incorporating the power of the GPU into the model fitting process
would be to port the BLAS to the GPU.  I also imagine that someone
somewhere has already started on that.
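To illustrate the leverage here: everything above the BLAS calls a small, stable interface, so swapping in a GPU-backed matrix multiply accelerates all the callers without changing any of them. A NumPy sketch, where the gemm wrapper and least_squares_via_gemm are hypothetical names for illustration, not an existing API:

```python
# Why porting the BLAS gives leverage: higher layers call a tiny,
# stable interface. If gemm() below were backed by a GPU, every
# caller would speed up without any change to its own code.
import numpy as np

def gemm(A, B):
    # stand-in for the Level-3 BLAS routine dgemm; a GPU port would
    # replace only this function, never its callers
    return A @ B

def least_squares_via_gemm(X, y):
    # normal equations (X'X) beta = X'y; for tall X the X'X product
    # is the dgemm-shaped bulk of the work
    XtX = gemm(X.T, X)
    Xty = X.T @ y
    return np.linalg.solve(XtX, Xty)
```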



Re: Master's project to coerce linux nvidia drivers to run generalised linear models

Gabor Grothendieck
On 1/24/06, Douglas Bates <[hidden email]> wrote:

> On 1/23/06, Gabor Grothendieck <[hidden email]> wrote:
> > I wonder if it would make more sense to get a relatively
> > low level package to run on it so that all packages that
> > used that low level package would benefit.  The Matrix
> > package and the functions runmean and sum.exact in
> > package caTools are some things that come to mind.
> > Others may have other ideas along these lines.
>
> Martin and I are delighted to learn that the Matrix package is a
> "relatively low-level" package :-)
>
> We were of the opinion that the amount of code and design work that
> went into it made it a little more sophisticated than that.

Low-level refers to a package that is typically used by other routines
rather than directly, although of course it could be used directly too.

This was not intended to be a comment on the breadth, sophistication
or internal complexity of the package.
