Memory in R.

Memory in R.

R RR
Hi,

I am pretty new to R.
In fact, I started using R last month,
so please be indulgent if my questions seem
too basic or obvious.

I have 2 basic questions:

1 - I want to increase the memory available to R, but I can't find
the right way. E.g. in Stata, you just type "set memory 1g".
When I try to load huge datasets, R crashes.

2 - I want to clear the memory (leaving no objects or data), like the
"clear" command in Stata.

Best regards.

Amadou DIALLO.
PhD student.
Cerdi, University of Auvergne
France.


Re: Memory in R.

jholtman
On 3/31/08, R RR <[hidden email]> wrote:

> Hi,
>
> I am pretty new in R.
> In fact, I started using R last month.
> So, be indulgent if my questions seem
> too boring or obvious.
>
> I have 2 basic questions:
>
> 1 - I want to increase memory, but I can't find
> the right way. E.g: in stata, you just type "set memory 1g".
> When I try to load huge datasets, R crashes.

PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.

What exactly are you doing?  How much memory do you have on your
system?  What is "huge"?  All your R objects have to fit in physical
memory; you don't want to page.  So some more information is needed.
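
A minimal sketch of the usual first checks, for reference (memory.size() and
memory.limit() are available on Windows builds of R only, and the 2000 MB
figure below is purely illustrative):

memory.size()              # memory currently in use, in MB (Windows only)
memory.limit()             # current memory limit, in MB (Windows only)
memory.limit(size = 2000)  # raise the limit to about 2 GB (Windows only)
object.size(mydata)        # size of one object; 'mydata' is a placeholder name
gc()                       # run the garbage collector and report memory usage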
>
> 2 - I want to clear the memory (no object or data), such as the
> "clear" command in stata.

?rm
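
For a Stata-style "clear", a minimal sketch assuming everything in the
workspace should go:

rm(list = ls())   # remove every object in the global environment
gc()              # then trigger a garbage collection to free the memory they used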

>
> Best regards.
>
> Amadou DIALLO.
> PhD student.
> Cerdi, University of Auvergne
> France.


--
Jim Holtman
Cincinnati, OH
+1 513 646 9390

What is the problem you are trying to solve?


Re: Memory in R.

Liviu Andronic
In reply to this post by R RR
Hello Amadou,

On Mon, Mar 31, 2008 at 4:53 PM, R RR <[hidden email]> wrote:
>  1 - I want to increase memory, but I can't find
>  the right way. E.g: in stata, you just type "set memory 1g".
>  When I try to load huge datasets, R crashes.

You will find this recent thread [1] interesting. You may also want to
look at the packages filehash, ff and sqldf.
Regards,
Liviu

[1] http://www.nabble.com/How-to-read-HUGE-data-sets--tt15729830.html
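
As an illustration of the sqldf route, a hedged sketch (the file name
"hugedata.csv" and the column "year" are made up for the example; the file is
read into a temporary SQLite database, so only the selected rows reach R):

library(sqldf)
# keep only the rows actually needed, instead of loading the whole file into R
dat <- read.csv.sql("hugedata.csv",
                    sql = "select * from file where year = 2007")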


Re: Memory in R.

R RR
Dear R users,
Many thanks for your answers.
I've made much progress since my last posting.

I now have the following problem. I've run this GAM model:

mygam <- gam(Y ~ factor(year)
             + m1.q02 + m1.q05y + m1.q05y2 + m1.q06 + m4b.q05 + m4b.q052 + m5a.q01
             + depratio + depratio2 + residence10y + urbrur + factor(prefect)
             + m1.q02_ps
             + m1.q05y_ps
             + m1.q05y2_ps
             + m1.q06_ps
             + m4b.q05_ps
             + m4b.q052_ps
             + m5a.q01_ps
             + depratio_ps
             + depratio2_ps
             + residence10y_ps
             + urbrur_ps
             + factor(hhid), data = cwp2)

and obtained the following error message:

Erreur : impossible d'allouer un vecteur de taille 236.3 Mo
(in English: cannot allocate a vector of size 236.3 MB).

I have 7237 observations in my data.

Is there any way to increase the memory to fit the model?

Best regards.

Amadou DIALLO



2008/4/1, Liviu Andronic <[hidden email]>:

> Hello Amadou,
>
> On Mon, Mar 31, 2008 at 4:53 PM, R RR <[hidden email]> wrote:
> >  1 - I want to increase memory, but I can't find
> >  the right way. E.g: in stata, you just type "set memory 1g".
> >  When I try to load huge datasets, R crashes.
>
> You will find this recent thread [1] interesting. You'd also want to
> check packages filehash, ff and sqldf.
> Regards,
> Liviu
>
> [1] http://www.nabble.com/How-to-read-HUGE-data-sets--tt15729830.html
>


Re: Memory in R.

Gavin Simpson
On Wed, 2008-04-02 at 16:46 +0200, R RR wrote:

> Dear R users,
> Many thanks for your answers.
> I've made much progress since my last posting.
>
> I now have the following problem. I've run this GAM model:
>
> mygam <- gam(Y ~ factor(year)
> + m1.q02 + m1.q05y + m1.q05y2 + m1.q06 + m4b.q05 + m4b.q052 + m5a.q01
> + depratio + depratio2 + residence10y + urbrur + factor(prefect)
> + m1.q02_ps
> + m1.q05y_ps
> + m1.q05y2_ps
> + m1.q06_ps
> + m4b.q05_ps
> + m4b.q052_ps
> + m5a.q01_ps
> + depratio_ps
> + depratio2_ps
> + residence10y_ps
> + urbrur_ps
>                   +factor(hhid), data=cwp2)
>
> and obtained the following error code:

No idea if this will help, but seeing as there don't appear to be any
smooth terms in that model (s() or lo(), depending on whether this is
mgcv:::gam or gam:::gam), and you are fitting a Gaussian model (no
family argument, so the default is used), why not fit it with lm()? You
might find that takes less memory than the way gam() builds the model
matrix.
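
A sketch of that suggestion, reusing the formula from the gam() call above
(untested, and assuming the default Gaussian family really is what was
intended):

mylm <- lm(Y ~ factor(year)
           + m1.q02 + m1.q05y + m1.q05y2 + m1.q06 + m4b.q05 + m4b.q052 + m5a.q01
           + depratio + depratio2 + residence10y + urbrur + factor(prefect)
           + m1.q02_ps + m1.q05y_ps + m1.q05y2_ps + m1.q06_ps + m4b.q05_ps
           + m4b.q052_ps + m5a.q01_ps + depratio_ps + depratio2_ps
           + residence10y_ps + urbrur_ps + factor(hhid),
           data = cwp2)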

HTH

G

>
> Erreur : impossible d'allouer un vecteur de taille 236.3 Mo
> (in English: cannot allocate a vector of size 236.3 MB).
>
> I have 7237 observations in my data.
>
> Is there any way to increase the memory to fit the model?
>
> Best regards.
>
> Amadou DIALLO
>
>
>
> 2008/4/1, Liviu Andronic <[hidden email]>:
> > Hello Amadou,
> >
> > On Mon, Mar 31, 2008 at 4:53 PM, R RR <[hidden email]> wrote:
> > >  1 - I want to increase memory, but I can't find
> > >  the right way. E.g: in stata, you just type "set memory 1g".
> > >  When I try to load huge datasets, R crashes.
> >
> > You will find this recent thread [1] interesting. You'd also want to
> > check packages filehash, ff and sqldf.
> > Regards,
> > Liviu
> >
> > [1] http://www.nabble.com/How-to-read-HUGE-data-sets--tt15729830.html
> >
--
%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%
 Dr. Gavin Simpson             [t] +44 (0)20 7679 0522
 ECRC, UCL Geography,          [f] +44 (0)20 7679 0565
 Pearson Building,             [e] gavin.simpsonATNOSPAMucl.ac.uk
 Gower Street, London          [w] http://www.ucl.ac.uk/~ucfagls/
 UK. WC1E 6BT.                 [w] http://www.freshwaters.org.uk
%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%
