Error: cannot allocate vector of size 3.4 Gb

12 messages

Error: cannot allocate vector of size 3.4 Gb

Peng Yu
I run R on a Linux machine that has 8 GB of memory, but R gives me the
error "Error: cannot allocate vector of size 3.4 Gb". I'm wondering why it
cannot allocate 3.4 Gb on an 8 GB machine. How can I fix the problem?


Re: Error: cannot allocate vector of size 3.4 Gb

Sharpie
On Fri, Nov 6, 2009 at 1:30 PM, Peng Yu <[hidden email]> wrote:
> I run R on a linux machine that has 8GB memory. But R gives me an
> error "Error: cannot allocate vector of size 3.4 Gb". I'm wondering
> why it can not allocate 3.4 Gb on a 8GB memory machine. How to fix the
> problem?

Is it 32-bit R or 64-bit R?

Are you running any other programs besides R?

How far into your data processing does the error occur?

The more statements you execute, the more "fragmented" R's available
memory pool becomes.  A 3.4 Gb chunk may no longer be available.
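
A minimal sketch, using only base R, of how you can see what the current
session has already consumed before attempting the large allocation (the
object-size loop assumes your workspace is not empty):

  gc()  # force a garbage collection and report Ncells/Vcells usage

  # sizes, in bytes, of the largest objects in the global workspace
  sizes <- sapply(ls(), function(x) as.numeric(object.size(get(x))))
  head(sort(sizes, decreasing = TRUE))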


-Charlie

Charlie Sharpsteen
Undergraduate-- Environmental Resources Engineering
Humboldt State University

Re: Error: cannot allocate vector of size 3.4 Gb

Peng Yu
On Fri, Nov 6, 2009 at 3:39 PM, Charlie Sharpsteen <[hidden email]> wrote:

> Is it 32-bit R or 64-bit R?
>
> Are you running any other programs besides R?
>
> How far into your data processing does the error occur?
>
> The more statements you execute, the more "fragmented" R's available
> memory pool becomes.  A 3.4 Gb chunk may no longer be available.

I'm pretty sure it is 64-bit R, but I need to double-check. What command
should I use to check?

It seems that it didn't do anything but read a lot of files before the
above error showed up.


Re: Error: cannot allocate vector of size 3.4 Gb

Marc Schwartz
On Nov 6, 2009, at 4:19 PM, Peng Yu wrote:

> I'm pretty sure it is 64-bit R. But I need to double check. What
> command I should use to check?
>
> It seems that it didn't do anything but just read a lot of files
> before it showed up the above errors.


Check the output of:

  .Machine$sizeof.pointer

If it is 4, R was built as 32-bit; if it is 8, R was built as 64-bit.
See ?.Machine for more information.

You can also check:

  R.version$arch

and

  .Platform$r_arch

which for 64 bit should show x86_64.
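
For convenience, a minimal sketch that prints both checks at once (the
expected values for a 64-bit build are noted in the comments):

  cat("Pointer size (bytes):", .Machine$sizeof.pointer, "\n") # 8 = 64-bit, 4 = 32-bit
  cat("Architecture:", R.version$arch, "\n")                  # "x86_64" for 64-bit x86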

HTH,

Marc Schwartz


Re: Error: cannot allocate vector of size 3.4 Gb

Peng Yu
On Fri, Nov 6, 2009 at 5:00 PM, Marc Schwartz <[hidden email]> wrote:

> Check the output of:
>
>  .Machine$sizeof.pointer
>
> If it is 4, R was built as 32 bit, if it is 8, R was built as 64 bit.  See
> ?.Machine for more information.

It is 8. The code that gives the error is listed below. There are 70 CEL
files. I'm wondering how to investigate what causes the problem and how to
fix it.

library(oligo)
cel_files <- list.celfiles('.', full.names = TRUE, recursive = TRUE)
data <- read.celfiles(cel_files)


Re: Error: cannot allocate vector of size 3.4 Gb

Benilton Carvalho
This is converging to BioC.

Let me know what your sessionInfo() is and what type of CEL files you're
trying to read; additionally, describe exactly how you reproduce the
problem.

It appears to me (I'm not sure) that you start a fresh session of R and
then try to read in the data. How much memory do you have available when
you try reading in the data? Having 8 GB of RAM does not mean that you
have 8 GB free when you attempt the task.
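
A minimal sketch of how you might check that on a Linux box right before
calling read.celfiles() (this assumes the standard 'free' utility and
/proc/meminfo are present, as on most Linux systems):

  system("free -m")                  # total / used / free RAM, in MB
  readLines("/proc/meminfo", n = 4)  # includes MemTotal and MemFree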

b


Re: Error: cannot allocate vector of size 3.4 Gb

Benilton Carvalho
Oh, and I forgot to say the following:

If you're reading in 70 SNP 6.0 files, this is the math for memory usage
(each array is a 2560 x 2560 grid of intensities stored as 8-byte doubles):

70 * (2560^2) * 8 bytes = 70 * (2560^2) / (2^27) GB = 3.4 GB

The error message tells you that you don't have 3.4 GB of free
**contiguous** RAM.
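
The same arithmetic spelled out in R (assuming the 2560 x 2560 layout and
8-byte doubles above):

  n_arrays <- 70
  cells    <- 2560^2                 # intensities per array
  bytes    <- n_arrays * cells * 8   # 8 bytes per double
  bytes / 2^30                       # ~3.42, the "3.4 Gb" in the error message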

b


Re: Error: cannot allocate vector of size 3.4 Gb

Peng Yu
On Fri, Nov 6, 2009 at 8:19 PM, Benilton Carvalho <[hidden email]> wrote:
> this is converging to bioc.
>
> let me know what your sessionInfo() is and what type of CEL files you're
> trying to read, additionally provide exactly how you reproduce the problem.


Here is my sessionInfo(). pname is 'moex10stv1cdf'.

> for (f in list.celfiles('.',full.names=T,recursive=T)) {
+   print(f)
+   pname=cleancdfname(whatcdf(f))
+   print(pname)
+ }


> sessionInfo()
R version 2.9.2 (2009-08-24)
x86_64-unknown-linux-gnu

locale:
LC_CTYPE=en_US.UTF-8;LC_NUMERIC=C;LC_TIME=en_US.UTF-8;LC_COLLATE=en_US.UTF-8;LC_MONETARY=C;LC_MESSAGES=en_US.UTF-8;LC_PAPER=en_US.UTF-8;LC_NAME=C;LC_ADDRESS=C;LC_TELEPHONE=C;LC_MEASUREMENT=en_US.UTF-8;LC_IDENTIFICATION=C

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base

other attached packages:
[1] pd.moex.1.0.st.v1_2.4.1 RSQLite_0.7-2           DBI_0.2-4
[4] oligo_1.8.3             preprocessCore_1.6.0    oligoClasses_1.6.0
[7] Biobase_2.4.1

loaded via a namespace (and not attached):
[1] affxparser_1.16.0 affyio_1.12.0     Biostrings_2.12.9 IRanges_1.2.3
[5] splines_2.9.2



Re: Error: cannot allocate vector of size 3.4 Gb

Benilton Carvalho
You haven't answered how much memory you have available when you try
reading in the data.

With the mouse exon chip, the math is the same as I mentioned before.

With 8 GB, you should be able to read in 70 samples of this chip. If you
can't, it's because you don't have enough free memory when you try to
read.

best,

b


Re: Error: cannot allocate vector of size 3.4 Gb

Peng Yu
Most of the 8 GB was available when I ran the code, because R was the only
computational session running.


Re: Error: cannot allocate vector of size 3.4 Gb

Benilton Carvalho
OK, I'll take a look at this and get back to you during the week. b


Re: Error: cannot allocate vector of size 3.4 Gb

Benilton Carvalho
Hi Peng,

The major problem in your specific case is that when creating the final
object, we need to set dimnames() appropriately. This triggers a copy of
the object, and that is where you get the error you describe.
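
A rough sketch of why that copy is fatal here, reusing the numbers from
earlier in the thread (70 arrays of 2560 x 2560 doubles):

  data_gb <- 70 * 2560^2 * 8 / 2^30   # ~3.4 GB for the intensity matrix itself
  peak_gb <- 2 * data_gb              # the dimnames() copy doubles it: ~6.8 GB peak
  peak_gb                             # uncomfortably close to the machine's 8 GB
  # on smaller objects, tracemem() can show you when R makes such copies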

With the current release, unfortunately, there isn't much to be done
(unless you're willing to add more memory). For the next release, I have
plans to address issues like this and reduce the memory footprint when
processing data with the oligo package. It's nothing that can be done
right now, as it takes some time, but I expect everything to be ready for
the next release.

If you're trying to run RMA on your data, I can think of ways of  
working around this problem.

Cheers,

b
