
Error: memory exhausted (limit reached?)


Guillaume MULLER
Hi,

I first posted this message on r-help, but got redirected here.

I am encountering a strange memory error, and I'd like some help determining whether I'm doing something wrong or whether there's a bug in recent R versions.

I'm currently working on a DeepNet project at home, on an old PC with 4 GB of RAM running Ubuntu 16.04.

For efficiency reasons, I preprocessed my dataset and stored it as a CSV file with write.csv() so that I can reload it at will with read.csv(). I did this several times, and everything worked fine.
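The round-trip looks roughly like this (a minimal sketch; the data frame and file name are illustrative, not my actual script):

```r
# Illustrative round-trip: write a preprocessed data frame to CSV, then
# reload it. Passing colClasses to read.csv() skips type re-guessing,
# which also reduces peak memory use on large files.
trainSet <- data.frame(x = runif(10), y = sample(0:1, 10, replace = TRUE))
csvPath <- tempfile(fileext = ".csv")
write.csv(trainSet, csvPath, row.names = FALSE)
reloaded <- read.csv(csvPath, colClasses = c(x = "numeric", y = "integer"))
stopifnot(identical(dim(reloaded), dim(trainSet)))
```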

A few days ago, I wanted to pursue my work on another machine at work: a more recent and more powerful machine with 8 GB of RAM, running Ubuntu 16.10. But I ran into a strange error:

$ R
16:05:12 R > trainSet <- read.csv("trainSetWhole.csv")
Error: memory exhausted (limit reached?)
Error: C stack usage  7970548 is too close to the limit


I read a few forums on the Internet and found a potential workaround, consisting of increasing the stack size with ulimit. Unfortunately, it doesn't work for me:

$ ulimit -s
8192
$ ulimit -s $((100*$(ulimit -s)))
$ R --vanilla
16:05:12 R > trainSet <- read.csv("trainSetWhole.csv")
Error: memory exhausted (limit reached?)
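For what it's worth, R reports the C stack limit it detected at startup via Cstack_info(); if the ulimit change is not inherited by the R process, this value stays the same (a diagnostic sketch, not part of my original transcript):

```r
# Cstack_info() returns the stack limit R detected at startup, the current
# usage, the growth direction, and the evaluation depth. If raising ulimit
# had taken effect, the "size" entry would reflect the new limit.
info <- Cstack_info()
print(info)
stopifnot(all(c("size", "current", "eval_depth") %in% names(info)))
```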


This was under Ubuntu 16.10 with R version 3.3.1 (2016-06-21), "Bug in Your Hair".

Yesterday, I upgraded Ubuntu to 17.04 (R version 3.3.2 (2016-10-31), "Sincere Pumpkin Patch") and tried again. I got exactly the same error.

How is it possible that a 513 MB file cannot be read on a machine with 8 GB of RAM?
And how can a machine with twice the RAM of the previous one fail to load the same file?

Since the only other difference is the Ubuntu version (and thus the R version), I suspect a bug in R's CSV loader, but I don't know where to look...

If anyone has an idea, I would be glad to hear it.


GM
--------

For the sake of completeness, I share my training set at the following link:
> https://mega.nz/#!ZMs0TSRJ!47DCZCnE6_FnICUp8MVS2R9eY_GdVIyGZ5O9TiejHfc

FYI, it loads perfectly on two machines with Ubuntu 16.04 and R version 3.2.3 (2015-12-10) -- "Wooden Christmas-Tree". But it exhausts the stack under Ubuntu 16.10's R version 3.3.1 (2016-06-21) -- "Bug in Your Hair" and Ubuntu 17.04's R version 3.3.2 (2016-10-31) -- "Sincere Pumpkin Patch".

______________________________________________
[hidden email] mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel

Re: Error: memory exhausted (limit reached?)

Tomas Kalibera
Hi Guillaume,

The error "C stack usage is too close to the limit" is usually caused by
infinite recursion.
It can also be caused by an external library that corrupts the C stack
(such as Java on Linux, e.g. when using rJava).

I cannot reproduce the problem on my machine.

To rule out the second option, you can try in a fresh R session without
loading any packages (check with sessionInfo()).
To diagnose the first option, one would need to know whether and where the
infinite recursion happens, which can be found with gdb on a machine
where the problem can be reproduced.
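A quick sketch of the clean-session check (run after starting `R --vanilla`):

```r
# In a session started with `R --vanilla`, sessionInfo() should list no
# attached non-base packages; its otherPkgs component is NULL in that case.
si <- sessionInfo()
print(si$otherPkgs)  # NULL when nothing beyond the base packages is attached
stopifnot(is.null(si$otherPkgs))
```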

Best
Tomas


On 03/15/2017 12:01 PM, Guillaume MULLER wrote:

> [original message quoted in full; trimmed]
