Error: cons memory exhausted (limit reached?): Memory Management?


R help mailing list-2
Hi all, I think this is only the second time that I have posted, so I apologise if my etiquette isn't quite correct.
I'm loading a large (~30 GB) GeoJSON file into R using readOGR on an HPC. I am also loading a small shapefile, and then trying to undertake some processing on the large GeoJSON using gBuffer from the rgeos package.

I believe that the HPC is running Red Hat Enterprise Linux 7.4, and it has around 750 GB of memory free for user jobs; I have allocated the full amount of RAM to the job.
I previously used the following modules to undertake this task, and it ran successfully, although only after tweaking the settings that I detail below; otherwise I got the same error:

module load proj/5.0.0
module load gdal/2.3.1
module load geos/3.6.2
module load gcc/6.4.0
module load R/3.5.2
module load python  # python 3 by default
module load numpy/1.14.0 # requires module load python

Settings at the Linux command line that previously allowed a successful run:
R_MAX_VSIZE=720G
R_GC_MEM_GROW=0

--min-nsize=50000k --min-vsize=12M --max-ppsize=500000 (passed when executing the R script from the command line)
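For context, a minimal sketch of how the environment variables and command-line flags above might be combined in a single invocation. This is not the actual job script from the post; the script name `my_buffer_script.R` is a placeholder:

```shell
# Environment variables read by R at startup (values from the previously successful run)
export R_MAX_VSIZE=720G    # upper limit on the vector heap
export R_GC_MEM_GROW=0     # most conservative heap-growth strategy (0..3)

# Startup flags raising the minimum cons-cell pool (nsize), minimum vector
# heap (vsize), and pointer-protection stack (ppsize); script name is a placeholder
CMD="Rscript --min-nsize=50000k --min-vsize=12M --max-ppsize=500000 my_buffer_script.R"
echo "$CMD"
```

The `echo` stands in for actually launching R; on the cluster, the `Rscript` line would run inside the batch job after the `module load` commands.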

However, the modules have now been updated on the HPC, and so I am now using:
module load proj/6.1.1
module load R/3.6.2
(other modules remain the same)

I get the following error while processing (loading the file into R is fine), with gcinfo(TRUE) turned on:

Garbage collection 144 = 86+22+36 (level 0) ... 
288541.3 Mbytes of cons cells used (66%)
55320.1 Mbytes of vectors used (98%)
Garbage collection 145 = 86+23+36 (level 1) ... 
66679.3 Mbytes of cons cells used (15%)
56447.7 Mbytes of vectors used (100%)
Garbage collection 146 = 86+23+37 (level 2) ... 
39852.3 Mbytes of cons cells used (11%)
49032.4 Mbytes of vectors used (72%)
Garbage collection 147 = 87+23+37 (level 0) ... 
124935.0 Mbytes of cons cells used (36%)
64961.0 Mbytes of vectors used (95%)
Garbage collection 148 = 87+24+37 (level 1) ... 
985162418403226.2 Mbytes of cons cells used (-2147483648%)
35274.8 Mbytes of vectors used (52%)

Error: cons memory exhausted (limit reached?)
In addition: Warning message:
Garbage collection 149 = 88+24+37 (level 0) ... 
985162418403226.2 Mbytes of cons cells used (-2147483648%)
35274.8 Mbytes of vectors used (52%)
Lost warning messages
Execution halted
Garbage collection 150 = 89+24+37 (level 0) ... 
985162418403226.2 Mbytes of cons cells used (-2147483648%)
35274.8 Mbytes of vectors used (52%)

Error: cons memory exhausted (limit reached?)

The job halts with Memory Utilized: 411.29 GB

I cannot understand why the job worked previously (just) but now does not, when seemingly the only changes are an updated PROJ version and an updated R version (3.5.2 to 3.6.2).

Might anyone have any suggestions as to why this is the case? And/or how to alter the memory management so that memory is not exhausted so easily?

Many thanks, Chris


______________________________________________
[hidden email] mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.

Re: Error: cons memory exhausted (limit reached?): Memory Management?

Bert Gunter-2
You might do better posting this on r-sig-geo -- I believe you will more
likely find the expertise you seek there. Could be wrong of course.


Bert Gunter

"The trouble with having an open mind is that people keep coming along and
sticking things into it."
-- Opus (aka Berkeley Breathed in his "Bloom County" comic strip )


On Tue, Jan 19, 2021 at 3:01 PM Christopher Lloyd via R-help <
[hidden email]> wrote:



Re: Error: cons memory exhausted (limit reached?): Memory Management?

R help mailing list-2
Cheers Bert, will do. Best wishes, Chris
On Wednesday, 20 January 2021, 00:35:26 GMT, Bert Gunter <[hidden email]> wrote:
 