[R] How to solve the error "cannot allocate vector of size 1.1 Gb"


Kum-Hoe Hwang
Hi, Gurus

Thanks to your help, I have managed to start using the text mining
package "tm" in R on Windows XP.

However, while running the tm package I ran into another problem, this
time with memory.

What is the best way to solve this memory problem: increasing physical
RAM, or some other recipe?

###############################
###### my R Script's Outputs ######
###############################

> memory.limit(size = 2000)
NULL
> corpus.ko <- Corpus(DirSource("test_konews/"),
+  readerControl = list(reader = readPlain,
+  language = "UTF-8", load = FALSE))
> corpus.ko.nowhite <- tmMap(corpus.ko, stripWhitespace)
> corpus <- tmMap(corpus.ko.nowhite, tmTolower)
> tdm <- TermDocMatrix(corpus)
>  findAssocs(tdm, "city", 0.97)
Error: cannot allocate vector of size 1.1 Gb
-------------------------------------------------------------
>
################################
Thanks for your precious time,

--
Kum-Hoe Hwang, Ph.D.

Phone : 82-31-250-3516
Email : [hidden email]

______________________________________________
[hidden email] mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.

Re: [R] How to solve the error "cannot allocate vector of size 1.1 Gb"

Uwe Ligges


Kum-Hoe Hwang wrote:
> Hi, Gurus
>
> Thanks to your help, I have managed to start using the text mining
> package "tm" in R on Windows XP.
>
> However, while running the tm package I ran into another problem, this
> time with memory.
>
> What is the best way to solve this memory problem: increasing physical
> RAM, or some other recipe?


How can we know? We do not know anything about your problem or its
size. Maybe not even 64 GB would be sufficient, or maybe it is simplest
to just use a bigger machine with 16 GB...
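
The first step, though, is to measure rather than guess. A minimal
diagnostic sketch, assuming 32-bit R on Windows as in your session
(memory.limit() and memory.size() are Windows-only; tdm is the
term-document matrix from your script):

memory.limit()                         # current allocation cap, in MB
memory.size(max = TRUE)                # peak memory R has claimed so far, in MB
gc()                                   # force a garbage collection, report usage
print(object.size(tdm), units = "Mb")  # size of the matrix object itself

Posting those numbers, together with the number of documents and terms
in the matrix, would make the problem concrete.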

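If the allocation fails inside findAssocs(), one recipe short of new
hardware is to shrink the matrix before correlating, since computing
associations across all terms can force a very large dense object. A
sketch using removeSparseTerms() from tm; the 0.99 sparsity threshold
is an arbitrary assumption you would have to tune:

tdm.small <- removeSparseTerms(tdm, sparse = 0.99)  # drop terms absent from >99% of documents
findAssocs(tdm.small, "city", 0.97)                 # far fewer terms left to correlate
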
Uwe Ligges


