segfault when trying to allocate a large vector

segfault when trying to allocate a large vector

pbruneau
Dear R contributors,

I'm running into trouble when trying to allocate a large (but in
theory viable) vector in C code bound to R through .Call(). Here is
some sample code summarizing the problem:

#include <R.h>
#include <Rinternals.h>
#include <string.h>

SEXP test(void) {
    int size = 10000000;
    double largevec[size];                      /* 10^7 doubles, about 80 MB */
    memset(largevec, 0, size * sizeof(double));
    return R_NilValue;
}

If size is small enough (up to 10^6), everything is fine. When it
reaches 10^7 as above, I get a segfault. As far as I know, a double
value is represented with 8 bytes, which would make largevec above
approximately 80 MB. This is certainly large for a single variable, but
it should remain well below the limits of my machine. Also, doing a
calloc for the same vector size leads to the same outcome.

In my package, I would use large vectors that cannot be assumed to be
sparse, so sparse-matrix utilities are not an option.

I run R on 64-bit Ubuntu with 8 GB of RAM and a 64-bit R build (3.1.2).
As my problem looks close to the one described in
http://r.789695.n4.nabble.com/allocMatrix-limits-td864864.html,
and following what I have seen in ?"Memory-limits", I checked that
ulimit -v returns "unlimited".

I guess I must be missing something, such as a contiguity issue. Does
anyone have a clue for me?

Thanks in advance,
Pierrick

Re: segfault when trying to allocate a large vector

Karl Millar
Hi Pierrick,

You're storing largevec on the stack, which is probably causing a stack
overflow. Allocate largevec on the heap with malloc or one of the R memory
allocation routines instead and it should work fine.
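
A minimal sketch of the malloc route (the function name test_heap is
illustrative, not something from this thread):

#include <R.h>
#include <Rinternals.h>
#include <stdlib.h>
#include <string.h>

/* Sketch: the same zero-filled buffer, but allocated on the heap rather
   than as an automatic (stack) array. */
SEXP test_heap(void) {
    size_t size = 10000000;
    double *largevec = (double *) malloc(size * sizeof(double));
    if (largevec == NULL)
        Rf_error("allocation of largevec failed");
    memset(largevec, 0, size * sizeof(double));
    /* ... work with largevec ... */
    free(largevec);    /* malloc'd memory must be freed on every exit path */
    return R_NilValue;
}

With plain malloc the buffer has to be freed on every exit path; memory
obtained from R_alloc, by contrast, is reclaimed automatically when the
.Call() returns.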

Karl


Re: segfault when trying to allocate a large vector

pbruneau
Hi Karl,

I thought I had also tried to allocate on the heap; I just tried again,
and everything went fine (even up to 10^9 cells). I guess everything's
OK then ^^
Thanks for your help!

Pierrick
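
A minimal sketch of the R-managed variant mentioned above (the function
name test_ralloc is illustrative; 10^9 doubles is roughly 8 GB, so this
needs enough memory available):

#include <R.h>
#include <Rinternals.h>
#include <string.h>

/* Sketch: R_alloc takes its memory from R's transient storage.  It signals
   an R error itself if the allocation fails, and the memory is released
   automatically when the .Call() returns. */
SEXP test_ralloc(void) {
    size_t size = 1000000000;     /* 10^9 doubles, about 8 GB */
    double *largevec = (double *) R_alloc(size, sizeof(double));
    memset(largevec, 0, size * sizeof(double));
    /* ... work with largevec ... */
    return R_NilValue;
}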
