How to avoid the memory limit in R


I'm trying to replace values in a matrix ("t" -> 1, "f" -> 0), but I keep getting error messages:

Error: cannot allocate vector of size 2.0 Mb ... Reached total allocation of 16345Mb: see help(memory.size)

I'm using a Windows 7 computer with 16 GB of memory, running the 64-bit version of R in RStudio.

What I'm running:

a <- matrix(dataset, nrow = nrow(dataset), ncol = ncol(dataset), byrow = TRUE)
memory.size()
a[a == "t"] <- 1

where dataset is a data frame of (about) 525000x300 size. The memory.size() line tells me I'm using less than 4 GB, and memory.limit() reports 16 GB. Why does the replacement line require so much memory to execute? Is there a way to do the replacement without hitting the memory limit (and are there any tips on avoiding this in general), and if so, is it going to cost me a lot of time to run? I'm still pretty new to R, so I don't know whether it makes a difference depending on the data class I use or how R allocates memory...

When you call the line

a[a=="t"] <- 1 

R has to create a whole new boolean matrix to index a. If a is huge, that boolean matrix will be huge too.
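As a rough sketch of why that temporary index is so costly (the numbers are back-of-the-envelope, and m below is a made-up toy matrix, not your data):

# The logical index a == "t" is a full matrix of the same shape as a.
# R stores each logical value in 4 bytes, so for a 525000 x 300 matrix
# the index alone is roughly 525000 * 300 * 4 bytes, i.e. about 630 MB,
# before counting any intermediate copies made during the assignment.
m <- matrix(sample(c("t", "f"), 1000 * 100, replace = TRUE), nrow = 1000)
idx <- (m == "t")        # full logical matrix, same dimensions as m
print(object.size(idx))  # ~400 KB here; it scales linearly with the data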

Maybe you can try working on smaller sections of the matrix instead of trying to do it all in one shot:

for (i in 1:ncol(a)) {
  ix <- (a[, i] == "t")  # logical index for one column at a time
  a[ix, i] <- 1
}

It's not fast or elegant, but it might get around the memory problem.
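One more thing worth knowing (this variant is my own sketch, not part of the answer above): assigning 1 into a character matrix stores the string "1", not the number. If you ultimately want numeric 0/1 values, you can fill a pre-allocated integer matrix column by column, which also keeps only one column's logical index alive at a time:

# Sketch: build a 0/1 integer matrix b from the character matrix a.
b <- matrix(0L, nrow = nrow(a), ncol = ncol(a))
for (i in 1:ncol(a)) {
  b[, i] <- as.integer(a[, i] == "t")  # TRUE -> 1L, FALSE -> 0L
}

An integer matrix also needs only 4 bytes per element, versus 8-byte string pointers per element in a character matrix, so the result takes roughly half the memory of the original.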

