
I am trying to extract the various attributes of a large ArcGIS grid file similar to this forestry map

Even using a smaller, cropped version of the file the operation is slow, and, more importantly, it requires a temp file many GB in size that eventually exhausts my hard drive, causing the operation to fail. I have approximately 35 GB of free space on this drive.

library(raster)

foo <- raster("grid/w001001.adf")
allLayers <- deratify(foo)

With a cropped version of the above file it is possible to extract a single attribute layer, but this still requires a multi-GB temp file even though the cropped ArcGIS grid directory is only ~160 MB. Specifying a filename in the function call doesn't seem to reduce the amount of hard drive space used.

allLayers <- deratify(fooCropped, att="BA_GE_3")
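For what it's worth, the raster package lets you control where and how aggressively it spills to disk. Redirecting its temp directory to a larger drive might at least stop the failures; a sketch, with a placeholder path:

```r
library(raster)

# Sketch: send raster's temp files to a drive with more room, and raise
# the in-memory threshold so smaller rasters never hit the disk at all.
# The path is a placeholder; maxmemory is in cells/bytes in raster 2.x.
rasterOptions(tmpdir    = "/Volumes/BigDisk/raster_tmp",  # placeholder
              maxmemory = 1e9)

# Temp files accumulate over a session; this clears the ones raster
# no longer needs (h = age in hours; 0 removes everything unused).
removeTmpFiles(h = 0)
```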

I would like to extract several layers and then do pixel-by-pixel calculations using those attributes. Is there a way to extract the attribute table as a data frame, do the calculations on that, and re-associate it with the raster?
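In case it helps others reading this: a sketch of that data-frame round trip, assuming the raster package. The second attribute name (`TPH_GE_3`) and the derived column (`myIndex`) are hypothetical; only `BA_GE_3` comes from the grid above. `raster::subs()` substitutes cell values through a lookup table, which may avoid deratifying all the other attributes just to get one derived layer:

```r
library(raster)

foo <- raster("grid/w001001.adf")

# The raster attribute table (RAT) is just a data frame with one row
# per category -- tiny compared with the grid itself.
rat <- levels(foo)[[1]]

# Do the per-category arithmetic on the table. "TPH_GE_3" and
# "myIndex" are hypothetical names for illustration.
rat$myIndex <- rat$BA_GE_3 / rat$TPH_GE_3

# Option 1: re-associate the table and deratify only the new column.
levels(foo) <- rat
idx <- deratify(foo, att = "myIndex")

# Option 2: substitute cell IDs through a two-column lookup table,
# producing one numeric layer directly.
idx2 <- subs(foo, rat[, c("ID", "myIndex")])
```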

System information:

> R.Version()
$platform
[1] "x86_64-apple-darwin15.3.0"
$arch
[1] "x86_64"
$os
[1] "darwin15.3.0"
$system
[1] "x86_64, darwin15.3.0"
...

‘raster’ version 2.5-2 
R version 3.2.4
  • A smaller file of similar construction would have been appreciated. That's 900+ MB and the Forest Service has a slow server. Commented Apr 14, 2016 at 20:46
  • I haven't tested it myself, but the file at lemma.forestry.oregonstate.edu/data/species-maps is described as being the same format. The size is clearly part of the issue I'm having, hence posting the link to the file above. Commented Apr 14, 2016 at 20:53
  •
    deratify will make a layer for each attribute. How many attributes are there? Commented Apr 14, 2016 at 22:53
  •
    There are 180 layers; however, I run into disk-space issues trying to deratify even a single attribute. Does that make a layer for each attribute in the background? Commented Apr 14, 2016 at 22:58
  • I tried to reproduce this using a more manageable ArcGIS grid with R on Windows. I attached ProcMon to rsession.exe and didn't see any temporary files created by it. Are the temp files being created by R, by raster, or by OSX? Commented Apr 23, 2016 at 22:41

1 Answer


I run calculations on stream temperature data, which often involves millions of records. Whenever I run into memory issues, I incorporate the rm() and gc() functions into the loops where I'm processing data. You might use them the same way when processing your cropped files: rm() removes objects you no longer need from the R environment, and gc() returns that memory to your system.
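A minimal sketch of that pattern; the directory, file pattern, attribute name, and output naming are all placeholders:

```r
library(raster)

# Placeholder inputs: a directory of cropped grids and one attribute.
files <- list.files("cropped", pattern = "w001001\\.adf$",
                    recursive = TRUE, full.names = TRUE)

for (f in files) {
  r     <- raster(f)
  layer <- deratify(r, att = "BA_GE_3")
  writeRaster(layer, filename = paste0(f, "_BA_GE_3.grd"))
  rm(r, layer)  # drop the references...
  gc()          # ...and hand the freed memory back to the system
}
```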

If you have a multicore system, you can work this into a loop to batch-process all your cropped files, cleaning up after each worker completes its deratify() call. Without more of your code it's hard to suggest a specific implementation.
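On a multicore machine that loop might look something like the following, using the parallel package that ships with R; every file name and attribute here is a placeholder. Note that mclapply forks, so it runs in parallel on OSX/Linux but serially on Windows:

```r
library(parallel)
library(raster)

files <- list.files("cropped", pattern = "w001001\\.adf$",
                    recursive = TRUE, full.names = TRUE)

# One forked worker per file (up to mc.cores); each worker's memory is
# released when it exits, so per-iteration cleanup is automatic.
results <- mclapply(files, function(f) {
  layer <- deratify(raster(f), att = "BA_GE_3")
  out   <- paste0(f, "_BA_GE_3.grd")
  writeRaster(layer, filename = out, overwrite = TRUE)
  out  # return the path of the written layer
}, mc.cores = max(1, detectCores() - 1))
```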


1 Comment

Thanks for the answer, but I'm not presently using any loops; the examples provided are all the code I'm trying to run. The issue is not RAM, which rm() and gc() can help with, but the size of the temp file created by the raster package.
