I am reading from a NetCDF file with 97 arrays of size (1165,720,1440), reading each array using ncdf_varget. Because of memory-allocation problems, I started using delvar to erase the arrays after I finished using them, ultimately creating 19 new arrays of the same size from the original arrays (by adding some of them together). Yet I am still getting the error "Unable to allocate memory: to make array. Cannot allocate memory." When I test delvar and try to print one of the deleted arrays, it no longer exists, so delvar does appear to be deleting them. This is IDL version 8.3 on a Linux platform.
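In case it helps, the read-and-release pattern I'm using looks roughly like this (file and variable names here are placeholders, not the actual ones):

```idl
; Sketch of the pattern (file/variable names are placeholders)
ncid = NCDF_OPEN('bigfile.nc')                  ; open the large NetCDF file once
NCDF_VARGET, ncid, NCDF_VARID(ncid, 'var1'), a  ; read one (1165,720,1440) array
NCDF_VARGET, ncid, NCDF_VARID(ncid, 'var2'), b
total1 = a + b                                  ; one of the 19 combined arrays
DELVAR, a, b                                    ; release the inputs (DELVAR only works at the main program level)
```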
I calculate that each array is about 1.1 GB (assuming they are byte arrays), so the 19 arrays total over 20 GB. How much RAM does your machine have? When you look at a performance monitor, does the machine start to thrash (push things in RAM out to disk)? What does the monitor show in terms of memory usage (does it go down at all when the delvars are executed)? At what point do you see the memory problem: after the 19th array?
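For reference, the per-array arithmetic (note the size roughly quadruples if they are FLOAT rather than BYTE arrays):

```idl
; Per-array size for dimensions (1165, 720, 1440)
n = 1165LL * 720 * 1440                     ; 1,207,872,000 elements per array
PRINT, 'bytarr: ', n     / 2.0^30, ' GiB'   ; about 1.1 GiB per array
PRINT, 'fltarr: ', 4 * n / 2.0^30, ' GiB'   ; about 4.5 GiB per array, ~85 GiB for 19
```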
Note: I use gnome-system-monitor on some of my Linux machines for performance monitoring and troubleshooting.
Doug - it actually occurs after only the third of the 19 arrays is computed, which is what is so confounding. I did previously declare each of the 19 arrays using fltarr, and I did not get an error at that point. It looks like my total RAM is 132125536 kB, with 403164 kB free.
Maybe NetCDF is still holding on to memory. Does the PRO code call NCDF_CLOSE after it's done with each file? What do "help, /memory" and "memory()" show as the code loops through the data? Hopefully that will help determine whether it's NetCDF or IDL holding on to the memory.
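For example, something along these lines printed at each step of the loop (just a sketch of the instrumentation, not your actual code):

```idl
; Sketch: log heap statistics at each step of the loop
HELP, /MEMORY                                       ; heap used/max plus gets/frees counts
PRINT, 'current heap:    ', MEMORY(/CURRENT)   / 2.0^30, ' GiB'
PRINT, 'high-water mark: ', MEMORY(/HIGHWATER) / 2.0^30, ' GiB'
```

If the current-heap number drops after each delvar but the process size in the system monitor does not, that points at memory being held below IDL's heap accounting.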
I am only dealing with one very large NetCDF file, with over 100 variables in it, so I only close it once at the very end. After I open the NetCDF file and call ncdf_varid on 90 of the variables, I get:
heap memory used: 1512507, max: 140114814761, gets: 1511, frees: 640
Then I declare fltarr for 19 arrays of size (1165, 720, 1440) and I get:
heap memory used: 91799787243, max: 91799787243, gets: 1530, frees: 640
Then I ncdf_varget lon, lat, time, and two of the variables, sum the two into one of my newly created arrays, and get:
heap memory used: 101462790563, max: 106294280195, gets: 1542, frees: 6
I then use delvar to delete the two arrays read in with ncdf_varget that were used to create the new array. Next I ncdf_varget 10 other variables, sum those 10 into another of the newly created arrays, delvar all 10 of them, and get:
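Written as a loop, that accumulation step looks roughly like this (variable names are placeholders; my actual code uses 10 separately named variables rather than a loop):

```idl
; Sketch of summing several ncdf_varget reads into one array (names are placeholders)
newarr = FLTARR(1165, 720, 1440)
names  = ['v1', 'v2', 'v3']                  ; stand-ins for the 10 variable names
FOR i = 0, N_ELEMENTS(names) - 1 DO BEGIN
  NCDF_VARGET, ncid, NCDF_VARID(ncid, names[i]), tmp
  newarr = newarr + tmp                      ; tmp is reused, so only one extra array is alive at a time
ENDFOR
DELVAR, tmp                                  ; DELVAR works at the main program level only
```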
heap memory used: 149777672003, max: 154609161635, gets: 1572, frees: 6
Then I ncdf_varget three additional variables, which is when I get the "Unable to allocate memory: to make array." error.
It seems like my memory usage keeps growing, even after using delvar.
As a workaround, I have created 19 separate IDL scripts that write 19 separate NetCDF files, each containing one of the final 19 rebinned arrays. Then another IDL script reads those in and creates one total file. While I am now able to create an ASCII output file, writing the NetCDF file gives an error at the line NCDF_CONTROL, newcdf, /ENDEF (switching modes) when more than a certain number of variables are being written: I get NC_ERROR = -62. The error does not occur for the first variables written, only when I get to the third. Any help on this one?
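The write step in each script looks roughly like this (file, dimension, and variable names are placeholders; the dimension list in NCDF_VARDEF is ordered to match the IDL array's dimensions, fastest-varying first):

```idl
; Sketch of the NetCDF write step (file/dimension/variable names are placeholders)
newcdf = NCDF_CREATE('total.nc', /CLOBBER)   ; classic format by default
d0 = NCDF_DIMDEF(newcdf, 'dim0', 1165)
d1 = NCDF_DIMDEF(newcdf, 'dim1', 720)
d2 = NCDF_DIMDEF(newcdf, 'dim2', 1440)
vid = NCDF_VARDEF(newcdf, 'total1', [d0, d1, d2], /FLOAT)
NCDF_CONTROL, newcdf, /ENDEF                 ; this is the line that raises NC_ERROR = -62
NCDF_VARPUT, newcdf, vid, total1
NCDF_CLOSE, newcdf
```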