Why does gc() not free memory?

I run simulations on a 64-bit Windows computer with 64 GB RAM. Memory use reaches 55%, and after a simulation run finishes I remove all objects in the workspace with rm(list=ls()), followed by a double gc().
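The cleanup step described above, written out as an R sketch:

```r
rm(list = ls())  # remove all objects from the global environment
gc(); gc()       # run the garbage collector twice
```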


I supposed that this would free enough memory for the next simulation run, but in fact memory usage drops by just 1%. Consulting many different forums, I could not find a satisfactory explanation, only vague comments such as:


"Depending on your operating system, the freed up memory might not be returned to the operating system, but kept in the process space."


I'd like to find information on:


  • 1) on which operating systems, and under which conditions, freed memory is not returned to the OS, and
  • 2) whether there is any remedy other than closing R and restarting it for the next simulation run?

2 Answers



How do you check memory usage? Normally the virtual machine allocates a chunk of memory that it uses to store its data. Some of that allocation may be unused and marked as free. What GC does is discover data that is no longer referenced from anywhere and mark the corresponding chunks of memory as unused; this does not mean that this memory is released to the OS. Still, from the VM's perspective there is now more free memory that can be used for further computation.


As others have asked: did you experience out-of-memory errors? If not, then there's nothing to worry about.


EDIT: This and this should be enough to understand how memory allocation and garbage collection work in R.


From the first document:


Occasionally an attempt is made to release unused pages back to the operating system. When pages are released, a number of free nodes equal to R_MaxKeepFrac times the number of allocated nodes for each class is retained. Pages not needed to meet this requirement are released. An attempt to release pages is made every R_PageReleaseFreq level 1 or level 2 collections.




To see used memory, try running gc() with verbose set to TRUE:
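For example, with a 10,000,000-element integer vector in memory (the variable name x is illustrative):

```r
x <- integer(10000000)  # ~40 MB of vector cells
gc(verbose = TRUE)
```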



Here's the result with an array of 10,000,000 integers in memory:


Garbage collection 9 = 1+0+8 (level 2) ... 
10.7 Mbytes of cons cells used (49%)
40.6 Mbytes of vectors used (72%)
          used (Mb) gc trigger (Mb) max used (Mb)
Ncells  198838 10.7     407500 21.8   350000 18.7
Vcells 5311050 40.6    7421749 56.7  5311504 40.6

And here's after discarding reference to it:
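Assuming the vector was bound to a name such as x, discard the reference and collect again:

```r
rm(x)            # drop the only reference to the vector
gc(verbose = TRUE)
```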


Garbage collection 10 = 1+0+9 (level 2) ... 
10.7 Mbytes of cons cells used (49%)
2.4 Mbytes of vectors used (5%)
         used (Mb) gc trigger (Mb) max used (Mb)
Ncells 198821 10.7     407500 21.8   350000 18.7
Vcells 310987  2.4    5937399 45.3  5311504 40.6

As you can see, the memory used by Vcells fell from 40.6 Mb to 2.4 Mb.




The R garbage collector is imperfect in the following (not so) subtle way: it does not move objects (i.e., it does not compact memory) because of the way it interacts with C libraries. (Some other languages/implementations suffer from this too, but others, despite also having to interact with C, manage to have a compacting generational GC which does not suffer from this problem).


This means that if you take turns allocating small chunks of memory which are then discarded and larger chunks for more permanent objects (this is a common situation when doing string/regexp processing), then your memory becomes fragmented and the garbage collector can do nothing about it: the memory is released, but cannot be re-used because the free chunks are too short.
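An illustrative sketch of such an allocation pattern (all names and sizes are made up for demonstration): many short-lived small objects interleaved with longer-lived larger ones. Because R's GC does not compact the heap, the small chunks freed on each iteration can leave holes that later, larger allocations cannot reuse.

```r
keep <- list()
for (i in 1:1000) {
  # short-lived small strings, discarded on the next iteration
  tmp <- replicate(100, paste(sample(letters, 50, replace = TRUE), collapse = ""))
  # longer-lived larger vector, kept for the rest of the session
  keep[[i]] <- numeric(10000)
}
rm(tmp)
gc()  # reports the freed memory, but cannot compact the fragmented heap
```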


The only way to fix the problem is to save the objects you want, restart R, and reload the objects.
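A sketch of that workflow with base R functions (the object and file names are illustrative):

```r
# In the old session: save only the objects you want to keep
save(results, params, file = "keep.RData")
# or save the entire workspace:
save.image(file = "everything.RData")

# ...quit and restart R...

# In the fresh session: reload the saved objects
load("keep.RData")
```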


Since you are doing rm(list=ls()), i.e., you do not need any objects, you do not need to save and reload anything, so, in your case, the solution is precisely what you want to avoid - restarting R.


PS. Garbage collection is a highly non-trivial topic. E.g., Ruby used 5 (!) different GC algorithms over 20 years. Java GC does not suck because Sun/Oracle and IBM spent many man-years on their respective implementations of the GC. On the other hand, R and Python have lousy GC - because no one bothered to invest the necessary man-years - and they are quite popular. That's worse-is-better for you.



