I am encountering an issue where memory is not being released back to the OS, even after clearing large in-memory tables and running .Q.gc[]. Despite multiple attempts, the process continues to consume a large amount of RAM.
The process handles real-time data ingestion and maintains large in-memory tables throughout the day. At the end of the day, data is partitioned on disk, and all in-memory tables are cleared using:
`tableName set 0#tableName / Empty the table
Garbage collection is then triggered:
.Q.gc[]
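For context, the cleanup amounts to something like the sketch below, which empties every global table and then requests garbage collection. This is an illustrative reconstruction rather than the actual codebase; cleanupEOD is a hypothetical name and it assumes all tables live in the root namespace.

cleanupEOD:{[]
  {x set 0#value x} each tables[];   / empty every global table, keeping its schema
  .Q.gc[]                            / request garbage collection; returns bytes freed
  }

Note that 0# preserves each table's schema; delete from `tableName is an equivalent way to clear a single table in place.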
However, heap memory remains significantly larger than used memory.
Memory Stats After Cleanup (.Q.w[] Output):
used| 550678528
heap| 6174015488
peak| 219915747328
wmax| 0
mmap| 0
mphy| 1622702747648
syms| 390966606
symw| 126656286272
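For readability (not part of the original output), a small helper along these lines converts the .Q.w[] counters to GB and derives the two figures discussed below, the heap-to-used gap and the average bytes per interned symbol; memReport is a hypothetical name.

memReport:{[]
  w:.Q.w[];
  gb:{x%2 xexp 30};                  / bytes to GB (binary)
  `usedGB`heapGB`gapGB`avgSymBytes!(gb w`used;gb w`heap;gb w[`heap]-w`used;w[`symw]%w`syms)
  }

On the figures above this gives roughly 0.51GB used, 5.75GB heap, a 5.24GB gap, and about 324 bytes per symbol.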
Observations
used ≈ 525MB but heap ≈ 5.75GB, indicating significant fragmentation: most of the heap is free space that has not been returned to the OS.
Peak memory usage reached 204GB at some point.
symw = 118GB; symbol memory alone accounts for a large portion of the process's RAM.
.Q.gc[] runs successfully but does not free memory back to the OS.
Symbol memory (symw) is high, but symbols cannot be garbage collected without a restart.
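To narrow down where the symbol growth comes from, something like the sketch below reports, for a given table, which columns are symbol-typed and how many distinct values each holds; symColStats is a hypothetical helper, and columns with very high distinct counts are the usual candidates for storage as strings rather than symbols.

symColStats:{[tbl]
  c:exec c from meta tbl where t="s";              / symbol-typed columns
  c!{[tb;col]count distinct tb col}[tbl;] each c   / distinct symbol count per column
  }
tables[]!{symColStats value x} each tables[]       / hypothetical usage: check every in-memory table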
Are there any techniques I can try to reduce memory fragmentation?
Is there a way to force KDB+ to release memory back to the OS without restarting?
Thank you in advance for your response.
Reply:
syms and symw and the creation of so many unique symbol values should be investigated. 126656286272 % 390966606 ≈ 324, i.e. an average width of roughly 324 bytes per interned symbol, suggesting long symbols are being generated. On disk, what is the size/count of the sym file? Does it have a similar issue?
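As a starting point for the reply's last question, the on-disk enumeration domain can be inspected directly. A minimal sketch, assuming the HDB sym file lives at /data/hdb/sym (substitute the real path):

symPath:`:/data/hdb/sym                            / assumed location of the sym file
s:get symPath                                      / load the interned symbol list from disk
(count s;hcount symPath;(hcount symPath)%count s)  / symbol count, file size in bytes, avg bytes per symbol

Comparing the on-disk average width with the in-memory symw % syms figure shows whether the long symbols are already persisted or only generated intraday.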