megaframe has asked for the wisdom of the Perl Monks concerning the following question:
I have some C code that malloc's a bunch of memory. Using SWIG I created some simple Perl bindings and call out to this function from Perl. I also have a function in the C code that frees the memory.
When I compile and run just the C code under valgrind, there are no issues: everything gets allocated and free'd. But when I use it from Perl and call the destructor, it never actually releases the memory.
To test this I wrote up a simple test case of what I was doing.
C code, lib.h
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <ctype.h>
#include <errno.h>

typedef struct {
    int index;
    double value;
} x_space;
C code, lib.c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <ctype.h>
#include <errno.h>
#include "lib.h"

#define Malloc(type,n) (type *)malloc((n)*sizeof(type))

x_space* new_space() {
    /* Allocate room for 1000 x_space entries. */
    x_space *space = Malloc(x_space, 1000);
    return space;
}

void destroy_space(x_space *space) {
    free(space);
}
SWIG binding, lib.i
/* lib.i */
%module libspace
%{
#include "lib.h"
extern x_space* new_space();
extern void destroy_space(x_space* space);
%}

extern x_space* new_space();
extern void destroy_space(x_space* space);
Perl code
#!/usr/bin/perl
use libspace;

my $space = libspace::new_space();
libspace::destroy_space($space);
exit;
Compiling/Running
swig -perl lib.i
g++ -Wall -Wconversion -fPIC -c -I/usr/lib/perl/5.14.2/CORE lib.c lib_wrap.c
g++ -shared lib.o lib_wrap.o -o libspace.so
perl -I`pwd` test.pl
Is there something else I should be doing to get the .so to release this memory?
UPDATE: So it seems the free is working, but it's releasing the memory back to the Perl process and not to the system. I'm consuming some 40G of memory in the tool I was using, in threads. The safest solution was to fork() and exit when the operation was complete. The only other option would have been to reuse the memory in another thread. If I'm wrong and it should go back to the system, let me know.
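For illustration, a minimal C sketch of that fork-and-exit pattern, using the new_space()/destroy_space() functions above (in my case the fork happens on the Perl side, and the actual data processing here is a placeholder). The idea is the same either way: when the child exits, the OS reclaims all of its pages regardless of what the allocator kept cached.

#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <sys/types.h>
#include <sys/wait.h>
#include "lib.h"

extern x_space* new_space();
extern void destroy_space(x_space* space);

int main(void) {
    pid_t pid = fork();
    if (pid < 0) {
        perror("fork");
        return 1;
    }
    if (pid == 0) {
        /* Child: do all the large allocations here. */
        x_space *space = new_space();
        /* ... load and process the data (placeholder) ... */
        destroy_space(space);
        /* Exiting the child returns every page to the system,
           no matter what free() left cached in the allocator. */
        _exit(0);
    }
    /* Parent: wait for the child; its own footprint stays small. */
    waitpid(pid, NULL, 0);
    return 0;
}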
UPDATE 2: Found my issue. The original code I'm wrapping uses a single malloc for a giant data series (multiple gigabytes). I can't do that, since I don't know the size of the data until I'm done loading it, so I used a linked list to dynamically allocate the space as the data is fed in from Perl. These mallocs are small, so glibc doesn't release them back to the system even when several gigabytes have been free'd. To fix the issue I added "#include <malloc.h>" to the header and a malloc_trim(0) call to the destroy function.
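For reference, a sketch of what that fix looks like in the simple test case's destroy function (the include is shown directly in lib.c here for brevity; note that malloc.h and malloc_trim() are glibc extensions, so this only helps on glibc systems):

#include <stdlib.h>
#include <malloc.h>   /* malloc_trim() is a glibc extension */
#include "lib.h"

void destroy_space(x_space *space) {
    free(space);
    /* Ask glibc to return free'd heap pages to the kernel;
       the argument is how many bytes of slack to keep. */
    malloc_trim(0);
}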