I've written an application that uses the OpenSSL library: a server program and a client, both speaking TLS. I've found that the server leaks several kilobytes of memory every time a client connects, and it never frees that memory when the client disconnects. I noticed this by watching the process with "top" on GNU/Linux. Here is a summary of what happens inside the server program from the moment a client connects until it disconnects:
BIO_do_accept(acc) (Connection stage)
client = BIO_pop(acc)
ssl = SSL_new(ctx)
SSL_set_bio(ssl, client, client)
SSL_read(ssl, ...) / SSL_write(ssl, ...) (Processing data)
SSL_shutdown(ssl) (Disconnection stage)
acc is a BIO object, whereas ctx is an SSL_CTX object. Both of these objects are freed only when the server program shuts down, via SSL_CTX_free(ctx) and BIO_free(acc). My server program follows the guidelines from the book "Network Security with OpenSSL". Ironically, the example in the book also has a memory leak.
I used both valgrind 2.2 and valgrind 2.4 to debug my server program. Valgrind 2.2 crashes, but before doing so it reports several blocks of leaked memory allocated by CRYPTO_malloc. Valgrind 2.4 completes successfully but complains about one big block of leaked memory allocated by pthread_create. The valgrind 2.4 result is very strange, since no data pointer is passed to the thread. I also ran the book's example under valgrind; it likewise shows several blocks leaked from CRYPTO_malloc.
Can anybody tell me what causes the memory leak? Am I missing an important step, or an OpenSSL function that frees up this memory?