We have a software company developing a client for us. The first version was server-based and caused a major issue that forced us to roll back. We then asked for a client-side version, but for obvious reasons we are a little nervous about releasing it and want to make sure enough testing takes place.
The software has been developed in .NET. One of our queries is about the amount of memory used: it can be anywhere up to 125 MB (more than anything else on the client), even when the client is not being used. I understand that this is to speed up the launching of the app, but my concern is that the memory usage locks at a varying amount each time the application goes into an idle state and stays at that level until the app is relaunched. Is this normal? We asked the developer and the response was:
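For reference (my own sketch, not anything from the vendor), one way to sanity-check this is to distinguish the managed heap from the process working set. The working set is what Task Manager reports, and it can stay high after idle even when the managed heap has been collected, so a high number there does not by itself indicate a leak:

```csharp
using System;
using System.Diagnostics;

class MemoryCheck
{
    static void Main()
    {
        // Managed heap in use, after forcing a full collection
        // (passing true also waits for pending finalizers).
        long managedBytes = GC.GetTotalMemory(true);

        // Working set: physical memory the OS currently assigns to
        // the process. This is what Task Manager shows, and the CLR
        // may hold on to it for performance even after a collection.
        long workingSet = Process.GetCurrentProcess().WorkingSet64;

        Console.WriteLine($"Managed heap: {managedBytes / 1024 / 1024} MB");
        Console.WriteLine($"Working set:  {workingSet / 1024 / 1024} MB");
    }
}
```

If the managed heap keeps growing across idle/relaunch cycles while the app does the same work, that would be worth raising with the developer; a stable heap under a large working set is usually just the runtime's caching behaviour.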
the reason that the memory usage is a little random when the IC viewer is closed is because we use a .NET method to dispose of the memory rather than explicitly disposing of objects when they are no longer required. This method will check for objects that are not being used at specific points and they will be disposed of accordingly.
Disposing of objects individually is more resource-intensive, and so re-coding the IC to operate in this way will more than likely have a negative effect on performance.
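For context on what (I believe) the developer is describing, this is my own illustration rather than their code: in .NET the garbage collector reclaims managed memory automatically at times of its choosing, which is why the reported figure varies from run to run. Explicit disposal via `IDisposable`/`using` exists for deterministic release of unmanaged resources (file handles, sockets), not for shrinking overall memory usage:

```csharp
using System;
using System.IO;

class DisposeExample
{
    static void Main()
    {
        // "using" calls Dispose deterministically when the block ends,
        // releasing the underlying resource immediately. The memory for
        // the object itself is still reclaimed later by the GC.
        using (var stream = new MemoryStream())
        using (var writer = new StreamWriter(stream))
        {
            writer.WriteLine("hello");
        }

        // Plain managed objects need no Dispose at all; the GC collects
        // them whenever it next runs, so memory does not drop the moment
        // an object stops being used.
        var buffer = new byte[1024 * 1024];
        Console.WriteLine(buffer.Length);
    }
}
```

So "disposing of objects individually" would not necessarily reduce the memory figure you see anyway; that figure is governed mostly by when the GC decides to collect and how much memory it chooses to keep reserved.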
I appreciate this may be much ado about nothing due to my lack of understanding, but I just want to make sure this version doesn't have any memory leaks or similar issues that could cause us more problems. Is there anything specific I should ask them, or anything I should be looking for?