• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 368

Samba Memory Allocation Error

I have a Unix directory (SCO Unix 3.2v4.2) mounted on an NT server (NT4, Service Pack 6a) using Samba (1.9.18p2, January 26th 1998).

It's been running for a few years (since 1998), but suddenly it has started crashing, about twice a day.

When it crashes, we can no longer access the Unix directory from the NT server. A message box on the NT side says [Insufficient system resources exist to complete request].
The log on the Unix side says [Memory Allocation Error: Failed to expand to 17408 bytes].

When this happens, all we do is restart Samba on the Unix box, and it is then happy for a few hours.

It is an old legacy system, so nothing has changed recently (that I can think of). Maybe a few extra users have been added over the years, although I have tried removing the obsolete ones.

NB: I do not want to update to the latest version. The system is being phased out, and it is working pretty well. "If it ain't broke..." I know my current version works, as it has worked for six years!

Any guidance on where to look or things to try? Over to you! Thanks.
Asked by: MortimerCat
2 Solutions
 
Alf666Commented:
If it suddenly started to crash with memory allocation errors, there is a slight chance you encountered a bug in this Samba version.

Most probably, though, one of your users (or a client box) is doing something that the Samba server does not like.

You might want to check for viruses that do weird things. You might also want to check the Samba log files.
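
For example, a quick way to scan the logs for the error (the path here is a guess; Samba 1.9.x usually logs under /usr/local/samba/var, but check the "log file" setting in your smb.conf):

grep -i "allocation" /usr/local/samba/var/log.*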
 
MortimerCatAuthor Commented:
The Samba log files are where I found the message "Memory Allocation Error".

One thing I noticed: today I was having a few problems with Samba, and some of the exported directories had hundreds of files. Once I had a little purge, things seemed to improve. Does Samba need to allocate more memory if a directory is full of files?
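
For the record, this is roughly how I spotted the overfull directories (/u/shared is a made-up path; substitute your own export):

# count files in each directory under the share, biggest first
# (a rough check; assumes directory names without spaces)
find /u/shared -type d | while read d
do
    echo "`ls $d | wc -l` $d"
done | sort -rn | head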

Coincidentally, I am setting up a Linux machine which will have the latest version of Samba. I will probably end up using this to share the directories, which should cure my problem.
 
Alf666Commented:
The "memory allocation error" comes from the realloc function of samba. So, it can be many things.
One thing for sure. Directories with lots of files (500+) are known to be a hassle to the filesystem AND the softwares. That's why many softwares (like squid, among others) hash their data dir content among multiple dirs.

When a Windows client opens a directory, Samba has to list its entire contents, so that could well be the problem.
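
If you want to try the hashing approach by hand, something like this would spread a flat directory into one-character subdirectories (a rough sketch only; /u/shared/data is an invented path, your applications would need to know the new layout, and you should test on a copy first):

cd /u/shared/data
for f in *
do
    [ -f "$f" ] || continue      # skip anything that is not a plain file
    p=`echo "$f" | cut -c1`      # first character of the file name
    [ -d "$p" ] || mkdir "$p"    # create the one-character subdirectory
    mv "$f" "$p/$f"
done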

One thing you might want to try, though, is to check how much RAM your Samba process is allowed to allocate. Get a shell and type:

ulimit -a
and
ulimit -a -H

There is a chance that your Samba, as usage has grown, now claims more memory than the system allows it to use.
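
If the soft limit turns out to be the culprit, you can usually raise it (up to the hard limit) in whatever script starts Samba, before smbd is launched. A sketch, assuming ksh/bash syntax and the usual install path (very old Bourne shells may only understand ulimit -f):

# raise the data-segment limit before starting the daemons
ulimit -d unlimited
/usr/local/samba/bin/smbd -D
/usr/local/samba/bin/nmbd -D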

 
gheistCommented:
Your Samba is very old, so no wonder it is somewhat limited.
Maybe it is simply limited to something like 256 or 127 files, or 32K or 64K of memory, to hold the file list.
Newer Samba will not have this problem, but it may not run on the older Unix... so the problem stays.
 
MortimerCatAuthor Commented:
Today I set up a new machine with the latest versions of Linux, Samba, TCP, NFS, etc.

I have exported directories from this new machine, and these are mounted on both the old Unix box and the NT4 server. As far as my applications are concerned, nothing has changed; the only difference is that the files physically reside on the new Linux machine.
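
Roughly, the new box serves the same tree in two ways (the names and paths below are examples, not my real ones): a Samba share for the NT4 server and an NFS export for the old Unix box.

In smb.conf on the Linux machine:

[shared]
    path = /srv/shared
    read only = no

In /etc/exports:

/srv/shared    scobox(rw,sync)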

It looked pretty good for the first half hour, but then I had to leave for the day. So watch this space...
 
gheistCommented:
Feel free to disable unnecessary network services on your server, and subscribe at least to samba-announce, so you are among the first to know if a particular NT service pack breaks the Samba connection again and you need to update.
The next limit you will hit will be files larger than 2GB, so test that out and do not get caught unprepared.
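
A quick way to run that test before it bites (the mount point is an example; dd itself is standard):

# write a file just over 2GB onto the share and check it survives
dd if=/dev/zero of=/mnt/shared/big.tst bs=1024k count=2200
ls -l /mnt/shared/big.tst
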
Do you need NFS for any hosts? If yes: the makers of Windows give you an NFS client as part of Services for UNIX, so you can run an even smaller set of applications on your network.
 
MortimerCatAuthor Commented:
My system is fully functional again.  

The answer was to set up a PC running the latest versions of Linux, TCP, and Samba, and to use this machine as the file server. My very old machines are then happy with their shared directories once more.

Although the answers did not solve my problem directly, they made me see the light and give up on my old software.

Thanks
 
gheistCommented:
Nobody had a copy of your exact environment, so all the chat was about what can go wrong with older or generic software.
Nice that it helped.
