Wasted space on workstation

Posted on 2004-08-09
Last Modified: 2010-04-11

I have 5 new workstations in my business domain. Like all new computers, they come with 40GB or even 80GB drives. Obviously all of the users' work goes to the central storage server. Because of that, the workstations usually don't use up more than 10GB, so I have about 30GB free on each workstation. 5x30 = 150GB wasted!!
Is there a way to combine all the free space on the workstations into one big virtual volume, or something like it? And is it recommended?
Question by:AlexanderR

Expert Comment

ID: 11755000
I don't know of a way to combine them all, but you can certainly use them by enabling sharing on them and then mapping drives to them from other systems.

I don't recommend it for the following reasons:
1) It's hard to back up. You'd have to buy remote backup agents for each system you store data on.
2) It's difficult to manage. You store all files in a central place for a reason.
3) Systems are unreliable. Nothing prevents users from turning their machines off and cutting off access to the data. In addition, laptop users may roam.

Good Luck.
LVL 11

Author Comment

ID: 11755258
Thanks. I understand.

Does anyone have ideas for how else I can use that space?

Accepted Solution

intreeg earned 200 total points
ID: 11755272
If you are simply speaking of the 5 new machines, it would be a better idea to remove the larger drives and replace them with something more suitable for the workstations' needs. That way you could install the larger drives directly into the central file server. There would be a little added cost to acquire the smaller drives and possibly an IDE RAID controller. However, you may also be able to sell off a couple of the larger drives to offset the cost. I cannot imagine a 10GB or 15GB drive costing that much anymore, considering I saw a 200GB drive for $79.99 the other day. Not the greatest solution either, but it would solve some of the issues given by syn_ack_fin.

LVL 11

Author Comment

ID: 11755357
Thanks, I will try the options out.

Expert Comment

ID: 11755427
Just had a thought:
1. Sell two of the larger drives.
2. Buy 5 smaller drives.
3. Buy 1 IDE RAID controller.
4. Install and load the smaller drives in the workstations.
5. Install and configure the RAID controller and the remaining 3 larger drives as RAID 5 in an older machine that you could probably piece together for next to nothing. Install Linux on it to avoid the requirement of another MS license. You would just need to install/configure Samba, and you would have a VERY inexpensive and fairly large file server.
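The Samba step in the list above can be sketched roughly like this. The workgroup name, share name, and RAID mount point below are placeholder assumptions, not details from the thread; adjust them to your own domain:

```shell
# Minimal sketch of the Samba share for the RAID 5 box. All names and
# paths here are hypothetical placeholders.
write_smb_conf() {
    # $1 = where to write the config (normally /etc/samba/smb.conf)
    cat > "$1" <<'EOF'
[global]
# placeholder workgroup/domain name
workgroup = MYDOMAIN
security = user

[storage]
# mount point of the RAID 5 array
path = /srv/storage
read only = no
EOF
}

# Example (run as root on the Linux box, then restart the smbd service):
#   write_smb_conf /etc/samba/smb.conf
```

The workstations would then map the `storage` share as a network drive, just as they do with the existing central server.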
LVL 11

Author Comment

ID: 11756270
I really like that option.
Just one question: for that older machine, what is the minimum power you recommend for what you have described? I have a 300MHz box with, I think, somewhere below 256MB RAM, as far as old machines go.

The only problem left is whether my boss will agree to do it. I doubt it, but I will try. Sounds good.

Expert Comment

ID: 11756710
That 300 should be fine. Linux is not resource-heavy at all with no X server running, which you obviously wouldn't need. I have a P90 that I am using as a firewall at my house. It runs fine; it just takes a while to load or patch, but that doesn't happen too often, so it works out nicely.
As for your boss, I know how that goes. Just remember: if it will save them money in the long run (and it will), they are more likely to do it. Stick to your guns. Linux is free; how much better can it get?

Expert Comment

ID: 11758530
Where I am, AU$120 can buy a 120GB IDE HDD, so about AU$1 per GB. Buying a 20GB or 10GB drive is hard (and it costs about 2/3 the price of a 40GB HDD), unless you want to use SCSI. You can buy old/cheap/obsolete drives, but they are slower and maybe not supported.

intreeg's suggestion is very interesting, but not so cost-effective :( And even though the 300 will work effectively as a file server, you may have problems with networking and setting up the server. A 300MHz machine may also come with a mainboard that doesn't support 40GB or 80GB drives, unless you pay for a good RAID controller.

I'd rather do nothing (just leave the space there; you'll need it in the future). In a few years no one will need a 20GB HDD (and there's no way you can resell those machines once you're finished with them). Use the space for the swap file and create partitions to back up each machine (so you can restore it in minutes after a crash); that way you won't feel like the space is wasted, and it won't cost you anything extra.
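The backup-partition idea above can be as simple as a `dd`-based image in each direction. The device and image paths in the example call are hypothetical placeholders, not anything from the thread:

```shell
# Sketch of imaging a workstation's system partition into the spare
# space, so it can be restored in minutes after a crash.
backup_partition() {
    # $1 = source (system partition), $2 = destination image file
    dd if="$1" of="$2" bs=4M 2>/dev/null
}

restore_partition() {
    # reverse direction: write the saved image back over the partition
    dd if="$2" of="$1" bs=4M 2>/dev/null
}

# Example (hypothetical device and path, run from a rescue environment):
#   backup_partition /dev/hda1 /backup/ws1-system.img
#   restore_partition /dev/hda1 /backup/ws1-system.img
```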

Just my thoughts. It's up to you anyway, and good luck.

Expert Comment

ID: 11763919
I agree with most of what NetExpert said, except having trouble with networking and setting up the server. I am not sure what is meant by this. Linux is not as scary as most people think. It is rather simple, actually, and the documentation cannot be beat. As for the RAID controller, yes, you could end up spending quite a bit if you wanted to. Or you could buy two of them for around US$20 and have a spare in case the first one breaks.
As for the drives you would be installing in the PCs, as stated before it's about US$1/GB, so 5 x 20GB = US$100, plus US$40 for 2 IDE RAID controllers, comes to US$140. Throw in some shipping cost and you are still under US$175. That's probably one of the cheapest file servers I have seen.

As for the idea of using a second partition to store a backup image: why not keep the images in a central storage area and push the image across the network for reinstallation? That method has worked for me for years. It also avoids giving the end user too much space to fill with mp3z or other garbage we let them put on their hard drives; otherwise your backup image becomes huge and takes more time and space than one central core system image. Inform the end users that any data not stored on their network drive will be lost in the case of a crash, then simply run cleaning scripts on your file server to erase *.mp3 etc., or whatever administration method you choose. As long as these administration policies are followed and only applications deemed necessary for work are installed, the desktops should not need more than 20GB unless you are producing digital video or doing other storage-heavy work. For example, you can run Win2k or XP on 4GB with Office installed; beyond a few job-specific applications, that should be about all a streamlined core installation requires for typical business functionality.
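The cleaning-script idea above can be a simple `find` one-liner on the file server. The share path and the extension list here are assumptions; extend them to whatever your policy forbids:

```shell
# Sketch of a cleaning script for the file server. The share path in the
# example call and the extension list are hypothetical placeholders.
clean_share() {
    # delete stray media files anywhere under the given share directory
    find "$1" -type f \( -name '*.mp3' -o -name '*.avi' \) -delete
}

# Example (hypothetical path), e.g. run nightly from cron:
#   clean_share /srv/storage/users
```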

I do not believe that NetExpert's suggestion is a bad one by any means. However, I do not see it as a solution to your question, and it would not provide a centralized point of administration.

Question has a verified solution.