
Ubuntu Packages with all Dependencies

We have an Ubuntu Server (Xenial) that will never touch the internet, and we can't query the Ubuntu repo from it via another Ubuntu Server that has internet access. How can we get the packages we need, with all of their dependencies?

David Favor, Fractional CTO
Distinguished Expert 2019
Commented:
Somehow you'll have to transfer the package files, which will require some transfer mechanism.

Maybe by connecting a disk to a net-connected machine, then connecting the disk to your Xenial machine.

If I were tasked with this... I'd do the following.

1) On the Xenial machine, set up a local repository (see the sketch at the end of this comment).

2) Plumb APT to pull from #1.

3) Then set up a mirror repository on some net-connected machine, which matches #1.

4) Then keep #3 updated.

5) At some frequency, use your available transfer mechanism to copy files from #3 to #1.

6) Then use your normal APT commands to install updates.

Note: Using this approach you'll have all packages, which will provide all dependencies.

Note: If your Xenial machine is short on disk space, then point APT to the disk you connect as your transfer mechanism.
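
A minimal sketch of steps 1, 2 and 6, assuming the .deb files land in /srv/local-repo (the path, the placeholder package name, and the [trusted=yes] shortcut are my assumptions, not part of the steps above; a properly signed repo is cleaner):

    # On the Xenial machine: turn a directory of .deb files into a local repo.
    # dpkg-scanpackages comes from the dpkg-dev package; if installing that
    # offline is a problem, generate Packages.gz on the net-connected mirror
    # and copy it across along with the .deb files.
    cd /srv/local-repo
    dpkg-scanpackages . /dev/null | gzip -9c > Packages.gz

    # Step 2: point APT at the local repo.
    echo 'deb [trusted=yes] file:/srv/local-repo ./' | \
      sudo tee /etc/apt/sources.list.d/local-repo.list

    # Step 6: the usual commands now resolve against the local repo.
    sudo apt-get update
    sudo apt-get install <package>

For step 3, tools such as apt-mirror or debmirror are commonly used on the net-connected side to keep a local copy of the Ubuntu archive.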
ManieyaK_CSSP

Author

Commented:

Can i use Windows to pull down the packages?

Gerwin Jansen, EE MVE, Topic Advisor
Most Valuable Expert 2016

Commented:
You can use Windows to create the offline repository that you can update from. Here's the how-to:

https://help.ubuntu.com/community/AptGet/Offline/Repository
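
The linked page covers the details. As a rough sketch of the general idea (the package name and these exact commands are my assumption, not taken from that page): on an Ubuntu machine whose apt sources match the offline server, have apt print the URLs it would fetch, then download them anywhere, Windows included.

    # List the URIs apt would download for a package (placeholder: nginx),
    # without installing anything. Note: apt only lists what is missing or
    # upgradeable on the machine you run this on.
    apt-get install --print-uris -qq nginx | awk -F"'" '{print $2}' > uris.txt

    # uris.txt can be fed to any downloader, e.g. wget here, or a tool on Windows:
    wget --input-file=uris.txt --directory-prefix=debs/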
Dr. Klahn, Principal Software Engineer

Commented:
David has pointed at the heart of the problem.  Debian (and Ubuntu, which derives from Debian) resolves all dependencies when updating a package.  Sometimes this can result in an amazingly large cascade.  For example, updating one of the "libxxxxxx" libraries might require updating GCC, which would in turn call for updating many more "lib" libraries, which would cascade even further into forcing the update of every utility that uses one of those libraries, and finally result in rebuilding grub and other low-level boot tools.  Updating a single 48K library could well cascade into updating over 100 packages, with consequent downloads of over 200 MB.

So the method of "download a single package install kit" just isn't practical.  Either the system must be allowed access to the internet, or a local package repository must be built.
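
If you want to see how big that cascade actually is before committing to anything, apt can show you without touching the system. A minimal sketch (the package name is just a placeholder):

    # Dry run: list everything apt would install or upgrade for one package.
    apt-get install --simulate nginx

    # Or dump the recursive dependency closure directly:
    apt-cache depends --recurse --no-recommends --no-suggests \
      --no-conflicts --no-breaks --no-replaces --no-enhances nginx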

Now here is a thing.  If a local package repository is going to be put on the LAN, on another machine, by implication that machine has access to the internet.  "As one, so all" -- if that is the situation, then you might as well give the Ubuntu machine access to the internet to do its updates.  Permanently or periodically, doesn't matter; it will make things easier.

Here is another thing.  Looking at the update descriptions, most of them are there to fix things such as buffer overruns, i.e., security issues.  If the machine does not have internet access, then security issues are not so great a concern and, so long as the machine is running correctly, the need for updates is greatly reduced.  Unless your company has a policy that all updates must be applied, perhaps it is best just to let sleeping dogs lie.
David Favor, Fractional CTO
Distinguished Expert 2019

Commented:
Dr. Klahn provided a good explanation of why setting up a full repository is useful.

As Dr. Klahn stated, one simple library update can trigger updating hundreds of dependencies, so very rarely can you just download one package file, install it, and be done.
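
For illustration, here is the difference on a net-connected Ubuntu machine (the package name is just a placeholder):

    # Grabs only the single named .deb, none of its dependencies:
    apt-get download nginx

    # Fetches the package plus whatever dependencies apt considers missing,
    # into /var/cache/apt/archives/, without installing anything.
    # Caveat: "missing" is judged against THIS machine, which may not match
    # the offline server - another reason a full mirror is safer.
    sudo apt-get install --download-only nginx
    ls /var/cache/apt/archives/*.deb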

Another simple approach is to connect your machine to the net, then use a firewall to drop all connections.

Then only allow connections during the windows when you update your packages.
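
One way to sketch that with ufw (assuming ufw is the firewall in play; the exact rules and timing are up to you):

    # Default stance: nothing in, nothing out.
    sudo ufw default deny incoming
    sudo ufw default deny outgoing
    sudo ufw enable

    # During a maintenance window, open outbound traffic, update, then close it again.
    sudo ufw default allow outgoing
    sudo apt-get update && sudo apt-get upgrade
    sudo ufw default deny outgoing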