Solved

gzip files to different directory

Posted on 2009-04-13
5,294 Views
Last Modified: 2013-12-27
I am running out of space to hold both my cold and hot backups of my database files.
I use three gzip scripts to zip up the database files after the cold backup, to save time.
So basically I do a find on "tool*.dbf" and gzip the results - a find on three different file name patterns, gzipping each set.
What I need to do now is gzip them to a different directory.
gzip -c tools*.dbf >/oradata1/backup/????
I can do one file at a time if I put the name in....
gzip -c tools_01.dbf >/oradata1/backup/tools_01.dbf.gz

This won't work with multiple files.
Is there a way around this?

Thanks
Question by:bkreynolds48
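
For context: shell redirection with gzip -c sends everything to a single output stream, so running it on a glob like tools*.dbf produces one concatenated archive rather than one .gz per file. A minimal sketch of the difference (paths are the ones from the question; the tools_all.dbf.gz name is only an illustration):

# one concatenated gzip stream - not one .gz per file
gzip -c tools*.dbf > /oradata1/backup/tools_all.dbf.gz

# one .gz per file needs one gzip invocation per file
for f in tools*.dbf
do
    gzip -c "$f" > /oradata1/backup/"$f".gz
done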
5 Comments
 
LVL 22

Expert Comment

by:blu
ID: 24130145
I don't know of a way to tell gzip to write to a different output directory.

I am not sure I understand the problem totally. You say you are doing a find on the files. Isn't that finding each file individually anyway? What is the problem with running gzip on each individual file?
 
LVL 1

Author Comment

by:bkreynolds48
ID: 24130190
Blu,
This is one of my gzip scripts:
#
cd /backup/cold
for next_file in $(find . -type f -print | grep -v org)
do
    /bin/gzip -f "$next_file"
done
#
 
I use grep or grep -v to get the files I want.
There are almost 90 database files.
Could I use $next_file like this?
gzip -f -c "$next_file" > /oradata1/backup/"$next_file".gz
Thanks
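
One detail worth noting about that approach: find prints names with a leading ./, so the redirect target becomes something like /oradata1/backup/./tools_01.dbf.gz. That is fine for files sitting directly in /backup/cold, but it would fail for files in subdirectories unless the name is flattened first, e.g. with basename (a sketch, not part of the original script):

# flatten the ./ prefix (and any subdirectory path) before building the target name
gzip -f -c "$next_file" > /oradata1/backup/"$(basename "$next_file")".gz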
 
LVL 1

Author Comment

by:bkreynolds48
ID: 24130398
blu,
I created a test script using the above logic and it seems to work.
Won't know for sure until Sunday when my cold backup runs.
Thanks
 
LVL 22

Accepted Solution

by:
blu earned 500 total points
ID: 24130462
You could do this:

cd /backup/cold
for next_file in $(find . -type f ! -name '*org*')
do
    /bin/gzip -f -c "$next_file" > /oradata1/backup/"$next_file".gz
done

It is too bad that gzip doesn't have an output file flag; then you could do it all in one line.
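
That said, it can still be collapsed into a single command by having find spawn a small shell per file (a sketch, using the same path assumptions as the loop above):

# compress each matching file into /oradata1/backup, one .gz per file
find /backup/cold -type f ! -name '*org*' \
    -exec sh -c 'gzip -f -c "$1" > /oradata1/backup/"$(basename "$1")".gz' sh {} \;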
 
LVL 1

Author Closing Comment

by:bkreynolds48
ID: 31569507
Once again, blu, I really appreciate your help.
