Solved

very long times to copy files on local disks

Posted on 2010-08-25
686 Views
Last Modified: 2013-12-28
Server SUN M4000.
Disks:
       0. c0t0d0 <SUN146G cyl 14087 alt 2 hd 24 sec 848>
          /pci@0,600000/pci@0/pci@8/pci@0/scsi@1/sd@0,0
       1. c0t1d0 <SUN146G cyl 14087 alt 2 hd 24 sec 848>
          /pci@0,600000/pci@0/pci@8/pci@0/scsi@1/sd@1,0

The disks are mirrored (RAID 1) with SDS (Solstice DiskSuite).

Hi,

we have two servers with Sun ONE Directory Server 5.2 installed.
Both have been in multimaster replication since February 2010.
Every night a binary backup of the DB files runs, normally finishing in an hour or less.
Since 2010-07-10, db2bak (the binary backup of the database files)
has been taking up to 9 hours on both servers!
On Jul 09 and Jul 10 no changes were made to the systems.

The vendor and I have found no solution so far...
Does anyone have ideas or tips?

Thanks,

regards
Robby

Question by:cure4you
4 Comments
 

Author Comment

by:cure4you
ID: 33518721
There are 765 db3 files to back up. The total size of all db3 files (including the changelog) is 40 GB.
The changelogs alone are 24 GB.
The changelog grows by about 70 MB every day.
The biggest changelog is 16,864,568 bytes.
 
LVL 9

Expert Comment

by:fcontrepois
ID: 33520277
How do you "copy" the files?

Try to reproduce the backup transfer and check the results of
iostat -nxz on both sides.
bye
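A minimal sketch of this suggestion: sample the disks while the backup runs, then compare the logs from both servers. The db2bak instance path and backup directory below are placeholders, not the asker's actual paths.

```shell
# Sample extended disk statistics every 5 seconds while the backup runs.
iostat -nxz 5 > /var/tmp/iostat.`hostname`.log &
IOSTAT_PID=$!

# Time the backup itself (the slapd-example instance path is an assumption).
time /var/opt/mps/serverroot/slapd-example/db2bak /backup/`date +%Y%m%d`

kill $IOSTAT_PID
# High %b / long asvc_t on c0t0d0 or c0t1d0 points at the disks;
# mostly idle disks with a long wall-clock time point at ns-slapd/db2bak itself.
```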
 
LVL 2

Expert Comment

by:Mohan Shivaiah
ID: 33547513
/usr/bin/rsync -avz --ignore-existing /path_to_the_db_files user@x.x.x.x:/path_of_the_new_location
x.x.x.x = IP address of the server where you want to take the backup; /path_to_the_db_files is the local directory to copy.
OR
If you want to take the backup on the same server, just replace "user@x.x.x.x:/path_of_the_new_location" with a local path.
 

Accepted Solution

by:
cure4you earned 0 total points
ID: 33660280
The tools that copied the database are db2bak and db2ldif. Both are binaries shipped with Sun Directory Server.

The source of the problem was the very large changelogs (3 files of up to 16 GB each).
In total, 40 GB of changelogs had to be handled by the slapd process while db2bak (copy of the DB) or db2ldif was running.
Normally that is no problem, but with changelogs this large there is still a problem, or a bug, in the sharing of the database files
between ns-slapd and db2bak when both run at the same time.
We rebuilt the changelogs; their total size is now 3 GB, and the problem has gone away.
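For anyone hitting the same symptom, a quick way to spot runaway changelog files before the nightly db2bak runs is to list the largest db3 files under the instance's database directory. The default path below is an assumption; point DB_DIR at your own instance.

```shell
# List the ten largest *.db3 files under the DB directory, biggest first,
# so an oversized changelog stands out.
# The slapd-example path is an assumption - adjust to your installation.
DB_DIR=${DB_DIR:-/var/opt/mps/serverroot/slapd-example/db}
find "$DB_DIR" -type f -name '*.db3' -exec ls -l {} \; 2>/dev/null \
  | sort -k5,5nr \
  | head -10 \
  | awk '{ printf "%10.1f MB  %s\n", $5/1048576, $NF }'
```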

Thanks to all,

regards,
Robby Lehmann

