Solved

sqlldr - Generate "BAD" files from several "INFILE" data files

Posted on 2014-12-02
331 Views
Last Modified: 2014-12-10
Hi experts, I have the following control file:
LOAD DATA
INFILE '/xahome/xa21/CAP/INPUTS_GIS/Transfer_data.20141202020202'
INFILE '/xahome/xa21/CAP/INPUTS_GIS/Transfer_data.20141203030303'
APPEND
INTO TABLE "INT_OPERATIONS"
FIELDS TERMINATED BY '!'
OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
...COLUMNS...
 )


and executing:
sqlldr $USER_GIS/$PASS_GIS@$DB_GIS control=$TEMP_GIS/control_$dt.ctl log=$LOGS_GIS/sqlldr_$dt.log bad=$BAD_GIS/Transfer_$dt.bad discard=$DISCARD_GIS/Transfer_$dt.dsc


The sqlldr log says:
SQL*Loader: Release 11.2.0.3.0 - Production on Tue Dec 2 12:33:07 2014

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Control File:   /xahome/xa21/CAP/TEMP_GIS/control_20141202123304.ctl

There are 2 data files:
Data File:      /xahome/xa21/CAP/INPUTS_GIS/Transfer_data.20141202020202
  Bad File:     /xahome/xa21/CAP/BAD_GIS/Transfer_data.20141202020202.bad
  Discard File: /xahome/xa21/CAP/DISCARD_GIS/Transfer_data.20141202020202.dsc
 (Allow all discards)
Data File:      /xahome/xa21/CAP/INPUTS_GIS/Transfer_data.20141203030303
  Bad File:     /xahome/xa21/CAP/TEMP_GIS/Transfer_data.bad
  Discard File:  none specified



That is, for the second file it generates a "Bad file" without the date, and the same happens with the "Discard file".
I would like to see:
Data File:      /xahome/xa21/CAP/INPUTS_GIS/Transfer_data.20141202020202
  Bad File:     /xahome/xa21/CAP/BAD_GIS/Transfer_data.20141202020202.bad
  Discard File: /xahome/xa21/CAP/DISCARD_GIS/Transfer_data.20141202020202.dsc
 (Allow all discards)
Data File:      /xahome/xa21/CAP/INPUTS_GIS/Transfer_data.20141203030303
  Bad File:     /xahome/xa21/CAP/BAD_GIS/Transfer_data.20141203030303.bad
  Discard File: /xahome/xa21/CAP/DISCARD_GIS/Transfer_data.20141203030303.dsc


The control file is generated dynamically by reading a list of the data files to load and modifying a generic control file:
for f in $(cat $TEMP_GIS/lista_$dt.lst)
  do
    sed -i ${line}" i\INFILE '$f'" $TEMP_GIS/control_$dt.ctl  # add each data file to load as an INFILE line in the control file
  done

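For reference, a minimal sketch of how this generation step could look end to end; the control_generic.ctl name is hypothetical, and instead of an explicit ${line} number this variant inserts the INFILE lines just before the APPEND line with GNU sed:

cp $TEMP_GIS/control_generic.ctl $TEMP_GIS/control_$dt.ctl   # start from a copy of the generic control file
for f in $(cat $TEMP_GIS/lista_$dt.lst)
  do
    # insert each INFILE just before the APPEND line, keeping the order of the list
    sed -i "/^APPEND/ i\INFILE '$f'" $TEMP_GIS/control_$dt.ctl
  done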

And the date variable:
dt=`date +%Y%m%d%H%M%S`


The generic control file is:
LOAD DATA
APPEND
INTO TABLE "INT_OPERATIONS"
FIELDS TERMINATED BY '!'
OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
...COLUMNS...
 )


Is it possible to solve this from the "for" loop, or should I use another form of logging?
I would appreciate any help in achieving individualized logging.
Thanks and regards
Question by:carlino70
8 Comments
 
LVL 76

Assisted Solution

by: slightwv (Netminder)
slightwv (Netminder) earned 250 total points
ID: 40476711
Per the docs:
http://docs.oracle.com/cd/E11882_01/server.112/e22490/ldr_params.htm#SUTIL1019

A bad file name specified on the command line becomes the bad file associated with the first INFILE statement in the control file.

The solutions are:
1: only have ONE INFILE per control file
or
2: Don't specify the BAD/LOG/DISCARD files on the command line and have a similar sed command change the values in the control file.
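
For illustration, a sketch of the control file that option 2 could generate, using the BADFILE and DISCARDFILE clauses SQL*Loader accepts after each INFILE (the paths follow the examples in the question):

LOAD DATA
INFILE '/xahome/xa21/CAP/INPUTS_GIS/Transfer_data.20141202020202'
  BADFILE '/xahome/xa21/CAP/BAD_GIS/Transfer_data.20141202020202.bad'
  DISCARDFILE '/xahome/xa21/CAP/DISCARD_GIS/Transfer_data.20141202020202.dsc'
INFILE '/xahome/xa21/CAP/INPUTS_GIS/Transfer_data.20141203030303'
  BADFILE '/xahome/xa21/CAP/BAD_GIS/Transfer_data.20141203030303.bad'
  DISCARDFILE '/xahome/xa21/CAP/DISCARD_GIS/Transfer_data.20141203030303.dsc'
APPEND
INTO TABLE "INT_OPERATIONS"
...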
 
LVL 34

Expert Comment

by:johnsone
ID: 40476844
I wouldn't even specify INFILE in the control file.  I would run a separate loader for each file.
for f in $(cat $TEMP_GIS/lista_$dt.lst)
  do
    dt=`date +%Y%m%d%H%M%S`
    sqlldr $USER_GIS/$PASS_GIS@$DB_GIS control=$TEMP_GIS/control_$dt.ctl data=$f log=$LOGS_GIS/sqlldr_$dt.log bad=$BAD_GIS/Transfer_$dt.bad discard=$DISCARD_GIS/Transfer_$dt.dsc
  done



If your load could run in less than a second, you would have to do something to sequence the log files further, but specifying the data file on the command line removes the need to have to modify a file.
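A hedged sketch of that sequencing idea: append a loop counter to the names (the ${seq} counter and the generic control.ctl name are illustrative, not from the thread):

seq=0
for f in $(cat $TEMP_GIS/lista_$dt.lst)
  do
    dt=`date +%Y%m%d%H%M%S`
    seq=$((seq + 1))   # keeps the names unique even if two loads start within the same second
    sqlldr $USER_GIS/$PASS_GIS@$DB_GIS control=$TEMP_GIS/control.ctl data=$f log=$LOGS_GIS/sqlldr_${dt}_${seq}.log bad=$BAD_GIS/Transfer_${dt}_${seq}.bad discard=$DISCARD_GIS/Transfer_${dt}_${seq}.dsc
  done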
 

Author Comment

by:carlino70
ID: 40478316
johnsone, with "$dt" included inside the "for" loop, sqlldr works and inserts the records from all the files in lista.lst:
for f in $(cat $TEMP_GIS/lista.lst)
  do
    dt=`date +%Y%m%d%H%M%S`
    sqlldr $USER_GIS/$PASS_GIS@$DB_GIS control=$TEMP_GIS/control.ctl data=$f log=$LOGS_GIS/sqlldr_$dt.log bad=$BAD_GIS/Transfer_$dt.bad discard=$DISCARD_GIS/Transfer_$dt.dsc
  done


but the log, bad, and discard files (the last two only when there are problems) are generated only once:
SQL*Loader: Release 11.2.0.3.0 - Production on Wed Dec 3 10:02:03 2014

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Control File:   /xahome/xa21/CAP/TEMP_GIS/control.ctl
Data File:      /xahome/xa21/CAP/INPUTS_GIS/Transfer_data.20141203123543
Bad File:     /xahome/xa21/CAP/BAD_GIS/Transfer_20141203100203.bad
Discard File: /xahome/xa21/CAP/DISCARD_GIS/Transfer_20141203100203.dsc
 (Allow all discards)
...


I would like to get a log for each sqlldr execution, one per data file. Is this possible? Could you help me with an example?
Thank you
 

Author Comment

by:carlino70
ID: 40478323
Thanks slightwv, I'm looking to make the log, bad, and discard files take the date of each of the data files. Is this possible?
 

Author Comment

by:carlino70
ID: 40478328
johnsone, adding to that:

Could I extract the date from the name of each data file and use it to form the names of the log, bad, and discard files inside the "for" loop?

Thanks
 
LVL 76

Expert Comment

by: slightwv (Netminder)
ID: 40478335
>> I'm looking to make the log, bad, and discard files take the date of each of the data files. Is this possible?

Yes.  Based on the documentation link I provided:  One INFILE per control file.
 
LVL 34

Accepted Solution

by: johnsone, earned 250 total points
ID: 40478357
I am making the assumption that the file name will always be like the example; that is, it will contain only one period, no spaces, and everything after the period is considered the "date".  That is what you show in your sample data.

To extract that from the file name, you would do this:

f_date=`basename $f | cut -f2 -d.`

This would change the SQL*Loader command to:

sqlldr $USER_GIS/$PASS_GIS@$DB_GIS control=$TEMP_GIS/control.ctl data=$f log=$LOGS_GIS/sqlldr_${f_date}.log bad=$BAD_GIS/Transfer_${f_date}.bad discard=$DISCARD_GIS/Transfer_${f_date}.dsc

So, the partial code that you have would become:
for f in $(cat $TEMP_GIS/lista.lst)
  do
    f_date=`basename ${f} | cut -f2 -d.`
    sqlldr $USER_GIS/$PASS_GIS@$DB_GIS control=$TEMP_GIS/control.ctl data=$f log=$LOGS_GIS/sqlldr_${f_date}.log bad=$BAD_GIS/Transfer_${f_date}.bad discard=$DISCARD_GIS/Transfer_${f_date}.dsc
  done

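For example, with the first sample file name from the question, the extraction gives:

basename /xahome/xa21/CAP/INPUTS_GIS/Transfer_data.20141202020202 | cut -f2 -d.
# prints: 20141202020202

so the generated files would be sqlldr_20141202020202.log, Transfer_20141202020202.bad and Transfer_20141202020202.dsc, matching the date of that data file.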

 

Author Closing Comment

by:carlino70
ID: 40491128
Thanks!