• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 2649

SQL*Loader help

Hi, I have a 2 GB file of INSERT statements that I need to load into an Oracle table.
I have created the schema and table in Oracle.
How should I proceed now?
Should I run the INSERT statements as-is? It is a big file.

OR

How do I use SQL*Loader to load the data instead?
An example of one of the INSERT statements:
INSERT INTO "dbo"."EODTransaction"
  ("transaction_id", "org_id", "extraction_date", "create_date", "transaction_size", "profile_count", "version_count")
VALUES (116078, 201, TO_DATE('18-11-2008 12:01:00 AM','DD-MM-YYYY HH:MI:SS AM'), TO_DATE('28-10-2008 11:51:16 AM','DD-MM-YYYY HH:MI:SS AM'), 813610, 4, 8);
Need help on this ASAP.

Thanks, Gyans.
 
ajexpertCommented:
If you run the file as-is, an editor may not even be able to open it, or the inserts may take a long time to execute.

Using SQL*Loader is the fastest way to load the data.

Hope this helps

 
ajexpertCommented:
If you are new to SQL*Loader, this link will help you:
http://www.orafaq.com/wiki/SQL*Loader_FAQ
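In a nutshell, SQL*Loader reads a plain data file (not INSERT statements) and is driven by a small control file that maps the fields to the table columns. As a rough illustrative sketch only, using the column names from your INSERT example (the file and table names here are placeholders, and the date mask has to match whatever format the exported data actually uses):

-- eod_load_example.ctl (illustrative sketch, not a working file for your data yet)
LOAD DATA
INFILE 'eod_transaction.dat'
APPEND
INTO TABLE eodtransaction
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
  transaction_id,
  org_id,
  extraction_date   TIMESTAMP "DD-MM-YYYY HH:MI:SS AM",  -- mask must match the data file
  create_date       TIMESTAMP "DD-MM-YYYY HH:MI:SS AM",
  transaction_size,
  profile_count,
  version_count
)

-- invoked from the shell roughly as:
-- sqlldr userid=your_user/your_password control=eod_load_example.ctl log=eod_load_example.log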
 
gyansAuthor Commented:
This is what I have come up with; can you help me further?
 sqlldr USERID=eod_reports CONTROL=eod_transaction.csv LOG=eod_transaction.log


   
eod_transaction.csv

LOAD DATA
INFILE '/opt/app/oracle/ora10g/admin/eodrpt/misc/dbo_EODTransaction.sql'
INSERT
INTO TABLE eod_reports.transaction
FIELDS TERMINATED BY ","              
( transaction_id, org_id, extraction_date, create_date, transaction_size, profile_count, version_count )
DIRECT=TRUE
LOGFILE=SQLLOADER.LOG
thanks
 
ajexpertCommented:
The syntax you are using is for a DIRECT path load. Since your file contains SQL INSERT statements rather than raw data, that will not work.

Are you saying that you just need to run the 2 GB file in the fastest way possible?
 
ajexpertCommented:

Usually DIRECT path loads are the fastest, but they require a plain data file (NOT THE SQL INSERT FILE).
So if you can generate the data file with SQL INSERT statements, using Direct path will be feasible.

If you still want to use the SQL statement file, consider using a tool like PL/SQL Developer or TOAD, and also try to commit the data frequently, after every 10K records or so (see the sketch below).
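For example, a minimal SQL*Plus sketch, assuming the 2 GB file is a plain script of INSERT statements (SET AUTOCOMMIT n makes SQL*Plus commit after every n successful statements; SET DEFINE OFF stops '&' characters in the data from being treated as substitution variables):

REM run from SQL*Plus, connected as the schema owner
SET DEFINE OFF
SET AUTOCOMMIT 10000
@/opt/app/oracle/ora10g/admin/eodrpt/misc/dbo_EODTransaction.sql
COMMIT;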
 
ajexpertCommented:
Please read that as:
So if you can generate the data file WITHOUT SQL INSERT statements, using Direct path will be feasible.
 
sujith80Commented:
Do you have one INSERT statement per line?

I.e. in your file, does each INSERT statement sit on a single line, or is it broken across multiple lines?
If each statement is on a single line, you still have some hope of using SQL*Loader.
If so, post a few more sample lines from the data file as an attachment.
 
gyansAuthor Commented:
Hi, I tried using SQL*Loader to insert the data.
I am getting the following errors.
Also attaching the result of the bad file.
 
gyansAuthor Commented:
Sorry, here is the attachment.
sqlloader.log
 
ajexpertCommented:
OK.
Here is what you can do:
1) Open the file and copy only 5 INSERT statements.
2) Open SQL*Plus and execute these 5 statements.

Let me know the result
 
gyansAuthor Commented:
If I take the double quotes (" ") out of the INSERT statements, the 5 statements are inserted:
SQL> INSERT INTO eod_reports.Transaction
  2    (transaction_id, org_id, extraction_date, create_date, transaction_size,
profile_count, version_count)
  3  VALUES (116913, 201, TO_DATE('19-11-2008 12:01:00 AM','DD-MM-YYYY HH:MI:SS
AM'), TO_DATE('30-10-2008 12:00:17 PM','DD-MM-YYYY HH:MI:SS AM'), 406773, 2, 4);

1 row created.
-----------
Also, I tried to export the table from SQL Server in CSV format.
Can you help me write the correct SQL*Loader control file and load command?
There are only seven columns in the table; the data in the CSV file looks like the following:
"597","192","1/29/2007 12:01 AM","1/15/2007 12:25 PM","619,608","1","1"
"600","192","1/29/2007 12:01 AM","1/15/2007 12:30 PM","112,363","1","1"
"603","192","1/29/2007 12:01 AM","1/22/2007 10:41 AM","111,213","1","1"
"576","190","1/30/2007 12:01 AM","1/11/2007 4:15 AM","126,915","2","2"
"588","192","1/30/2007 12:01 AM","1/12/2007 8:32 AM","0","1","0"
"591","192","1/30/2007 12:01 AM","1/15/2007 12:18 PM","153,632","1","1"

The destination table looks like the following:

SQL> desc transaction
 Name                                      Null?    Type
 ----------------------------------------- -------- --------------------------

 TRANSACTION_ID                            NOT NULL NUMBER(18)
 ORG_ID                                    NOT NULL NUMBER(18)
 EXTRACTION_DATE                           NOT NULL TIMESTAMP(6)
 CREATE_DATE                               NOT NULL TIMESTAMP(6)
 TRANSACTION_SIZE                                   NUMBER
 PROFILE_COUNT                                      NUMBER
 VERSION_COUNT                                      NUMBER

I am in a bind and would really appreciate your help ASAP. Thank you.

 
ajexpertCommented:
If your data is in CSV format you can use a DIRECT path load.
As said, the following link should definitely be of help:
http://www.orafaq.com/wiki/SQL*Loader_FAQ

I don't have access to Oracle at this moment, but I will try to create a ctl file from the information you have given.
 
gyansAuthor Commented:
I would definitely appreciate it.
This is what I have created so far:
LOAD DATA
INFILE '/opt/app/oracle/ora10g/admin/eodrpt/misc/dbo_EODTransaction.sql'
INSERT
INTO TABLE eod_reports.transaction
REPLACE FIELDS TERMINATED BY ","              
( transaction_id, org_id, extraction_date, create_date, transaction_size, profile_count, version_count )
DIRECT=TRUE
LOGFILE=SQLLOADER.LOG

Appreciate your help .
 
ajexpertCommented:
This should work
 
gyansAuthor Commented:
It's not working; attaching the error file:

SQL*Loader: Release 10.2.0.1.0 - Production on Thu Nov 20 10:14:48 2008
 
Copyright (c) 1982, 2005, Oracle.  All rights reserved.
 
Control File:   control.ctl
Data File:      /opt/app/oracle/ora10g/admin/eodrpt/misc/dbo_EODTransaction1.csv
  Bad File:     dbo_EODTransaction1.bad
  Discard File:  none specified
 
 (Allow all discards)
 
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array:     64 rows, maximum of 256000 bytes
Continuation:    none specified
Path used:      Conventional
 
Table TRANSACTION, loaded from every logical record.
Insert option in effect for this table: INSERT
 
   Column Name                  Position   Len  Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
TRANSACTION_ID                      FIRST     *   ,  O(") CHARACTER
ORG_ID                               NEXT     *   ,  O(") CHARACTER
EXTRACTION_DATE                      NEXT     *   ,  O(") CHARACTER
CREATE_DATE                          NEXT     *   ,  O(") CHARACTER
TRANSACTION_SIZE                     NEXT     *   ,  O(") CHARACTER
PROFILE_COUNT                        NEXT     *   ,  O(") CHARACTER
VERSION_COUNT                        NEXT     *   ,  O(") CHARACTER
 
Record 1: Rejected - Error on table TRANSACTION, column TRANSACTION_ID.
ORA-01722: invalid number
 
Record 2: Rejected - Error on table TRANSACTION, column EXTRACTION_DATE.
ORA-01843: not a valid month

[... Records 3 through 51 were all rejected with the same ORA-01843 "not a valid month" error on column EXTRACTION_DATE ...]
 
 
MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.
 
Table TRANSACTION:
  0 Rows successfully loaded.
  51 Rows not loaded due to data errors.
  0 Rows not loaded because all WHEN clauses were failed.
  0 Rows not loaded because all fields were null.
 
 
Space allocated for bind array:                 115584 bytes(64 rows)
Read   buffer bytes: 1048576
 
Total logical records skipped:          0
Total logical records read:            64
Total logical records rejected:        51
Total logical records discarded:        0
 
Run began on Thu Nov 20 10:14:48 2008
Run ended on Thu Nov 20 10:14:53 2008
 
Elapsed time was:     00:00:05.32
CPU time was:         00:00:00.18

 
ajexpertCommented:
Since you are using a CSV file you should use a DIRECT path load.
I see from the log file that it is applying the CONVENTIONAL path, which is the cause of the error.
 
gyansAuthor Commented:
how do I fix this ?
 
ajexpertCommented:
Can you please attach a sample data file (extract 5 records), the control file, and the command that you use for loading?
 
gyansAuthor Commented:
Thanks.
Example of the exported data:
"597","192","1/29/2007 12:01 AM","1/15/2007 12:25 PM","619,608","1","1"
"600","192","1/29/2007 12:01 AM","1/15/2007 12:30 PM","112,363","1","1"
"603","192","1/29/2007 12:01 AM","1/22/2007 10:41 AM","111,213","1","1"
"576","190","1/30/2007 12:01 AM","1/11/2007 4:15 AM","126,915","2","2"
"588","192","1/30/2007 12:01 AM","1/12/2007 8:32 AM","0","1","0"
"591","192","1/30/2007 12:01 AM","1/15/2007 12:18 PM","153,632","1","1"
Control file:
LOAD DATA
INFILE '/opt/app/oracle/ora10g/admin/eodrpt/misc/dbo_EODTransaction1.csv'
INTO TABLE TRANSACTION
fields terminated by "," optionally enclosed by '"'
(TRANSACTION_ID,ORG_ID,EXTRACTION_DATE,CREATE_DATE,TRANSACTION_SIZE,PROFILE_COUNT,VERSION_COUNT)

Command I use:
sqlldr userid-test controlfile=control.ctl
 
ajexpertCommented:
Use this command
sqlldr userid-test controlfile=control.ctl DIRECT=TRUE

 
gyansAuthor Commented:
getting the following error:
SQL*Loader: Release 10.2.0.1.0 - Production on Thu Nov 20 11:15:07 2008

Copyright (c) 1982, 2005, Oracle.  All rights reserved.

SQL*Loader-951: Error calling once/load initialization
ORA-00604: error occurred at recursive SQL level 1
ORA-00054: resource busy and acquire with NOWAIT specified

 
ajexpertCommented:
It seems some user has locked the table for updating. Are you sure that you are the only person updating this table?
 
gyansAuthor Commented:
Yes, I am the only one using this.
There is only one table in the schema.
 
ajexpertCommented:
OK, close the session and, if possible, restart the server and try again.
 
ajexpertCommented:
Restart the server only if you are the only one using it; this should be the last resort.
 
gyansAuthor Commented:
There are other databases on this server.
I cannot restart the server.
I can try to restart the DB if that helps.
 
ajexpertCommented:
Yup, restarting the instance is the best option if no one else is working except you.
Alternatively, You can kill the session if you dont want to restart DB.
Here is the link where u can see the avaialble sessions and kill YOUR session if its active.
http://www.oracle-training.cc/oracle_tips_v$session.htm
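For reference, the kind of query behind that link looks roughly like this (a sketch; it needs a privileged account, and the KILL statement is only for a session you are sure is your own stray one):

-- list sessions currently holding locks on objects
SELECT s.sid, s.serial#, s.username, s.status, o.object_name
FROM   v$session s
       JOIN v$locked_object l ON l.session_id = s.sid
       JOIN dba_objects     o ON o.object_id  = l.object_id;

-- then, for your own stray session only:
-- ALTER SYSTEM KILL SESSION 'sid,serial#';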

 
gyansAuthor Commented:
still getting error:

error.txt
 
ajexpertCommented:
It's because you need to map the columns with the proper format.
Please see the files attached and make your ctl file in a similar fashion, i.e. map the data columns and create the control file.
I have added two files:
1. csv_change_file_extn_to_ctl - this is the control file
2. csv_change_file_extn_to_dat - this is the data file

Hope this helps
csv-change-file-extn-to-ctl.txt
csv-change-file-extn-to-dat.txt
 
gyansAuthor Commented:
I understood the control file.
The exported file comes either in the format I sent you first, or in this other one:
transaction_id        org_id extraction_date    create_date        transaction_size profile_count version_count
           576           190 1/13/2007 12:01 AM 1/11/2007 4:01 AM           126,915             2             2
           588           192 1/13/2007 12:01 AM 1/12/2007 8:01 AM                 0             1             0
           576           190 1/14/2007 12:01 AM 1/11/2007 4:01 AM           126,915             2             2
           588           192 1/14/2007 12:01 AM 1/12/2007 8:01 AM                 0             1             0
           576           190 1/15/2007 12:01 AM 1/11/2007 4:01 AM           126,915             2             2
           588           192 1/15/2007 12:01 AM 1/12/2007 8:01 AM                 0             2             0
           576           190 1/16/2007 12:01 AM 1/11/2007 4:01 AM           126,915             2             2
           588           192 1/16/2007 12:01 AM 1/12/2007 8:01 AM                 0             1             0
           591           192 1/16/2007 12:01 AM 1/15/2007 12:01 PM          153,632             1             1
           594           192 1/16/2007 12:01 AM 1/15/2007 12:01 PM          486,572             1             1

I am using a SQL Server EMS Lite tool to export the data.
Can you help me make the control file match either one of the formats?
Thanks a ton for your help.
gyans
 
ajexpertCommented:
OK,
the file format that you have just specified is a fixed-length format. There are different ways to handle a fixed-length file format.

I have attached a file that should load data in the delimited file format (i.e. delimited by ",").

Hope this helps
 
ajexpertCommented:
Hit the send button a bit too quickly... Here is the file:
sqlload.txt
 
gyansAuthor Commented:
getting the following error:
SQL*Loader: Release 10.2.0.1.0 - Production on Thu Nov 20 15:33:24 2008

Copyright (c) 1982, 2005, Oracle.  All rights reserved.

SQL*Loader-350: Syntax error at line 6.
Expecting valid column specification, "," or ")", found keyword optionally.
   TRANSACTION_ID        INTEGER        OPTIONALLY ENCLOSED BY '"',
                                        ^

 
sujith80Commented:
In your data
>> "597","192","1/29/2007 12:01 AM","1/15/2007 12:25 PM","619,608","1","1"

The transaction_size value appears as "619,608" while the column is defined as NUMBER in the table; this is one reason for rejection.

You could instead have the column as VARCHAR2 in the table.

Try these steps

SQL> create table transaction(
  2   TRANSACTION_ID                            NUMBER(18) NOT NULL ,
  3   ORG_ID                                    NUMBER(18) NOT NULL ,
  4   EXTRACTION_DATE                           TIMESTAMP(6) NOT NULL ,
  5   CREATE_DATE                               TIMESTAMP(6) NOT NULL ,
  6   TRANSACTION_SIZE                                   varchar2(60),
  7   PROFILE_COUNT                                      NUMBER,
  8   VERSION_COUNT                                      NUMBER
  9  );

Table created.

Create the control file (test.ctl) like this.
-----
Load DATA
INSERT
INTO TABLE transaction
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
 TRANSACTION_ID                            INTEGER EXTERNAL ,
 ORG_ID                                    INTEGER EXTERNAL ,
 EXTRACTION_DATE                           TIMESTAMP "mm/dd/yyyy hh:mi AM" ,
 CREATE_DATE                               TIMESTAMP "mm/dd/yyyy hh:mi AM",
 TRANSACTION_SIZE                          CHAR,
 PROFILE_COUNT                             INTEGER EXTERNAL ,
 VERSION_COUNT                             INTEGER EXTERNAL
)

-----

Load the data using this command:

sqlldr username/password@connectstring control=test.ctl data=test.txt direct=y

Note:
direct=y is optional; you may use it or not. If your database is not set up properly for direct path loads, it may not work. If there is an error, try loading without that option.

If you are getting the error
>> resource busy and acquire with NOWAIT specified

it means that another session has modified the table and has not committed yet. Close all the other SQL*Plus sessions (including TOAD sessions) that are accessing this table.
 
gyansAuthor Commented:
I still get the same error on column TRANSACTION_ID:
Record 49: Rejected - Error on table TRANSACTION, column TRANSACTION_ID.
ORA-01722: invalid number

Record 50: Rejected - Error on table TRANSACTION, column TRANSACTION_ID.
ORA-01722: invalid number

Record 51: Rejected - Error on table TRANSACTION, column TRANSACTION_ID.
ORA-01722: invalid number

This is because, if I scroll further down the file, the transaction_id values also contain ','.
For example:
  11,229           201 5/20/2008 12:01 AM  2/12/2008 11:24 AM           138,000             1             2
        11,233           201 5/20/2008 12:01 AM  2/12/2008 11:31 AM           269,711             1             2
        11,237           201 5/20/2008 12:01 AM  2/12/2008 11:33 AM           138,040             1             2
        11,241           201 5/20/2008 12:01 AM  2/12/2008 11:38 AM           269,691             1             2

I would really, really appreciate it if I could get some help ASAP.
Thanks, Gyans
 
ajexpertCommented:
Yes, and for this you need to export the file again so that it does not have any "," as a thousands separator.
Otherwise, consider a fixed-length file format (see the sketch below).
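If you go the fixed-length route, the control file describes each column by POSITION instead of by delimiter. A rough sketch against the sample posted above (the file name is a placeholder and the POSITION offsets are only guesses; they would have to be measured against the real export):

LOAD DATA
INFILE 'dbo_EODTransaction_fixed.dat'
APPEND
INTO TABLE transaction
(
  transaction_id    POSITION(1:14)   CHAR "to_number(replace(:transaction_id, ',', ''))",
  org_id            POSITION(15:28)  INTEGER EXTERNAL,
  extraction_date   POSITION(29:47)  TIMESTAMP "MM/DD/YYYY HH:MI AM",
  create_date       POSITION(48:66)  TIMESTAMP "MM/DD/YYYY HH:MI AM",
  transaction_size  POSITION(67:81)  CHAR "to_number(replace(:transaction_size, ',', ''))",
  profile_count     POSITION(82:95)  INTEGER EXTERNAL,
  version_count     POSITION(96:109) INTEGER EXTERNAL
)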
 
gyansAuthor Commented:
OK, let me try again.
 
gyansAuthor Commented:
Transaction_size still has the comma.
However, it is giving the error on transaction_id and not on the size.
transaction_i        org_id extraction_date    create_date        transaction_s profile_count version_count
          576           190 1/13/2007 12:01 AM 1/11/2007 4:01 AM        126,915             2             2
          588           192 1/13/2007 12:01 AM 1/12/2007 8:01 AM              0             1             0
          576           190 1/14/2007 12:01 AM 1/11/2007 4:01 AM        126,915             2             2
          588           192 1/14/2007 12:01 AM 1/12/2007 8:01 AM              0             1             0
          576           190 1/15/2007 12:01 AM 1/11/2007 4:01 AM        126,915             2             2
          588           192 1/15/2007 12:01 AM 1/12/2007 8:01 AM              0             2             0
          576           190 1/16/2007 12:01 AM 1/11/2007 4:01 AM        126,915             2             2
          588           192 1/16/2007 12:01 AM 1/12/2007 8:01 AM              0             1             0
          591           192 1/16/2007 12:01 AM 1/15/2007 12:01 PM       153,632             1             1
          594           192 1/16/2007 12:01 AM 1/15/2007 12:01 PM       486,572             1             1
How else can I export the SQL Server file?
 
ajexpertCommented:
You have to look at the options while exporting so that the numbers are not masked with ",", and stick to the CSV file format; otherwise you have to change the control file.
 
sujith80Commented:
gyans,
The problem is that your numbers contain commas as group (thousands) separators.
You can keep the columns TRANSACTION_ID and TRANSACTION_SIZE as NUMBER columns and use the control file below.


create table transaction(
TRANSACTION_ID                            NUMBER(18) NOT NULL ,
ORG_ID                                    NUMBER(18) NOT NULL ,
EXTRACTION_DATE                           TIMESTAMP(6) NOT NULL ,
CREATE_DATE                               TIMESTAMP(6) NOT NULL ,
TRANSACTION_SIZE                                   NUMBER,
PROFILE_COUNT                                      NUMBER,
VERSION_COUNT                                      NUMBER
);

Load DATA
INSERT
INTO TABLE transaction
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( 
 TRANSACTION_ID                            CHAR "replace(:TRANSACTION_ID,',','')",
 ORG_ID                                    INTEGER EXTERNAL ,
 EXTRACTION_DATE                           TIMESTAMP "mm/dd/yyyy hh:mi AM" ,
 CREATE_DATE                               TIMESTAMP "mm/dd/yyyy hh:mi AM",
 TRANSACTION_SIZE                          CHAR "replace(:TRANSACTION_SIZE, ',','')",
 PROFILE_COUNT                             INTEGER EXTERNAL ,
 VERSION_COUNT                             INTEGER EXTERNAL 
)

 
gyansAuthor Commented:
To answer ajexpert's question: when I try to export in CSV format, it comes out in Excel format.

Using sujith80's control file, I get the following errors:
Record 42: Discarded - all columns null.
Record 1: Rejected - Error on table TRANSACTION, column ORG_ID.
ORA-01400: cannot insert NULL into (ORG_ID)
Record 2: Rejected - Error on table TRANSACTION, column ORG_ID.
ORA-01722: invalid number

Record 3: Rejected - Error on table TRANSACTION, column ORG_ID.
ORA-01400: cannot insert NULL into (ORG_ID)
Record 4: Rejected - Error on table TRANSACTION, column ORG_ID.
ORA-01722: invalid number

 
sujith80Commented:
you should post a few of the rejected records from the bad file
 
gyansAuthor Commented:
Hi, I got it working somewhat.
I created a staging table with the following columns:
CREATE TABLE EOD_REPORTS.VINAI1
(
  TRANSACTION_ID    CHAR(140 BYTE)              NOT NULL,
  ORG_ID            CHAR(140 BYTE),
  EXTRACTION_DATE   VARCHAR2(20 BYTE),
  CREATE_DATE       VARCHAR2(20 BYTE),
  TRANSACTION_SIZE  VARCHAR2(10 BYTE),
  PROFILE_COUNT     NUMBER,
  VERSION_COUNT     NUMBER
)

Actual table:
create table VINAI
(TRANSACTION_ID NUMBER(18) NOT NULL ,
ORG_ID NUMBER(18) NOT NULL ,
EXTRACTION_DATE TIMESTAMP(6) NOT NULL ,
CREATE_DATE TIMESTAMP(6) NOT NULL ,
TRANSACTION_SIZE NUMBER,
PROFILE_COUNT NUMBER,
VERSION_COUNT NUMBER);
The data is now in the staging table; how do I move it into my real table with the columns converted correctly?

Thanks.
 
sujith80Commented:
Don't create a table with CHAR columns unless you really have to, because CHAR columns are blank-padded to the full length of the column.

You don't need a staging table here. Because it is a simple load, it can be done with SQL*Loader itself. Post a few of the rejected records.
If you don't need that NOT NULL constraint on ORG_ID, remove it, for example as shown just below.
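For example, dropping it would look something like this (table name as used in the control files above):

ALTER TABLE transaction MODIFY (org_id NULL);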

Try this control file.

Load DATA
INSERT
INTO TABLE transaction
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
 TRANSACTION_ID                            CHAR "replace(:TRANSACTION_ID,',','')",
 ORG_ID                                    CHAR "replace(:ORG_ID,',','')",
 EXTRACTION_DATE                           TIMESTAMP "mm/dd/yyyy hh:mi AM" ,
 CREATE_DATE                               TIMESTAMP "mm/dd/yyyy hh:mi AM",
 TRANSACTION_SIZE                          CHAR "replace(:TRANSACTION_SIZE, ',','')",
 PROFILE_COUNT                             INTEGER EXTERNAL ,
 VERSION_COUNT                             INTEGER EXTERNAL
)

You probably have data problems. Post a few of the rejected records.
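That said, if you do keep the staging table you have already loaded, carrying the rows over into the real table would look roughly like the sketch below. It assumes both tables sit in the EOD_REPORTS schema and that the dates landed in the staging columns in the MM/DD/YYYY HH:MI AM format shown earlier:

INSERT INTO eod_reports.vinai
  (transaction_id, org_id, extraction_date, create_date,
   transaction_size, profile_count, version_count)
SELECT TO_NUMBER(REPLACE(TRIM(transaction_id), ',', '')),   -- TRIM because the staging columns are CHAR (blank padded)
       TO_NUMBER(REPLACE(TRIM(org_id), ',', '')),
       TO_TIMESTAMP(TRIM(extraction_date), 'MM/DD/YYYY HH:MI AM'),
       TO_TIMESTAMP(TRIM(create_date),     'MM/DD/YYYY HH:MI AM'),
       TO_NUMBER(REPLACE(transaction_size, ',', '')),
       profile_count,
       version_count
FROM   eod_reports.vinai1;

COMMIT;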
 
gyansAuthor Commented:
Thanks for all your help, Sujith.
I got a utility that converts a DBF file to Oracle.
Working with it now.
Will let you know how it goes.
I have been scratching my head over this for the past 4 days, and I am being pressed to get it done ASAP.
Will accept your answer as the final one, though.

 
sujith80Commented:
I am surprised at the accepted answer!
Direct path has nothing to do with the errors you were getting. It's JUST because of the data errors.