Load data above VARCHAR2(4000) using sqlldr

I pull data from Sybase using BCP and build a pipe-delimited data file.

Two of the fields are longer than VARCHAR2(4000).

I need to use SQL*Loader to load the data file into my Oracle table.

How do I handle the data above 4000 bytes?


Asked by: vishali_vishu

mrjoltcola commented (accepted solution):
CREATE TABLE T (
  ID INTEGER PRIMARY KEY,
  COL1 CLOB
);


Then use the control file below. It uses inline data rather than an external data file. To load from your own data file instead, comment out:

infile *

and uncomment:

infile 'dataload.dat'

-- sample sql loader for clob
-- t.ctl
-- run: sqlldr username/password control=t.ctl
 
load data
--infile 'dataload.dat'
infile *
append into table t
fields terminated by '|' optionally enclosed by '"'
(
id      INTEGER EXTERNAL,
-- CHAR defaults to a 255-byte maximum in SQL*Loader, so give it an explicit
-- length large enough for the longest value being loaded into the CLOB
col1    CHAR(100000)
)
begindata
1|sample data clob ZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZ
2|"sample data clob 2 ZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZ"

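After the load, a quick way to confirm the long rows came across intact (using the table T created above):

SELECT id, DBMS_LOB.GETLENGTH(col1) AS clob_length
FROM   t
ORDER  BY id;

Any row whose source field was longer than 4000 bytes should report a length above 4000 here.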
 
mrjoltcola commented:
You should probably use CLOB. Oracle has no "long varchar" type as Sybase does, but CLOB works just fine.
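A minimal PL/SQL sketch illustrating the point (demo_vc and demo_clob are hypothetical table names): a VARCHAR2 column tops out at 4000 bytes, while the same value goes into a CLOB column without trouble.

CREATE TABLE demo_vc   (txt VARCHAR2(4000));
CREATE TABLE demo_clob (txt CLOB);

DECLARE
  -- a 5000-byte test value; PL/SQL VARCHAR2 variables can hold up to 32767 bytes
  v VARCHAR2(32767) := RPAD('x', 5000, 'x');
BEGIN
  -- INSERT INTO demo_vc VALUES (v);  -- would raise ORA-12899: value too large for column
  INSERT INTO demo_clob VALUES (v);   -- implicit VARCHAR2-to-CLOB conversion works
  COMMIT;
END;
/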
 
vishali_vishu (author) commented:
Can you please give an example of the ctl file using CLOB, and the CREATE TABLE structure?