
Load data above VARCHAR2(4000) using SQL*Loader (sqlldr)

I pull data from Sybase using BCP and then build a pipe-delimited data file.

Two of the fields are longer than VARCHAR2(4000).

I need to use SQL*Loader to load the data file into my Oracle table.

How do I handle data above 4000 bytes?


Asked by: vishali_vishu
1 Solution
 
mrjoltcola commented:
You should probably use CLOB. Oracle has no "long varchar" type as Sybase does, but CLOB works just fine.
 
vishali_vishu (author) commented:
Can you please give an example of the control (.ctl) file using CLOB, along with the CREATE TABLE structure?
 
mrjoltcola commented:
CREATE TABLE T (
  ID INTEGER PRIMARY KEY,
  COL1 CLOB
);


Then use the control file below. It uses inline data rather than an external data file. To use your own data file, comment out:

infile *

and uncomment:

infile 'dataload.dat'
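
(The file name dataload.dat is just the one assumed in this control file. If you go the external-file route, it would simply hold the same kind of pipe-delimited rows that appear after begindata below, one record per line, for example:

1|first long text value exported from Sybase
2|"second long text value, optionally enclosed in quotes"
)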

-- sample sql loader for clob
-- t.ctl
-- run: sqlldr username/password control=t.ctl
 
load data
--infile 'dataload.dat'
infile *
append into table t
fields terminated by '|' optionally enclosed by '"'
(
id      INTEGER EXTERNAL,
col1    CHAR
)
begindata
1|sample data clob ZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZ
2|"sample data clob 2 ZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZ"

