Solved

ORA-12899: value too large for column "TGT_SCHEMA"."CUSTOMERS"."ZIPCODE"

Posted on 2013-05-11
1,116 Views
Last Modified: 2013-06-03
When trying to load data from the source to the target using the following SQL, I get this error:

insert into tgt_schema.customer_tab
select *
from src_schema.customer_tab;


ERROR at line 3:
ORA-12899: value too large for column "TGT_SCHEMA"."CUSTOMER_TAB"."ZIPCODE"
(actual: 16, maximum: 10)

I altered the zipcode column in the target schema to varchar2(32), re-ran the SQL above, and the insert ran fine.
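
The change was roughly along these lines (a sketch only; the exact statement isn't shown in the question, and it assumes the default BYTE length semantics):

-- widen the target column so the insert can complete
alter table tgt_schema.customer_tab modify (zipcode varchar2(32));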

Now I have two questions:

1. How do I find the record in the source that exceeded varchar2(10)?
2. How did the source contain this value when the source column is also varchar2(10)?

Please advise, and let me know if you need any other information.

Thanks
Question by:gs79
12 Comments
 
LVL 74

Expert Comment

by:sdstuber
ID: 39158937
select * from "TGT_SCHEMA"."CUSTOMER_TAB" where length(zipcode) > 10
 

Author Comment

by:gs79
ID: 39158958
Thanks sdstuber

I think you meant src_schema instead of tgt_schema:
select * from "SRC_SCHEMA"."CUSTOMER_TAB" where length(zipcode) > 10

My other question is: how did this record exist in the source when we have the same constraint on the source table too? Please let me know.

Thanks..
 

Author Comment

by:gs79
ID: 39158962
Also, I didn't see any record in the source where the length is greater than 10:

select count(*)
from "SRC_SCHEMA"."CUSTOMER_TAB" where length(zipcode) > 10

returned 0

This is strange..

Thanks

 
LVL 74

Assisted Solution

by:sdstuber
sdstuber earned 600 total points
ID: 39158966
Actually, I did mean the target schema.

The only way the value could exist is one of the following:

1. You're mistaken about the source being constrained to 10 characters.

2. There is a trigger on the target table, and the error arises from larger values being inserted than you intended.

3. Characters vs. bytes: if the source column is defined as 10 characters where the characters are multi-byte, and the target column is defined as 10 bytes, then an 8-character value in the source can be too big for the target.
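
To illustrate option 3 (a minimal sketch with hypothetical table names, assuming an AL32UTF8 database where accented characters take 2 bytes):

-- BYTE vs CHAR length semantics
create table demo_byte (zipcode varchar2(10 byte));
create table demo_char (zipcode varchar2(10 char));

insert into demo_char values ('ïïïïïïïï');  -- succeeds: 8 characters
insert into demo_byte values ('ïïïïïïïï');  -- fails with ORA-12899: 16 bytes exceed 10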
 

Author Comment

by:gs79
ID: 39158986
This is the column definition in both source and target after a DESCRIBE. I think we can rule out option 2, since I verified there is no trigger.

The source and target were both defined as:

 VARCHAR2 (10 Byte)

I tried your query above on TGT_SCHEMA as well, and no rows were returned:

select count(*)
from "TGT_SCHEMA"."CUSTOMER_TAB" where length(zipcode) > 10

Then is it option 3? With the column defined as VARCHAR2(10 BYTE) in both places, don't the source and target have the same definition?

I am still not able to find the culprit record.

Please advise.

Thanks
 
LVL 49

Accepted Solution

by:PortletPaul
PortletPaul earned 1400 total points
ID: 39159060
try LENGTHB

select * from "SRC_SCHEMA"."CUSTOMER_TAB" where lengthb(zipcode) >= 16

I've used 16 because of "actual: 16, maximum: 10"; if 16 produces no results, drop the number back.
 

Author Comment

by:gs79
ID: 39160062
Thanks PortletPaul..

I was able to find it using LENGTHB.

This is how it looks in source vs. target:

select zipcode, lengthb(zipcode) length_in_bytes, length(zipcode) length_in_char
from "SRC_SCHEMA"."CUSTOMER_TAB" where lengthb(zipcode) >= 10

(and the same query against the target, after I modified its definition from varchar2(10 byte) to varchar2(32))

Source:

ZIPCODE      LENGTH_IN_BYTES      LENGTH_IN_CHAR
ïïïïïïïï     8                    8

Target:

ZIPCODE      LENGTH_IN_BYTES      LENGTH_IN_CHAR
ïïïïïïïï     16                   8

The same 8-character zipcode takes 16 bytes in the target. The only difference is that the source database is 10g and the target is 11g.

The character set of the target is UTF8; I have not yet been able to find that information for the source.

Could a difference in character sets be causing this problem? We modeled the target tables on the source tables, so if the character sets differ we probably have to change the column definitions.
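
A quick way to check (an editorial sketch, assuming access to the data dictionary on each database):

-- run on both source and target and compare
select parameter, value
from nls_database_parameters
where parameter in ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET')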

Please advise.

Thanks,
 
LVL 49

Expert Comment

by:PortletPaul
ID: 39160088
>> Could a difference in character sets be causing this problem?
Probably; it certainly looks like there is a difference.

What is certain is that extended UTF8 characters require more than 1 byte.

The bigger question is: should a zipcode accept UTF8 extended characters at all?
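
If the answer is no, one option (a hypothetical constraint, not something proposed in the thread) is to reject any value containing multi-byte characters:

-- bytes equal characters only when every character is single-byte
alter table tgt_schema.customer_tab
  add constraint customer_zipcode_ascii_chk
  check (lengthb(zipcode) = length(zipcode));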
 
LVL 74

Expert Comment

by:sdstuber
ID: 39160189
If the length in bytes is 16,  then the field can't be defined as varchar2(10 byte)
 
LVL 49

Expert Comment

by:PortletPaul
ID: 39160209
If the data is accepted at the source as varchar2(10) but expands under UTF8 at the target (and this may be an idiot suggestion giving rise to mirth), could (should?) NLS_LANG be changed at session level on the target during the import?
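
For what it's worth (an editorial aside, not from the thread): NLS_LANG is a client-side environment setting and would not change how the target database stores the data. A more common workaround is to give the target column character length semantics, so the limit is counted in characters rather than bytes, for example:

-- keep the 10-character business limit, but count characters instead of bytes
alter table tgt_schema.customer_tab modify (zipcode varchar2(10 char));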
 
LVL 74

Expert Comment

by:sdstuber
ID: 39160212
Oh sorry, I was reading the output above backwards (as 8 bytes on the target and 16 on the source, when it is the other way round). Never mind.

Yes, changing character sets could cause the problem seen here.
 
LVL 49

Expert Comment

by:PortletPaul
ID: 39160224
This might help identify inconsistencies; run it in both source and target:

select
  d.parameter parameter
, d.value value
, i.value instance_value
, s.value session_value
from nls_database_parameters d
left join nls_instance_parameters i on d.parameter = i.parameter
left join nls_session_parameters s on d.parameter = s.parameter
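
To focus on the settings most relevant here, one could append a filter such as:

where d.parameter in ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET', 'NLS_LENGTH_SEMANTICS')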
