Sometimes we are asked to support multilingual data in our database. This requirement can come up during the design phase or later, and it can change from day to day. I came across one such scenario where all my data was failing while moving from one instance to another.
I will not be covering why the data is being moved from one instance to another; both instances run the same database software, Oracle.
Sometimes we fall into the bad habit of just printing DBMS output or silently swallowing exceptions:
...
...
exception
   when others then
      null;
end;
That makes it really hard to track down the exact error; the best practice is to capture errors and maintain proper logs.
As I mentioned earlier, we are moving data from one instance to another, and the main difference between them is the NLS_CHARACTERSET setting.
My source:
select * from NLS_DATABASE_PARAMETERS
 where parameter in ('NLS_LENGTH_SEMANTICS','NLS_CHARACTERSET')
/

PARAMETER                      VALUE
------------------------------ ----------------------------------------
NLS_CHARACTERSET               WE8MSWIN1252
NLS_LENGTH_SEMANTICS           BYTE
My target system's NLS settings are as below:
select * from NLS_DATABASE_PARAMETERS
 where parameter in ('NLS_LENGTH_SEMANTICS','NLS_CHARACTERSET')
/

PARAMETER                      VALUE
------------------------------ ----------------------------------------
NLS_CHARACTERSET               AL32UTF8
NLS_LENGTH_SEMANTICS           BYTE
Here is a sample table to demonstrate the error:
CREATE TABLE varchar2_test (col1 VARCHAR2(10));

Table VARCHAR2_TEST created.
Now let's check the column's data type and character semantics (CHAR_USED):
SELECT table_name, column_name, data_type, char_used, data_length
  FROM all_tab_columns c
 WHERE table_name IN ('VARCHAR2_TEST');

TABLE_NAME      COLUMN_NAME  DATA_TYPE  CHAR_USED  DATA_LENGTH
VARCHAR2_TEST   COL1         VARCHAR2   B          10
Now let's enter some sample data on our source database:
insert into VARCHAR2_TEST values ('slobārāyéé');
commit;

1 row inserted.
So the data was inserted without any issues. Now let us repeat the same steps on the target instance:
CREATE TABLE varchar2_test (col1 VARCHAR2(10));

Table VARCHAR2_TEST created.
Verify the CHAR_USED setting for the same column:
SELECT table_name, column_name, data_type, char_used, data_length
  FROM all_tab_columns c
 WHERE table_name IN ('VARCHAR2_TEST');

TABLE_NAME      COLUMN_NAME  DATA_TYPE  CHAR_USED  DATA_LENGTH
VARCHAR2_TEST   COL1         VARCHAR2   B          10
Everything looks the same, so now let us try to insert one row with the same value:
insert into VARCHAR2_TEST values ('slobārāyéé') ;
Running the above insert statement produces the following error:
Error starting at line : 13 in command -
insert into VARCHAR2_TEST values ('slob?r?yéé')
Error report -
ORA-12899: value too large for column "VARCHAR2_TEST"."COL1" (actual: 14, maximum: 10)
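The "actual: 14, maximum: 10" in the error is simply bytes versus characters: in AL32UTF8, characters such as ā and é occupy two bytes each, so the 10-character value needs 14 bytes. A quick way to see this on the target database is Oracle's LENGTH (characters) and LENGTHB (bytes) functions; the byte count shown in the comment assumes an AL32UTF8 database:

```sql
-- Character count vs. byte count of the sample value.
-- On an AL32UTF8 database the 4 accented characters take 2 bytes each,
-- so 10 characters expand to 14 bytes.
SELECT LENGTH('slobārāyéé')  AS char_len,   -- 10
       LENGTHB('slobārāyéé') AS byte_len    -- 14 on AL32UTF8
  FROM dual;
```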
So even though the structure is identical, the data itself causes the issue on the AL32UTF8 database, because the column limit is counted in bytes. To avoid this we can create the table with CHAR semantics, or set the session default (for example via a logon trigger), so that whenever we create the structure the columns use CHAR semantics.
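Rather than writing CHAR on every column, the session default can be changed with the NLS_LENGTH_SEMANTICS parameter (this is what a logon trigger would set). A minimal sketch, using a hypothetical table name, assuming a fresh session:

```sql
-- Make new VARCHAR2/CHAR columns default to character semantics
-- for this session only (existing columns are not changed).
ALTER SESSION SET NLS_LENGTH_SEMANTICS = CHAR;

-- Created after the ALTER SESSION, this column gets CHAR semantics
-- even though CHAR is not written explicitly.
CREATE TABLE varchar2_test_sem (col1 VARCHAR2(10));
```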
For example, say we create the same table with CHAR semantics:
CREATE TABLE varchar2_test_char (col1 VARCHAR2(10 char));
Now let's try inserting the same value into this newly created table:
insert into varchar2_test_char values ('slobārāyéé') ;
1 row inserted.
So we can see that if we change the column to CHAR semantics, we overcome this error. It could be a challenge to change all the VARCHAR2 columns by hand, so here is a simple script that converts all existing BYTE columns to CHAR.
SET SERVEROUTPUT ON;
DECLARE
BEGIN
   FOR x IN (
      SELECT c.owner, c.table_name, c.column_name, c.data_type, c.data_length
        FROM all_tab_columns c
       WHERE data_type LIKE '%CHAR%'
         AND owner = 'USERNAME'
         AND char_used = 'B'
         AND NOT EXISTS (
                SELECT 1
                  FROM user_views v
                 WHERE v.view_name = c.table_name
             )
   )
   LOOP
      dbms_output.put_line('ALTER TABLE ' || x.owner || '.' || x.table_name
         || ' MODIFY ' || x.column_name || ' ' || x.data_type
         || '(' || x.data_length || ' CHAR);');
      BEGIN
         EXECUTE IMMEDIATE 'ALTER TABLE ' || x.owner || '.' || x.table_name
            || ' MODIFY ' || x.column_name || ' ' || x.data_type
            || '(' || x.data_length || ' CHAR)';
      EXCEPTION
         WHEN OTHERS THEN
            dbms_output.put_line('EXCEPTION for:- ALTER TABLE ' || x.owner || '.' || x.table_name
               || ' MODIFY ' || x.column_name || ' ' || x.data_type
               || '(' || x.data_length || ' CHAR);');
      END;
   END LOOP;
EXCEPTION
   WHEN OTHERS THEN
      dbms_output.put_line('issue in select');
END;
/
It's often a good idea to print the statements before executing them, and always take a backup first, so that if any issues arise you will be able to roll the changes back.
You will also notice that views are excluded from the script above. There are multiple reasons for this: views may contain derived columns, for example a CASE expression built over base columns, and for those we have to follow the steps shown below:
SET SERVEROUTPUT ON;
DECLARE
BEGIN
   FOR x IN (
      SELECT c.owner, c.table_name, c.column_name, c.data_type, c.data_length
        FROM all_tab_columns c
       WHERE data_type LIKE '%CHAR%'
         AND char_used = 'B'
         AND owner = 'USERNAME'
         AND EXISTS (
                SELECT 1
                  FROM user_views v
                 WHERE v.view_name = c.table_name
             )
   )
   LOOP
      dbms_output.put_line('ALTER VIEW ' || x.owner || '.' || x.table_name || ' COMPILE');
      BEGIN
         EXECUTE IMMEDIATE 'ALTER VIEW ' || x.owner || '.' || x.table_name || ' COMPILE';
      EXCEPTION
         WHEN OTHERS THEN
            dbms_output.put_line('EXCEPTION for:- ALTER VIEW ' || x.owner || '.' || x.table_name || ' COMPILE');
      END;
   END LOOP;
EXCEPTION
   WHEN OTHERS THEN
      dbms_output.put_line('issue in select');
END;
/
But even after executing the above script we might still see some view columns reported as BYTE; those can still be converted to CHAR.
Say we have a view that uses a CASE expression like the one below:
MAX(CASE WHEN col1= 'acckey' THEN col2 END) colkey
From the above we can see that when the view derives a column through a CASE expression, the BYTE-to-CHAR conversion script cannot change it, so we need to convert it manually using CAST,
As shown below:
MAX(CASE WHEN col1= 'acckey' THEN cast (col2 as varchar2(4000 CHAR)) END) colkey
Before making the above change, extract the DDL of the view and save it, then drop the view. Finally, modify the statement and recompile the view. In all of the scripts, replace 'USERNAME' with the actual schema name.
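Extracting and saving the view DDL can be done with DBMS_METADATA; a sketch, where VIEW_NAME and USERNAME are placeholders for your own view and schema:

```sql
-- Extract the complete DDL of the view so it can be saved,
-- edited (adding the CAST), and re-created.
SELECT DBMS_METADATA.GET_DDL('VIEW', 'VIEW_NAME', 'USERNAME')
  FROM dual;
```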
We might see another issue while altering the table if the functional index is defined on one of the columns from the table.
Error:
ORA-30556: either functional or bitmap join index is defined on the column to be modified
*Cause:  An ALTER TABLE MODIFY COLUMN was issued on a column on which either a functional index or bitmap join index exists.
*Action: Drop the functional or bitmap join index before attempting to modify the column.
To avoid the above error, drop the functional (or bitmap join) index, modify the column, and then recreate the index.
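The sequence might look like the following sketch, where the index name idx_func and its expression are hypothetical placeholders for your own objects:

```sql
-- 1. Save the index definition, then drop the functional index.
DROP INDEX idx_func;

-- 2. Modify the column now that no functional index blocks the change.
ALTER TABLE varchar2_test MODIFY (col1 VARCHAR2(10 CHAR));

-- 3. Recreate the index from its saved definition.
CREATE INDEX idx_func ON varchar2_test (UPPER(col1));
```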
To verify the changes run the following statement:
SELECT c.owner, c.table_name, c.column_name, c.data_type, c.data_length, c.char_used
  FROM all_tab_columns c
 WHERE data_type LIKE '%CHAR%'
   AND char_used = 'B'
   AND owner = 'USERNAME';
The above statement should return zero rows.
So from the above steps, we now know how to convert VARCHAR2 columns from BYTE to CHAR semantics. Note that the exact approach varies with each business use case and cannot be fully generalized.
Thank you for reading my article, please feel free to leave me some feedback or to suggest any future topics. I'll be looking forward to hearing from you – Swadhin Ray (Sloba)
For more information about me, please check out my Experts Exchange Profile page.