anusankarpn
asked on
Conversion of Varbinary to nvarchar in SQL Server.
My requirement is to write a Transact-SQL function that reads the binary data in a varbinary(max) column and converts it to nvarchar(max). (The data in the varbinary(max) column was written by a Java application using UTF-8 encoding.)
When I use the CAST keyword to convert the varbinary to nvarchar, all ASCII characters are converted properly, but Unicode characters (e.g. Japanese) are not converted correctly.
Please provide a solution for this.
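For context, here is a minimal repro of the problem and one possible fix. The fix assumes SQL Server 2019 or later, which introduced UTF-8 collations; the literal bytes below are the UTF-8 encoding of a short Japanese string, chosen only for illustration:

```sql
-- UTF-8 bytes for the Japanese string N'日本語'
DECLARE @utf8 varbinary(max) = 0xE697A5E69CACE8AA9E;

-- Naive cast reinterprets the bytes as UTF-16 code units: mojibake.
SELECT CAST(@utf8 AS nvarchar(max)) AS naive_cast;

-- On SQL Server 2019+, route through a varchar with a UTF-8 collation
-- so the engine decodes the bytes as UTF-8 before widening to nvarchar.
SELECT CAST(
         CAST(@utf8 AS varchar(max)) COLLATE Latin1_General_100_CI_AS_SC_UTF8
         AS nvarchar(max)) AS utf8_decoded;  -- N'日本語'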
ASKER
Currently our Unicode data is stored in a varbinary(max) column. We are trying to drop this column and store the data in a new nvarchar(max) column.
We were trying to do this conversion in the database itself, rather than fetching all the data into the application, converting it, and storing it back into the new column.
We were able to find a solution for Oracle, but in SQL Server we are not sure how to specify the encoding while converting. Is there a stored procedure or built-in function that can do this?
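If the server is SQL Server 2019 or later, the in-database migration can be a single set-based UPDATE using a UTF-8 collation. The table and column names below (`dbo.Docs`, `BinData`, `TextData`) are hypothetical placeholders, not taken from the question:

```sql
-- Hypothetical schema: dbo.Docs(BinData varbinary(max)); add the new text column.
ALTER TABLE dbo.Docs ADD TextData nvarchar(max) NULL;
GO
-- Decode the UTF-8 bytes in place via a UTF-8 collation (SQL Server 2019+).
UPDATE dbo.Docs
SET    TextData = CAST(
           CAST(BinData AS varchar(max)) COLLATE Latin1_General_100_CI_AS_SC_UTF8
           AS nvarchar(max));
GO
-- Once the converted data is verified, drop the old column.
-- ALTER TABLE dbo.Docs DROP COLUMN BinData;
```

On large tables it may be worth batching the UPDATE to keep the transaction log manageable.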
As the comment above says, CONVERT cannot know the application-specific rules for how to interpret the data.
Unless you create your own conversion function in T-SQL and use that function instead, I see no other way to do it, and certainly not with plain T-SQL CONVERT. ...
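On versions of SQL Server without UTF-8 collations (pre-2019), such a hand-rolled conversion function might look like the sketch below. The function name is hypothetical, it only handles 1- to 3-byte UTF-8 sequences (the Basic Multilingual Plane, which covers Japanese), it does no validation of malformed input, and decoding one character per loop iteration makes it slow on large values:

```sql
CREATE FUNCTION dbo.Utf8ToNvarchar (@b varbinary(max))
RETURNS nvarchar(max)
AS
BEGIN
    DECLARE @out nvarchar(max) = N'', @i int = 1, @len int = DATALENGTH(@b);
    DECLARE @b1 int, @b2 int, @b3 int, @cp int;
    WHILE @i <= @len
    BEGIN
        SET @b1 = CAST(SUBSTRING(@b, @i, 1) AS int);
        IF @b1 < 0x80                       -- 1-byte sequence (ASCII)
            SELECT @cp = @b1, @i = @i + 1;
        ELSE IF @b1 < 0xE0                  -- 2-byte sequence
        BEGIN
            SET @b2 = CAST(SUBSTRING(@b, @i + 1, 1) AS int);
            SELECT @cp = (@b1 & 0x1F) * 64 + (@b2 & 0x3F), @i = @i + 2;
        END
        ELSE                                -- 3-byte sequence (e.g. Japanese)
        BEGIN
            SET @b2 = CAST(SUBSTRING(@b, @i + 1, 1) AS int);
            SET @b3 = CAST(SUBSTRING(@b, @i + 2, 1) AS int);
            SELECT @cp = (@b1 & 0x0F) * 4096 + (@b2 & 0x3F) * 64 + (@b3 & 0x3F),
                   @i = @i + 3;
        END
        SET @out = @out + NCHAR(@cp);
    END
    RETURN @out;
END
```

A SQLCLR function wrapping .NET's `Encoding.UTF8.GetString` would be the faster, more robust route on those older versions.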
ASKER CERTIFIED SOLUTION
In a varbinary(max) column you are free to store anything; you don't have to stick to the 2-byte (UTF-16 code unit) boundary that nvarchar expects. If the characters are shifted off that boundary, a plain CONVERT cannot read them back correctly. The web application, on the other hand, can read the data back because it knows the format.
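The boundary point can be seen with pure ASCII of odd length. The three UTF-8 bytes below are `'abc'`; a direct cast to nvarchar consumes them as little-endian 2-byte code units, so the pairing is wrong from the first character:

```sql
-- 'abc' in UTF-8 is 0x616263: three bytes, not a multiple of UTF-16's
-- 2-byte code units, so a direct cast pairs the bytes incorrectly.
DECLARE @bytes varbinary(max) = 0x616263;
SELECT CAST(@bytes AS nvarchar(max));  -- bytes 0x61,0x62 become one CJK-range
                                       -- code unit, not the letters 'ab'
```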
Would it not be possible to read the binary data back through the same application that generated it?