Hello experts!
We have a C# application that uses the double data type to calculate money for exchange-rate processing. The exchange rate is currently defined in SQL Server 2008 as a REAL data type, which is an 'approximate' data type as defined by MS:
http://msdn.microsoft.com/en-us/library/ms187912.aspx
So, I need to convert the REAL data type to an exact data type like DECIMAL. However, when applying the ALTER COLUMN statements below I am losing precision.
Is there a data type definition that will give me the result 1.0500000000000000? Thanks!
---------------------
CREATE TABLE table1 (field1 real);
INSERT INTO table1 SELECT 1.05
SELECT * from table1
-- Try to convert field1 from REAL to an exact-precision data type --
ALTER TABLE table1
--ALTER COLUMN field1 float(53) /* returns=1.0499999995231628 */
-- ALTER COLUMN field1 decimal(38,16) /* returns=1.0499999995231628 */
--ALTER COLUMN field1 decimal(38,8) /* returns=1.04999995 */
ALTER COLUMN field1 decimal(38,7) /* returns=1.0500000 */
-- I want 16 digits to the right of the decimal point, but converted with exact precision
-- Can any datatype definition give me 1.0500000000000000 when converting from a REAL?
select * from table1
DROP TABLE table1
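For what it's worth, the drift can be reproduced outside SQL Server. Here is a quick sanity check in Python (not part of our C# app, just an illustration), round-tripping 1.05 through IEEE 754 single precision, which is the format REAL uses:

```python
import struct

# REAL in SQL Server is an IEEE 754 single-precision (32-bit) float.
# Packing 1.05 with format 'f' rounds it to the nearest representable
# single-precision value; unpacking widens it back to a double with no
# further loss, so 'stored' is exactly what the REAL column holds.
stored = struct.unpack('f', struct.pack('f', 1.05))[0]

print(f"{stored:.16f}")   # the value the REAL column actually stores
print(stored == 1.05)     # False: 1.05 is not exactly representable
```

The printed value drifts from 1.05 around the eighth significant digit, which matches the kind of results I'm seeing in the DECIMAL conversions above.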