NEXT IT asked:

InputBaseN() returns different value in ColdFusion 10 and CF2016

We recently upgraded from CF10 to CF2016 and are now getting the error 'Cannot convert the value 4.023233417E9 to an integer because it cannot fit inside an integer'. I was able to narrow it down to the initialization of the MD5 buffer. This is the code in question:
<CFSET h = ArrayNew(1)>
<CFSET h[1] = InputBaseN("0x67452301",16)>
<CFSET h[2] = InputBaseN("0xefcdab89",16)>
<CFSET h[3] = InputBaseN("0x98badcfe",16)>
<CFSET h[4] = InputBaseN("0x10325476",16)>

<cfdump var="#h#">

The problem is the value being returned by InputBaseN(). On the CF10 server the result looks like this:
1      1732584193
2      -271733879
3      -1732584194
4      271733878

However, on the server running CF2016, it looks like this:
1      1732584193
2      4023233417
3      2562383102
4      271733878

Why would the values for 2 and 3 change based on the CF version? And is there a way to fix this?
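
A minimal sketch of one possible workaround (this is not the members-only accepted answer below; the toSigned32() helper is hypothetical): assuming the goal is to reproduce CF10's signed 32-bit results, fold any value above 2^31 - 1 back into the signed range. The helper is a no-op on CF10, where the values are already signed.

<cffunction name="toSigned32" returntype="numeric" output="false">
    <cfargument name="value" type="numeric" required="true">
    <!--- Values above 2^31 - 1 encode negative 32-bit numbers in two's complement --->
    <cfif arguments.value GT 2147483647>
        <cfreturn arguments.value - 4294967296>
    <cfelse>
        <cfreturn arguments.value>
    </cfif>
</cffunction>

<cfset h = ArrayNew(1)>
<cfset h[1] = toSigned32(InputBaseN("0x67452301", 16))> <!--- 1732584193 --->
<cfset h[2] = toSigned32(InputBaseN("0xefcdab89", 16))> <!--- -271733879 --->
<cfset h[3] = toSigned32(InputBaseN("0x98badcfe", 16))> <!--- -1732584194 --->
<cfset h[4] = toSigned32(InputBaseN("0x10325476", 16))> <!--- 271733878 --->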
ASKER CERTIFIED SOLUTION
_agx_ (members-only solution text not shown)
NEXT IT (Asker):
Thanks for the help - worked like a charm. Gonna add this to their bug tracker and hope this isn't another one of Adobe's "undocumented features"
_agx_: Yeah, hope not but ... it's kind of a weird change. CF has always had a 32-bit int limit for most functions. To suddenly start returning an unsigned long value without warning ... smacks of something "undocumented" ;-)
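
In fact, the two dumps line up exactly if you treat the CF2016 numbers as the same 32-bit patterns read as unsigned, i.e. unsigned = signed + 2^32 for the negative values. A quick sketch to verify:

<cfoutput>
    <!--- NumberFormat avoids CF's scientific-notation display for large doubles --->
    #NumberFormat(-271733879 + 4294967296, "9999999999")# <!--- 4023233417, the CF2016 value of h[2] --->
    #NumberFormat(-1732584194 + 4294967296, "9999999999")# <!--- 2562383102, the CF2016 value of h[3] --->
</cfoutput>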
NEXT IT (Asker):
Just to circle back on this... we filed bug https://bugbase.adobe.com/index.cfm?event=bug&id=4175842 with Adobe.

Adobe then came back and said the behavior in CF10 and CF11 was actually wrong, and they "fixed" it in CF2016 Update 2. They referenced CF10 bug:

https://bugbase.adobe.com/index.cfm?event=bug&id=3712098

I'm pretty sure our code had worked without issue on CF versions prior to CF10 as well, so I'm not sure this explains that, but I can't remember that far back to say for certain.
_agx_: Interesting. Though I'm not sure I agree with how they handled it. CF has long been known to have a 32-bit int limit for most functions and operators. So of course some numbers may be too large, and could obviously be truncated when converted to an INT. However, the documentation says:

Description:  Converts string, using the base specified by radix, to an integer.

To me that says the expected result should be an integer. To suddenly decide it will now be a Long, or some other type, breaks backward compatibility. Why not throw an error, or add a new parameter/setting that defaults to the old behavior but allows developers to choose which behavior they want on a per-app basis? BTW, if the function is no longer returning a 32-bit integer - do you know what it is returning? I didn't see that mentioned in either bug report. I'm guessing:

http://docs.oracle.com/javase/7/docs/api/java/lang/Long.html#MIN_VALUE
http://docs.oracle.com/javase/7/docs/api/java/lang/Long.html#MAX_VALUE

?
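
One quick way to find out empirically (a sketch; it leans on the fact that CF numbers are backed by java.lang types, so getClass() is callable, though the exact class may vary by version and update):

<cfset x = InputBaseN("0xefcdab89", 16)>
<!--- Prints the underlying Java class of the returned value --->
<cfoutput>#x.getClass().getName()#</cfoutput>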

"I'm pretty sure our code had worked on CF versions prior to CF10 without issue"

Well, I suppose if it's something that simply depends on matching output, like hashing, or the other system uses ints too, then in theory it might work either way.