Quick (easy?) question about the SQL bit data type
Posted on 2007-10-16
I'm just wondering for peace of mind:
I was recently developing an application and, when working in my dev environment (dev db), I was able to pass in the varchar values 'true' and 'false' to a bit column. This worked fine; however, when I moved to production, I got an error saying SQL could not convert the varchar values to a bit value. Hmmm. It wasn't a huge problem — I just changed the values to 1 and 0 for true and false.
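To make the symptom concrete, here's a minimal sketch of the kind of statements involved. The table and column names are made up for illustration; my real schema differs:

```sql
-- Hypothetical table, just to illustrate the issue
CREATE TABLE dbo.Settings (IsActive bit);

-- This worked on the dev server (the strings were converted):
INSERT INTO dbo.Settings (IsActive) VALUES ('true');
INSERT INTO dbo.Settings (IsActive) VALUES ('false');

-- On production the same inserts raised a conversion error,
-- so I switched to numeric literals, which worked everywhere:
INSERT INTO dbo.Settings (IsActive) VALUES (1);
INSERT INTO dbo.Settings (IsActive) VALUES (0);
```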
I guess my question is: why would this differ between SQL Servers? Is there a server setting, or is this due to different versions? I was just wondering why this was the case.
Thanks in advance! (I'll be happy to give points to the best response(s).)