steva
asked on
CString behavior
I notice that if I assign a wide string to a CString the resulting CString is automatically converted from wide to char, as in the example below...
CString str;
str = val.bstrVal;
where val is a VARIANT filled in by a query to WMI. The BSTR at val.bstrVal contains wide characters terminated by a NULL, and after the assignment str holds the same text, but as chars. Does anyone know where this "cast" behavior is documented, if it is, and whether I can rely on it always happening?
Thanks,
Steve
ASKER
You're right. It's not. But does that empower the compiler to automatically "cast" wide strings to character strings? If I hadn't stumbled onto this I would have thought that some sort of conversion call would be necessary to convert the wide string to a char string before I assigned it to a CString. And if this is what the compiler does indeed do, where is it documented? Can I always count on it?
Again, thanks for any input,
Steve
SOLUTION
ASKER
Ok. So what you seem to be saying is that this automatic cast behavior is a fluke and I shouldn't count on it.
It's not that I want to "keep" UNICODE or "keep" ANSI. This conversion from wide to char is exactly what I wanted to do, and the simple assignment seems cleaner than hauling in atlbase.h and applying W2A - if it can be counted on.
ASKER CERTIFIED SOLUTION
ASKER
Ok guys. Thanks for the discussion. I split the points.
ASKER
I just came across the following information in Que's "Practical Visual C++ 6" by Jon Bates and Tim Tompkins. They're talking on page 660 about how to convert from a CString to a BSTR, and then say "The reverse translation can be performed by using the (char*) and (const char*) casts to turn the BSTR into a null-terminated string."
So an explicit cast certainly works, and I guess when the compiler sees that the CString is the char version, rather than the UNICODE type, it will also do the cast automatically, sort of like when you assign a byte to an integer.
Anyway, I thought I'd toss that in here, in case anyone other than me had an interest.
Steve
That sounds like your C++ code is not compiled with the _UNICODE define (Project Properties).
Share and Enjoy Christoph