Strings in UNICODE

Hi !

Below is a simple test program:

#include "tchar.h"

#define UNICODE
#define _UNICODE

int main(int argc, char* argv[])
{

     TCHAR MyString[80];
     _stprintf(MyString,_T("URL is i %d"),6);

     //wprintf(MyString,"%s");
     printf(MyString,"LLL");
     return 0;
}

As far as I understand, when UNICODE is defined, TCHAR should be treated as a Unicode string type, and if not, as an ANSI string type. However, it seems that this code always compiles as ANSI: when I use the wprintf function, even with UNICODE defined, I always get error C2664 - cannot convert parameter 1 from 'char [80]' to 'const unsigned short *'. At the same time, the printf function always compiles OK.

What is going on there ?
What is the right way to do Unicode string operations ?

Thanks a lot !
ab11 asked:
 
nietod commented (accepted solution):
Apparently they also affect procedures and code defined in the C++ run-time library (RTL), too. However, you need to define them BEFORE including those files, i.e. those files contain conditional compilation (#ifdef ... #endif) that uses this define, so defining this value after them is too late.
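
Conceptually, tchar.h contains something along these lines (a simplified sketch for illustration, not the actual header contents):

// simplified sketch of the conditional compilation inside tchar.h
#ifdef _UNICODE
    typedef wchar_t TCHAR;
    #define _T(x)     L##x
    #define _stprintf swprintf
#else
    typedef char TCHAR;
    #define _T(x)     x
    #define _stprintf sprintf
#endif

So if _UNICODE is not yet defined when tchar.h is read, TCHAR ends up as plain char, which is exactly the 'char [80]' the compiler complains about.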

You need to move the #defines to the top of the program, or define them using a command-line parameter for the compiler.
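
For illustration, the test program rearranged that way could look like this (just a sketch, keeping the same old-style _stprintf call as in the question):

#define UNICODE
#define _UNICODE

#include <tchar.h>
#include <stdio.h>

int main(int argc, char* argv[])
{
    TCHAR MyString[80];

    // with _UNICODE seen first, TCHAR is wchar_t and _stprintf is the wide version
    _stprintf(MyString, _T("URL is i %d"), 6);

    // _tprintf maps to printf or wprintf to match TCHAR
    _tprintf(_T("%s\n"), MyString);

    return 0;
}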
 
nietod commented:
Those definitions will affect code found in windows.h (and files it includes).   But you don't appear to be including windows.h.
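
For instance, the generic API names in windows.h are picked by that define; the pattern is roughly this (simplified sketch, not the actual header):

// simplified sketch of the A/W selection windows.h does for each API
#ifdef UNICODE
    #define MessageBox  MessageBoxW   // wide-character version
#else
    #define MessageBox  MessageBoxA   // ANSI version
#endif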
 
ccaprar commented:
Put the two UNICODE and _UNICODE defines in Project\Settings\C/C++\Preprocessor Definitions.

This applies if you have VC++ :)

Your code will work just fine, and you won't have to worry about putting the defines in some specific place in your code.

If you use something else, be careful to define the two before ANY includes of .h files :)
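
If you build from the command line instead, the equivalent would be something like this (the file name here is just an example):

cl /D UNICODE /D _UNICODE test.cpp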
 
jkr commented:
>>even when UNICODE is defined, I always get error C2664 -
>>cannot convert parameter 1 from 'char [80]'
>>to 'const unsigned short *'

This is not related to your string variable, but to the 'wprintf()' format string:

wprintf(MyString,"%s"); // "%s" represents an *ANSI* char* !!!

should read

wprintf(MyString,L"%s"); // L"%s" represents a *UNICODE* wchar_t* !!!


which should logically be

wprintf(L"%s", MyString);


It'd be better to use '_tprintf()', though:


_tprintf(_T("%s"), MyString);

to keep it compatible when you don't '#define UNICODE'


Example

/* PRINTF.C: This program uses the printf and wprintf functions
 * to produce formatted output.
 */

#include <stdio.h>

int main( void )
{
   char   ch = 'h', *string = "computer";
   int    count = -9234;
   double fp = 251.7366;
   wchar_t wch = L'w', *wstring = L"Unicode";

   /* Display integers. */
   printf( "Integer formats:\n"
           "\tDecimal: %d  Justified: %.6d  Unsigned: %u\n",
           count, count, count );

   printf( "Decimal %d as:\n\tHex: %Xh  C hex: 0x%x  Octal: %o\n",
            count, count, count, count );

   /* Display in different radixes. */
   printf( "Digits 10 equal:\n\tHex: %i  Octal: %i  Decimal: %i\n",
            0x10, 010, 10 );

   /* Display characters. */

   printf("Characters in field (1):\n%10c%5hc%5C%5lc\n", ch, ch, wch, wch);
   wprintf(L"Characters in field (2):\n%10C%5hc%5c%5lc\n", ch, ch, wch, wch);

   /* Display strings. */

   printf("Strings in field (1):\n%25s\n%25.4hs\n\t%S%25.3ls\n",
   string, string, wstring, wstring);
   wprintf(L"Strings in field (2):\n%25S\n%25.4hs\n\t%s%25.3ls\n",
       string, string, wstring, wstring);

   /* Display real numbers. */
   printf( "Real numbers:\n\t%f %.2f %e %E\n", fp, fp, fp, fp );

   /* Display pointer. */
   printf( "\nAddress as:\t%p\n", &count);

   /* Count characters printed. */
   printf( "\nDisplay to here:\n" );
   printf( "1234567890123456%n78901234567890\n", &count );
   printf( "\tNumber displayed: %d\n\n", count );

 
ab11 (author) commented:
OK, thanks to all
 
ab11 (author) commented:
Thanks for all your help !
In fact, it's really not very practical to define UNICODE after the include files :)
 
jkr commented:
>>In fact, it's really not very practical to define
>>UNICODE after the include files :)

Yes, but

>>wprintf(MyString,"%s");

Is

- logically wrong
- and produces the 'cannot convert...' error regardless of where you #define UNICODE
 
ccaprar commented:
jkr is totally right