
  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 1118

Convert ansi string to unicode string?


I have a win32 console app. All I need to do is convert a command line argument to a unicode string, but it seems to always fail.
For instance:

int main(int argc, _TCHAR* argv[])
{
    // Copy argv[1] into a unicode string.
    wchar_t wcMyUnicodeString[MAX_PATH];
    MultiByteToWideChar(CP_ACP, MB_PRECOMPOSED, argv[1], (int)strlen(argv[1]), wcMyUnicodeString, MAX_PATH);

    // This prints out just fine.
    fwprintf(stdout, wcMyUnicodeString);

    // Now I want to pass it to this interface:

but it always fails at SetApplicationName()! It works just fine if I use a Unicode literal such as:

SetApplicationName(L"some name");

SetApplicationName() takes an LPCWSTR as its only parameter — is that not compatible with my wchar_t array? I thought it would be the same as a literal string preceded with the capital L.

Please help me.
1 Solution
Actually, that should work. Would the following help instead?

wsprintfW(wcMyUnicodeString,L"%S", argv[1]);


minnirokAuthor Commented:
That's it!

Thanks again jkr. Once I get some free time, I'll read up on what its actual malfunction was!
