Unicode/Ansi mix woes


Using VS 2010, I'm involved in developing a series of inter-related projects that have a mix of ANSI and Unicode character sets. Most projects' Character Set property is set to Unicode, but some are set to "Not Set". The non-Unicode projects need to call some Unicode functions (with appropriate translations, of course).

I refer you to the code below. In non-Unicode projects, before including Unicode project header files, I define ABV_FORCE_UNICODE_DOWNSTREAM, which tells the Unicode header file that it needs to define UNICODE and _UNICODE because it is being included by a non-Unicode project.

If I compile in release mode, it works. However, a debug compile fails. For example: in the header file below, LPCTSTR incorrectly translates to the ANSI version (typedef LPCSTR LPCTSTR), but the _T() macro in Clear_Error() correctly translates to the Unicode version, thus generating the error:
Error      1      error C2664: 'ABV_ErrorStateClass::Set_Error' : cannot convert parameter 3 from 'const wchar_t [1]' to 'LPCTSTR'      D:\abv00_common_cpp\abv00_cpp_errors.h      125      1      zzzABVtest

God I hate dealing with strings in C++. Any ideas?


// ABV_ODBC_Driver.cpp, (non-unicode)
// ==================================
#pragma once
// ...
#define ABV_FORCE_UNICODE_DOWNSTREAM
#include "../ABV00_Common/ABV00_Common_CPP/ABV00_CPP_Errors.h"
// ...

// ABV00_CPP_Errors.h, (from a unicode project)
// ============================================
// - - -- - - - - - - - - - - - - - - -
#pragma once

#ifdef ABV_FORCE_UNICODE_DOWNSTREAM

// Force UNICODE
#ifndef UNICODE
#define UNICODE
#define XUNICODE_ERROR_H
#endif

// Force _UNICODE
#ifndef _UNICODE
#define _UNICODE
#define _XUNICODE_ERROR_H
#endif

#endif	// #ifdef ABV_FORCE_UNICODE_DOWNSTREAM
// - - -- - - - - - - - - - - - - - - -


// ...

class ABV_ErrorStateClass
{
	// ...
public:

	// Get or set HRESULT return values, ala COM. SystemErrorNo is generally result of GetLastError()
	void Set_Error(int ErrorState, HRESULT Set_Hresult, LPCTSTR Set_ErrorStr, 
			char *ArgSourceRoutine, char *ArgSourceFile, char *ArgSourceFileDate, LPSTR ArgStackTrace);

	void Clear_Error() { Set_Error(0,S_OK,_T(""),"","","",""); ClearLast_WINAPI_Error();};

	// ...

};

// ...

// - - -- - - - - - - - - - - - - - - -
#ifdef ABV_FORCE_UNICODE_DOWNSTREAM
// Reset UNICODE
#ifdef XUNICODE_ERROR_H
#undef UNICODE
#undef XUNICODE_ERROR_H
#endif

// Reset _UNICODE
#ifdef _XUNICODE_ERROR_H
#undef _UNICODE
#undef _XUNICODE_ERROR_H
#endif

#endif	// #ifdef ABV_FORCE_UNICODE_DOWNSTREAM
// - - -- - - - - - - - - - - - - - - -


ScreenDump.png
elcbruce (Author) Commented:
Correction, it doesn't work in release either.
 
Zoppo Commented:
Hi elcbruce,

IMO you can solve this in different ways - here are two examples:

1. Implement that function twice, once for UNICODE and once for ASCII, i.e.:

#ifdef UNICODE
      void Clear_Error() { Set_Error(0,S_OK,_T(""),"","","",""); ClearLast_WINAPI_Error();};
#else
      void Clear_Error() { Set_Error(0,S_OK,"","","","",""); ClearLast_WINAPI_Error();};
#endif // UNICODE

2. Temporarily change the _T macro:

#ifndef UNICODE
#undef _T
#define _T(x) x
#endif

If you need _T to be defined as before later on, you can use the push_macro and pop_macro pragmas to restore it, as sketched below.
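
For example, a minimal sketch of that (note that _T is keyed off _UNICODE in tchar.h, and push_macro/pop_macro are MSVC-specific pragmas; the declarations in between are placeholders):

#pragma push_macro("_T")

#ifndef _UNICODE
#undef _T
#define _T(x) x          // narrow literals while this header is parsed
#endif

// ... declarations that use _T(""), e.g. the inline Clear_Error() ...

#pragma pop_macro("_T")  // _T is restored to whatever it was before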

Hope that helps,

ZOPPO

 
sarabande Commented:
If you define the UNICODE macro in a non-Unicode project, you actually turn it into a UNICODE project, which will most likely fail unless you consistently use the T types (TCHAR, LPTSTR, and more) and the _T macro for literals.

In my opinion there are only two sensible ways:

One is to leave the project non-Unicode and explicitly use the W types (wchar_t, LPWSTR, ...) when calling into UNICODE functions (see the sketch below).

The other is to port the project from ANSI to UNICODE in the way Zoppo has described.
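
A rough sketch of that first option, assuming the Unicode project exports a wide-only entry point (WideSetError and the surrounding names are made up for illustration, not from your project):

#include <windows.h>
#include <string>

// hypothetical wide-only function exported by the UNICODE project
void WideSetError(int ErrorState, HRESULT hr, LPCWSTR ErrorStr);

// the calling project stays ANSI ("Not Set") and converts explicitly
void ReportFromAnsiCode(const char* ansiMsg)
{
    int len = MultiByteToWideChar(CP_ACP, 0, ansiMsg, -1, NULL, 0);
    std::wstring wide(len, L'\0');
    MultiByteToWideChar(CP_ACP, 0, ansiMsg, -1, &wide[0], len);

    WideSetError(1, E_FAIL, wide.c_str());
}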

Sara
 
elcbruce (Author) Commented:

Thanks for your response, but you misunderstand the problem.

To clarify, UNICODE and _UNICODE are defined in the header file, not in the project settings. The project's Character Set is configured to "Not Set". According to the documentation:
1) LPCTSTR is an LPCWSTR if UNICODE is defined, an LPCSTR otherwise.  
2) With _UNICODE defined, _T translates the literal string to the L-prefixed form; otherwise, _T translates the string without the L prefix.

The problem is that the _T macro is functioning as expected, but LPCTSTR is not (it should translate to LPCWSTR, not LPCSTR).
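
For reference, the documented behavior is what you get when both macros are visible before the Windows/CRT headers are processed, e.g.:

#define UNICODE
#define _UNICODE
#include <windows.h>
#include <tchar.h>

LPCTSTR p = _T("ok");   // LPCTSTR is LPCWSTR here and _T("ok") is L"ok", so this compiles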
 
sarabande Commented:
It does not make much difference whether you define UNICODE and _UNICODE in the project's preprocessor settings or in header files. If you put them in stdafx.h, and stdafx.h were included by every .cpp file as its first header, the effect on the project would be the same.

But if you define the macros in a header included after stdafx.h, you make it worse, because from the point of the definition onward all your types (LPTSTR, TCHAR, CString, ...) turn to wide char, which normally makes a mess no one can solve.

You can't drive a project with a "floating" UNICODE macro definition.
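
A minimal sketch of why that fails, assuming <windows.h> is already pulled in (e.g. via stdafx.h) before the forced defines, while <tchar.h> is first seen afterwards (Set_Error is simplified to one parameter here):

#include <windows.h>     // UNICODE not defined yet: LPCTSTR is fixed as LPCSTR for this translation unit

#define UNICODE          // too late - winnt.h has already been processed
#define _UNICODE
#include <tchar.h>       // first time seen: _T(x) now expands to L##x

void Set_Error(LPCTSTR s);   // parameter type is still const char*

void Clear_Error()
{
    Set_Error(_T(""));       // const wchar_t[1] passed where LPCSTR is expected -> C2664
}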

Sara
 
elcbruce (Author) Commented:
Yea, I was afraid of that. Thanks to all.
