  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 10994

Convert char, string, etc... to LPCTSTR?

I'm trying to use AVIStreamOpenFromFile
http://msdn.microsoft.com/library/default.asp?url=/library/en-us/multimed/htm/_win32_avistreamopenfromfile.asp

and its second argument is LPCTSTR szFile. I am using the Qt library rather than MFC, and I cannot include the <afx.h> header without getting errors, so I am not able to use CString. How would I pass a valid argument to that function using only the C++ standard library types, such as char, std::string, etc.?

Thanks.
Asked by: EVelasco

1 Solution
 
rcarlanCommented:
The STL string class has a c_str() method. It returns a const pointer to the underlying character data: const char* for std::string, const wchar_t* for std::wstring.

E.g.

std::string str = "dir\\file.ext";
LPCSTR psPath = str.c_str();   // valid only while str is alive and unmodified

Radu
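
One caveat worth adding (my note, not part of the original answer): the pointer c_str() returns is owned by the string object, so it stays valid only while the string is alive and unmodified. For example, returning it from a function that owns the string is a bug:

#include <string>

// Hypothetical helper illustrating the pitfall:
const char* getPath()
{
    std::string s = "dir\\file.ext";
    return s.c_str();   // WRONG: s is destroyed on return, the pointer dangles
}

Copying the pointer into a local variable, as in the answer above, is fine as long as the string outlives every use of the pointer.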
 
EVelascoAuthor Commented:
It still gives an error that I cannot convert LPCSTR to LPCTSTR.
 
nabehsCommented:
try

LPCTSTR psPath = str.c_str();

instead of

LPCSTR ...
 
rcarlanCommented:
Yes and no.

It depends on whether LPCTSTR is const char* or const wchar_t*. This is controlled by whether you have _MBCS or _UNICODE defined for your project.
Under _MBCS, LPCTSTR is actually identical to LPCSTR or const char*.
Under _UNICODE, LPCTSTR is LPCWSTR or const wchar_t*.
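
For reference, here is a simplified sketch of what the Windows headers do (my summary; strictly speaking winnt.h keys off UNICODE while the CRT's tchar.h keys off _UNICODE, though Visual C++ projects normally define both together):

// Simplified view of how the Windows headers resolve TCHAR and LPCTSTR
#ifdef UNICODE
    typedef wchar_t TCHAR;
#else
    typedef char TCHAR;
#endif
typedef const TCHAR* LPCTSTR;   // LPCWSTR under UNICODE, LPCSTR otherwise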

If you get an error that you cannot convert LPCSTR to LPCTSTR, it means you are compiling your project as Unicode. In this case you have to use std::wstring instead of std::string.

std::wstring str = L"dir\\file.ext";
LPCWSTR psPath = str.c_str();

If you want to stick to the generic-text approach (i.e., TCHAR, _T, LPCTSTR, etc.), you should do something like this:

#if defined(_UNICODE)
    typedef std::wstring tstring;
#elif defined(_MBCS)
    typedef std::string tstring;
#endif

tstring str = _T("dir\\file.ext");
LPCTSTR psPath = str.c_str();

The above can be compiled under both _UNICODE and _MBCS without requiring any code changes.
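
To tie this back to the original question, here is a minimal sketch of feeding the result into AVIStreamOpenFromFile. It assumes a Visual C++ project linked against vfw32.lib; the file name and the lack of real error handling are illustrative only:

#include <windows.h>
#include <vfw.h>        // AVIStreamOpenFromFile, PAVISTREAM, streamtypeVIDEO
#include <tchar.h>      // _T, TCHAR
#include <string>
#pragma comment(lib, "vfw32.lib")

#if defined(_UNICODE)
    typedef std::wstring tstring;
#else
    typedef std::string tstring;
#endif

int main()
{
    AVIFileInit();      // must precede any other AVIFile/AVIStream call

    tstring path = _T("dir\\file.ext");   // illustrative path
    PAVISTREAM pStream = NULL;

    // c_str() yields const TCHAR*, which matches the LPCTSTR parameter.
    HRESULT hr = AVIStreamOpenFromFile(&pStream, path.c_str(),
                                       streamtypeVIDEO, 0, OF_READ, NULL);
    if (SUCCEEDED(hr))
        AVIStreamRelease(pStream);

    AVIFileExit();
    return 0;
}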

Radu
 
EVelascoAuthor Commented:
_T comes from the tchar.h header? I included it and get this error message:

Code:
#if defined(_UNICODE)
    typedef std::wstring filename;
#elif defined(_MBCS)
    typedef std::string filename;
#endif
...
filename file = _T("data/face2.avi");
LPCTSTR filepath = file.c_str();
...


Error:
 error C2440: 'initializing' : cannot convert from 'char [15]' to 'std::basic_string<_Elem,_Traits,_Ax>'
        with
        [
            _Elem=wchar_t,
            _Traits=std::char_traits<wchar_t>,
            _Ax=std::allocator<wchar_t>
        ]
 
rcarlanCommented:
This is very interesting. You have _UNICODE defined (because filename resolves to std::wstring), but _T resolves to nothing (it should resolve to L). Basically, after the pre-processing stage, you should've gotten this:

typedef std::wstring filename;
...
filename file = L"data/face2.avi";
LPCWSTR filepath = file.c_str();
...

But instead you got this:

typedef std::wstring filename;
...
filename file = "data/face2.avi";
LPCWSTR filepath = file.c_str();
...

I don't know what compiler you're using or what your project settings are, but I can only think of one of these as the cause of your problem:

1. The _UNICODE symbol is not defined at a project level, but it is defined in the compilation unit that contains the code fragment. You use pre-compiled headers and tchar.h is included directly or indirectly when building the pre-compiled header. However, because _UNICODE is not defined at that stage, _T resolves to nothing in the pre-compiled header and stays that way across all compilation units. You can solve this by either defining _UNICODE at the top of the header file that builds the pre-compiled header (e.g. stdafx.h) or by defining _UNICODE at a project level (e.g. in VS6: Project | Settings | C/C++ | General | Preprocessor definitions).

2. You have a header file that is included after tchar.h (possibly in the pre-compiled header) that overrides the _T macro definition. In VS6, if you have browser information available, you can jump to the definition of _T and examine it (press F12 while the insertion cursor is on _T in your source code).

3. You have a compiler switch that tells your compiler to treat "text" as Unicode text, and L"text" as SBCS/MBCS/ANSI/ASCII text (I'm not aware of such a switch). You can easily test if this is the case by replacing _T("data/face2.avi") with "data/face2.avi".
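
Whichever of these applies, a quick way to see what the failing translation unit actually gets is a #pragma message probe (MSVC-specific; my suggestion, not something from the thread):

// Drop this near the top of the failing .cpp and check the build output:
#if defined(_UNICODE)
#pragma message("_UNICODE is defined here")
#else
#pragma message("_UNICODE is NOT defined here")
#endif
#if defined(UNICODE)
#pragma message("UNICODE is defined here")
#else
#pragma message("UNICODE is NOT defined here")
#endif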


Good luck,
Radu
 
EVelascoAuthor Commented:
_UNICODE is automatically defined during compilation. It worked when I removed the _UNICODE definition, but I think I will want _UNICODE defined, so I will try adding it to the preprocessor definitions instead. I will add more points for your time and help. The instructions you posted, "Project | Settings | C/C++ | General | Preprocessor definitions", are for VS6.

Any idea for Visual Studio?
If not, it's okay; you've already helped a lot. Thanks.
 
rcarlanCommented:
>>any idea for Visual Studio?

I don't understand the question.

Radu
 
CaxiCommented:
I'm guessing he is looking for the equivalent setting when it isn't Visual Studio 6 but something a bit newer, like 2005, 2008, or 2010.

The character set setting is located at:

Project | Properties | General | Character Set. Choose "Use Unicode Character Set" instead of the default "Use Multi-Byte Character Set".
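
For what it's worth (my addition): in these newer versions, selecting "Use Unicode Character Set" causes the project system to define both UNICODE and _UNICODE automatically, so no manual preprocessor entry is needed.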