Is there anything fundamentally or philosophically wrong with using _bstr_t as a function parameter rather than LPTSTR (or LPCTSTR)? I notice that the Unicode Platform SDK functions all take LPTSTR (or LPCTSTR) parameters, never _bstr_t, yet from the caller's side it tends to make little difference; the main difference is in how the function itself deals with the passed-in string.
I guess that with _bstr_t the caller can pass in a TCHAR*, LPTSTR, LPCTSTR, char*, or wchar_t* and it will still work fine, because _bstr_t has converting constructors for both narrow and wide strings. But with LPTSTR (or LPCTSTR), the caller must pass a TCHAR-based type (e.g., LPTSTR, TCHAR*, _T("")).
Any ideas on what the established convention is (if there is one), or why one way is better than the other?