Please consider the following code, part of a custom CMyLog class that lets output be written to a file like this:

```cpp
log.Printf ( _T("This is a string with an %s"), _T("argument") );
```
```cpp
BOOL CMyLog::Printf( LPCTSTR lpszFormatString, ... )
{
    TCHAR szBuffer[ 1024 ];
    int nCharsWritten = 0;

    if ( m_fp )
    {
        // Deal with varying arguments
        va_list args;   // Our variable argument list
        va_start ( args, lpszFormatString );
        _vsntprintf_s ( szBuffer, 1024, lpszFormatString, args );
        va_end ( args );

        nCharsWritten = _ftprintf( m_fp, szBuffer );
        fflush( m_fp ); // Output data immediately
    }

    ASSERT ( nCharsWritten >= 0 );
    return nCharsWritten >= 0;  // -ve value returned on failure
}
```
The problem is that the current implementation allocates 1024 * sizeof(TCHAR) bytes on the stack every time I call it. This may be enough, or it may not; I am more concerned about the latter. If the formatted output needs more space than the buffer provides, the code falls over.
Now, obviously I could get the length of the format string lpszFormatString with _tcslen and allocate based on that, but how on earth can I get the length of the formatted result once the varying arguments in `args` are substituted in? Is this possible?
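You cannot inspect a `va_list` directly, but the CRT can measure the formatted length for you: `_vsctprintf` (the generic-text mapping of `_vscprintf`/`_vscwprintf`) returns the number of characters the formatted string would occupy, without writing anything, so you can allocate exactly that much plus one for the terminator. Below is a minimal, portable sketch of the same measure-then-allocate pattern using the standard `vsnprintf(nullptr, 0, ...)` idiom (C99/C++11); the helper name `FormatToString` is illustrative and not part of CMyLog:

```cpp
#include <cstdarg>
#include <cstdio>
#include <string>
#include <vector>

// Measure-then-allocate: vsnprintf called with a null buffer and size 0
// returns the number of characters the formatted output would need.
std::string FormatToString( const char* fmt, ... )
{
    va_list args;
    va_start( args, fmt );

    va_list argsCopy;        // the first vsnprintf consumes the list, so copy it
    va_copy( argsCopy, args );

    int needed = std::vsnprintf( nullptr, 0, fmt, args );
    va_end( args );

    if ( needed < 0 )        // encoding/format error
    {
        va_end( argsCopy );
        return std::string();
    }

    std::vector<char> buf( static_cast<size_t>( needed ) + 1 ); // +1 for '\0'
    std::vsnprintf( buf.data(), buf.size(), fmt, argsCopy );
    va_end( argsCopy );

    return std::string( buf.data(), static_cast<size_t>( needed ) );
}
```

On Windows with the TCHAR build, the measuring call would be `_vsctprintf( lpszFormatString, args )` and the buffer would be `needed + 1` TCHARs; the rest of the pattern is identical, and it removes both the fixed 1024-character limit and the truncation failure.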