Text field limit

hawkinsc asked
Medium Priority
Last Modified: 2008-02-01
I have a database with a long text field.  I am putting NT event logs into a text field using BCP after using Somarsoft DUMPEVT.  The text field is truncating the input at 256 characters.  How do I get the text field to be larger than 256 characters and take all the BCP input?

Victor Spiridonov, Database Director

It is not possible to load text data longer than 255 bytes using bcp (see MS Knowledge Base article Q101607). Try specifying CHARACTER mode in the bcp format file and see if it works.
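For reference, a character-mode format file for a nine-column load in SQL Server 6.5 might look something like the sketch below. The table and column names here are hypothetical (chosen to match the nine fields of the DUMPEVT output), and the exact layout (host field order, host datatype, prefix length, host data length, terminator, server column order, server column name) should be checked against the bcp documentation for your version:

```text
6.0
9
1   SQLCHAR   0   255   ","      1   log_type
2   SQLCHAR   0   255   ","      2   event_time
3   SQLCHAR   0   255   ","      3   source
4   SQLCHAR   0   255   ","      4   event_id
5   SQLCHAR   0   255   ","      5   event_type
6   SQLCHAR   0   255   ","      6   category
7   SQLCHAR   0   255   ","      7   user_name
8   SQLCHAR   0   255   ","      8   computer
9   SQLCHAR   0   0     "\r\n"   9   message
```

It would then be passed to bcp with the /f option when loading the file into the table.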


I know about Q101607.  For 500 points, I want someone to give me a working example of how to do this, not to tell me that it doesn't work.
Victor Spiridonov, Database Director

Give me an example of data you are trying to input and I'll try to help you with format file.

It is not only bcp that enforces a 255-character limit on inserts. Even if you have the target column defined as a memo field, and even if you format an SQL INSERT that contains your text info, SQL Server will still truncate your input to 255 characters. So you can't use a plain text field; you must use a memo field. That causes other problems for you on the output side, since you can't use normal SQL queries on the memo data: WHERE ... LIKE selections don't work on memo fields, which are treated more like BLOBs than text fields.

So, you have two choices. You can elect to preprocess your input and break it down into multiple rows, each row having 255 or fewer characters and identical keys. At least this way you can still use BCP, and you can query on the text info.
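That chunking approach could be sketched in Python (a hypothetical helper, not part of any tool mentioned in this thread; the key fields and a sequence number are repeated on every row so the pieces stay associated and ordered):

```python
def split_into_rows(key_fields, long_text, chunk_size=255):
    """Break one logical record into multiple rows whose text portion
    is at most chunk_size characters.  Each output row repeats the key
    fields and adds a 1-based sequence number so the chunks can be
    reassembled (or LIKE-queried) later."""
    chunks = [long_text[i:i + chunk_size]
              for i in range(0, len(long_text), chunk_size)] or [""]
    return [key_fields + [seq, chunk] for seq, chunk in enumerate(chunks, 1)]
```

A 600-character message would become three rows of 255, 255, and 90 characters, each carrying the same key fields.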

Or, you can write a custom insert program, updating all other columns normally and using the .AppendChunk method on your long text field. The problem with this is that it runs slower, and it uses your transaction log as well, which BCP avoids.


Below is a sample line.  The ninth field is the text field that is being truncated at 256 characters. All others work well.

SEC,9/26/97 10:45:48,Security,578,Failure,Privilege Use ,Domain\Administrator,Computername,Privileged object operation:^`       Object Server:      Security^`       Object Handle:      4294967295^`       Process ID:      2155271744^`       Primary User Name:      SYSTEM^`       Primary Domain:      NT AUTHORITY^`       Primary Logon ID:      (0x0 0x3E7)^`       Client User Name:      Administrator^`       Client Domain:      Domain^`       Client Logon ID:      (0x0 0x72966)^`       Privileges:      SeIncreaseBasePriorityPrivilege


Not trying to be a jerk. The table doesn't have a primary key; one isn't necessary for our purposes. I need to be able to do LIKE queries on the text field, and I would need the ninth field to stay associated with the other data. It seems like the AppendChunk method might work as long as I can still do LIKE queries. Running slower and using the transaction log are not a problem.


To cymbolic - does the AppendChunk method still allow me to do LIKE queries on the text field? If so, give me more information on this method. I can't find it in SQL Books Online.

No, the LIKE operator will not work on page-based data such as memo fields. In order to use LIKE and the other normal WHERE-clause comparisons, you need to use a char or varchar type field.

Based upon the regularity of your input data, I would write a simple text preprocessor to break your long text field down into its named components (Primary User Name, etc.), removing the headers and naming the columns by type, then bcp in the actual data only. That makes your resultant database more usable anyway, since plugging multiple pieces of information into a single field is one of the cardinal sins of normalization, and will make the database purists puke!


On a business trip until today. I think we are getting really close. The data is just a dump from the NT event logs; the ninth field is the event log message, and there really is no regularity. The earlier sample is just an example of more than 256 characters.

It sounds like preprocessing the text file may work. The text file is comma-delimited with nine fields, and each row is terminated with a line return. I would imagine we need to write a program that inserts a comma after the 255th character in the ninth field to make a tenth field; I have not encountered many fields greater than 512 characters. If there were fewer than 255 characters, we would just insert a comma at the end of the row to keep the BCP columns in order. We could probably dump the columns into a temporary table. Is there then a way we could rejoin the two (or maybe three) columns into a text field within SQL from the temporary table and append the rows to the permanent table?
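The comma-insertion step described above could be sketched in Python (hypothetical helper, not an existing tool; it assumes the ninth field is everything after the eighth comma, and that shorter messages get an empty tenth field so every row keeps the same column count):

```python
def preprocess_line(line, max_len=255):
    """Split a comma-delimited event-log line so the ninth field never
    exceeds max_len characters.  The message is taken as everything
    after the eighth comma; the overflow (possibly empty) becomes a
    tenth field, keeping the BCP column count constant."""
    parts = line.rstrip("\r\n").split(",", 8)   # at most 9 pieces
    message = parts[8] if len(parts) == 9 else ""
    return ",".join(parts[:8] + [message[:max_len], message[max_len:]])
```

A 300-character message comes out as a 255-character ninth field plus a 45-character tenth field; a short message just gains a trailing empty column.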
Below is a cut-and-paste of the SQL Server 6.5 Books Online page explaining the datatype abbreviations in BCP...
The file storage type describes how data is stored in the file. You can copy data into a file as its database table type, as a character string in ASCII format, or as any datatype for which implicit conversion is supported. User-defined datatypes are copied as their base types. The following table shows the storage types that can be used with bcp and the abbreviations (if any) that can be entered at the command prompt.
For the most compact storage, use the default; for ASCII files, use char.
Storage type     Abbreviation
char             c[har]
varchar          c[har]
text             T[ext]
binary           x
varbinary        x
image            I[mage]
datetime         d[ate] *
smalldatetime    D
decimal          n
numeric          n
float            f[loat] *
real             r
int              i[nt] *
smallint         s[mallint] *
tinyint          t[inyint] *
money            m[oney] *
smallmoney       M
bit              b[it]
timestamp        *

* Length is dependent on the data. For example, an int of 23 has a storage length of 2 bytes plus the prefix and terminator (if any); an int of 5,238,876 has a storage length of 7 bytes plus the prefix and terminator (if any).

Specifying a type other than char, or a datatype that represents an invalid implicit conversion, causes bcp to fail. For example, you cannot use smallint for int data (you will get overflow errors), but you can use int for smallint.
When noncharacter datatypes such as float, money, and numeric are stored as their database types, the data is written to the file in the operating-system data representation format rather than in human-readable form.
Just try to set the data transfer from table to file using the native format, and not the char format, giving the bcp format file the right options....
Let me know about the results...
C ya man


I'm not going from table to file, but from file to table.  The file going in is definitely in character format.

500 points! Wow!
This is not an answer. However, for $500 I can write you a tool that imports an NT event log into a SQL ODBC datasource... but I guess this kind of tool already exists as $20 shareware :)


Adjusted points to 520


So tell me where the shareware that will import an NT event log is and how to make it work.
Hawkinsc, email a sample of your file to tnorris@hti.net and I'll whip up something to do the upload to SQL Server. Should not be a problem. An alternative is to use MS Excel: import the file as a .CSV, manually "split" the ninth field into two separate fields, then export (via Excel with an ODBC connection, or MS Access) into the SQL Server table.

How often are you wanting to "upload" the event logs? Daily? or every time a new event is recorded?

Good luck with your project!

BTW: Cymbolic is definitely on the right track, I concur with his conclusions :)

Preprocessing the file for load with an ASCII file parser is an easy task, and would run very fast. I really think that with some examination of the input file, you will see enough regularity in your text field content to assign a number of identified columns for the repeating information types within your long text field, which you can load as separate columns in your SQL table, placing the remainder of the text into a variable-length text field.
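That kind of parser could be sketched in Python, splitting the DUMPEVT message on its "^`" sub-delimiter (a hypothetical sketch based on the sample line earlier in the thread; real logs may need more cases):

```python
def parse_message(message):
    """Split a DUMPEVT-style message on the '^`' sub-delimiter and pull
    out 'Name: value' pairs into a dict.  Pieces without a colon are
    collected into a leftover remainder string."""
    fields, leftover = {}, []
    for piece in message.split("^`"):
        piece = piece.strip()
        if not piece:
            continue
        name, sep, value = piece.partition(":")
        if sep:                       # looks like 'Name: value'
            fields[name.strip()] = value.strip()
        else:                         # free text with no colon
            leftover.append(piece)
    return fields, " ".join(leftover)
```

Running it on the sample line's ninth field would yield named columns such as "Object Server", "Primary User Name", and "Privileges", ready to bcp in as separate columns.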

I wouldn't mind looking at a sample either (cymbolic@aol.com), and I really don't care who gets the points.



I have 900 points that we can split between the two of you.  I'll send you a copy of the text file that I sent Pegasus.  
After looking at your logs, I'm thinking that it would be very easy to enhance the preprocessing step to count and separate items of critical interest, such as failed logon attempts. We could do this by using an external file of phrases of interest, with one-line entries that are read by the preprocessing program and scanned for in all the data that is processed. That way you could modify what you are looking for as new items of interest appear, without having to make program changes. You would get an immediate "heads up" on problem entries, and we could even save the summarized item counts for time-series analysis, to pick up things like sudden or gradual upswings in failed logon attempts. If interested, email: cymbolic@aol.com
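The phrase-of-interest scan could look something like this Python sketch (the file handling for the external phrase list is omitted; phrases are passed in directly, and matching is a simple case-insensitive substring test):

```python
def count_phrases(lines, phrases):
    """For each phrase of interest, count how many log lines contain it
    (case-insensitive substring match).  The resulting counts could be
    saved per run for time-series analysis of trends."""
    counts = {p: 0 for p in phrases}
    for line in lines:
        low = line.lower()
        for p in phrases:
            if p.lower() in low:
                counts[p] += 1
    return counts
```

Because the phrase list lives outside the program, adding a new item of interest is just a one-line edit to that file, with no code changes.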
