Solved

Text field limit

Posted on 1997-12-04
577 Views
Last Modified: 2008-02-01
I have a database with a long text field.  I am putting NT event logs into a text field using BCP after using Somarsoft DUMPEVT.  The text field is truncating the input at 256 characters.  How do I get the text field to be larger than 256 characters and take all the BCP input?
Question by:hawkinsc
18 Comments
 
LVL 7

Expert Comment

by:spiridonov
ID: 1089476
It is not possible to load data of type text using bcp if it is more than 255 bytes long (see MS Knowledge Base article Q101607). Try specifying CHARACTER mode in the bcp format file
and see if it works.
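
For reference, a rough sketch of the kind of character-mode format file being suggested, written out and fed to bcp by a small Python wrapper. This is purely illustrative: the table name (eventdb..EventLog), column names, field widths, server, and login are made-up placeholders, not anything confirmed in this thread.
-----------------------------------------------------------------
import subprocess
from pathlib import Path

# Hypothetical 9-column character-mode format file (SQL Server 6.5 syntax).
# The ninth column gets a large host-file length and the row terminator.
FORMAT_FILE = """\
6.0
9
1   SQLCHAR   0   12     ","        1   LogName
2   SQLCHAR   0   24     ","        2   EventTime
3   SQLCHAR   0   32     ","        3   SourceName
4   SQLCHAR   0   8      ","        4   EventId
5   SQLCHAR   0   16     ","        5   EventType
6   SQLCHAR   0   32     ","        6   Category
7   SQLCHAR   0   64     ","        7   UserName
8   SQLCHAR   0   64     ","        8   ComputerName
9   SQLCHAR   0   8000   "\\r\\n"   9   EventMessage
"""

def bcp_in(data_file, fmt_path="evtlog.fmt"):
    Path(fmt_path).write_text(FORMAT_FILE)
    # shell out to bcp with the format file; connection details are placeholders
    subprocess.run(
        ["bcp", "eventdb..EventLog", "in", data_file,
         "-f", fmt_path, "-S", "SERVER", "-U", "sa", "-P", ""],
        check=True,
    )
-----------------------------------------------------------------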
 

Author Comment

by:hawkinsc
ID: 1089477
I know about Q101607.  For 500 points, I want someone to give me a working example of how to do this, not to tell me that it doesn't work.
 
LVL 7

Expert Comment

by:spiridonov
ID: 1089478
Give me an example of the data you are trying to input and I'll try to help you with the format file.
 
LVL 9

Expert Comment

by:cymbolic
ID: 1089479
It is not only bcp that enforces a 255 limit on inserts.  Even if you have the target column defined as a memo field, and even if you format an SQL insert that contains your text info, SQL Server will still truncate your input to 255 characters.  So you can't use a text field, you must use a memo field.  This causes other problems for you on the output side, since you can't use normal SQL queries on the memo data.  That is, WHERE ... LIKE selections don't work on memo fields; they are treated more like BLOBs than text fields.

So you have two choices.  You can elect to preprocess your input and break it down into multiple rows, each row having 255 or fewer characters and identical keys.  At least this way you can still use BCP, and you can query on the text info.

Or you can write a custom insert program, updating all other columns normally and using the .AppendChunk method on your long text field.  The problem with this is that it runs slower and uses your transaction log as well, which BCP avoids.
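
A minimal sketch of the first option (assuming a comma-delimited file with the long message in the ninth field, which is what later comments describe): break that field into pieces of at most 255 characters and repeat the leading columns on every piece. The file names, the chunk size constant, and the added sequence column are illustrative assumptions only.
-----------------------------------------------------------------
CHUNK = 255  # maximum characters kept in any one output row

def split_rows(infile="dumpevt.csv", outfile="dumpevt_split.csv"):
    """Emit one output row per <=255-character piece of the ninth field."""
    with open(infile) as src, open(outfile, "w") as dst:
        for line in src:
            fields = line.rstrip("\r\n").split(",", 8)  # ninth field keeps any embedded commas
            head = fields[:8]
            message = fields[8] if len(fields) > 8 else ""
            pieces = [message[i:i + CHUNK] for i in range(0, len(message), CHUNK)] or [""]
            for seq, piece in enumerate(pieces, start=1):
                # the sequence number keeps the pieces in order for reassembly
                dst.write(",".join(head + [str(seq), piece]) + "\n")

if __name__ == "__main__":
    split_rows()
-----------------------------------------------------------------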
 

Author Comment

by:hawkinsc
ID: 1089480
Below is a sample line.  The ninth field is the text field that is being truncated at 256 characters. All others work well.

SEC,9/26/97 10:45:48,Security,578,Failure,Privilege Use ,Domain\Administrator,Computername,Privileged object operation:^`       Object Server:      Security^`       Object Handle:      4294967295^`       Process ID:      2155271744^`       Primary User Name:      SYSTEM^`       Primary Domain:      NT AUTHORITY^`       Primary Logon ID:      (0x0 0x3E7)^`       Client User Name:      Administrator^`       Client Domain:      Domain^`       Client Logon ID:      (0x0 0x72966)^`       Privileges:      SeIncreaseBasePriorityPrivilege

 

Author Comment

by:hawkinsc
ID: 1089481
Not trying to be a jerk.  The table doesn't have a primary key.  Not necessary for our purposes.  I need to be able to do LIKE queries on the text field.  I would need the ninth field to be associated with the other data.  It seems like the AppendChunk method might work as long as I can LIKE query.  Running slower and transaction logs are not a problem.
 

Author Comment

by:hawkinsc
ID: 1089482
To cymbolic - does the AppendChunk method still allow me to LIKE query the text field?  If so, give me more information on this method.  I can't find it in SQL Books Online.
 
LVL 9

Expert Comment

by:cymbolic
ID: 1089483
No, the LIKE modifier will not work on page-based data such as memo fields.  In order to use LIKE and other normal WHERE clause comparisons, you need to use a text or varchar type field.

Based upon the regularity of your input data, I would write a simple text preprocessor to break down your long text field into its named components (Primary User Name, etc.), removing the headers and naming the columns as per type, then bcp in the actual data only.  It makes your resultant database more usable anyway, since plugging multiple pieces of information into a single field is one of the cardinal sins of normalization and will make the database purists puke!
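
Something along these lines is presumably what is meant; a hedged sketch (in modern Python, purely illustrative) that splits the DUMPEVT message on its "^`" separators and promotes a few named headers to their own columns. The header list, file names, and column layout are all arbitrary choices for the example.
-----------------------------------------------------------------
# Headers to promote to their own columns; purely an example list.
HEADERS = ["Primary User Name", "Primary Domain", "Client User Name",
           "Client Domain", "Privileges"]

def parse_message(message):
    """Collect 'Header: value' pairs from the '^`'-separated message text."""
    pairs = {}
    for part in message.split("^`"):
        key, sep, value = part.partition(":")
        if sep:
            pairs[key.strip()] = value.strip()
    return pairs

def preprocess(infile="dumpevt.csv", outfile="dumpevt_cols.csv"):
    with open(infile) as src, open(outfile, "w") as dst:
        for line in src:
            fields = line.rstrip("\r\n").split(",", 8)
            head = fields[:8]
            message = fields[8] if len(fields) > 8 else ""
            pairs = parse_message(message)
            # keep the first eight columns, then one column per extracted header
            dst.write(",".join(head + [pairs.get(h, "") for h in HEADERS]) + "\n")
-----------------------------------------------------------------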
 

Author Comment

by:hawkinsc
ID: 1089484
I was on a business trip until today.  I think we are getting real close.  The data is just a dump from the NT event logs, and the ninth field is the event log message.  There really is no regularity; the earlier sample is just an example of more than 256 characters.  It sounds like pre-processing the text file may work.  The text file is a comma-delimited file with nine fields, and each row is terminated with a line return.  I would imagine we need to write a program that would insert a comma after the 255th character in the ninth field to make a tenth field.  I have not encountered many fields greater than 512 characters.  If there were fewer than 255 characters, we would just insert a comma at the end of the row to keep the BCP columns in order.  We could probably dump the columns into a temporary table.  Is there then a way we could rejoin the two - or maybe three - columns into a text field within SQL from the temporary table and append the rows to the permanent table?
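
If that split-then-rejoin route is taken, the rejoin could look roughly like the following sketch; the staging and permanent table names, column names, the DSN, and the use of pyodbc are all assumptions made up for illustration (any client that can send the INSERT ... SELECT would do), and concatenation limits on older SQL Server versions would still need checking.
-----------------------------------------------------------------
import pyodbc

# Copy rows from the BCP-loaded staging table into the permanent table,
# gluing the two message halves back together.  All names are hypothetical.
REJOIN_SQL = """
INSERT INTO EventLog (LogName, EventTime, SourceName, EventId, EventType,
                      Category, UserName, ComputerName, EventMessage)
SELECT LogName, EventTime, SourceName, EventId, EventType,
       Category, UserName, ComputerName,
       MessagePart1 + MessagePart2        -- reassembled ninth field
FROM EventLogStaging
"""

def rejoin(dsn="DSN=eventdb"):
    conn = pyodbc.connect(dsn, autocommit=True)
    try:
        conn.execute(REJOIN_SQL)
    finally:
        conn.close()
-----------------------------------------------------------------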
 
LVL 2

Expert Comment

by:Wiz_of_Oz
ID: 1089485
Below is a cut-and-paste of the page from the MS SQL Server 6.5 Books Online explaining the datatype abbreviations in BCP...
-----------------------------------------------------------------
The file storage type describes how data is stored in the file. You can copy data into a file as its database table type, as a character string in ASCII format, or as any datatype for which implicit conversion is supported for the datatype. User-defined datatypes are copied as their base types. The following table shows the storage types that can be used with bcp and the abbreviations (if any) that can be entered at the command prompt.
For the most compact storage, use the default; for ASCII files, use char.
      
Storage type     Abbreviation
char             c[har]
varchar          c[har]
text             T[ext]
binary           x
varbinary        x
image            I[mage]
datetime         d[ate] *
smalldatetime    D
decimal          n
numeric          n
float            f[loat] *
real             r
int              i[nt] *
smallint         s[mallint] *
tinyint          t[inyint] *
money            m[oney] *
smallmoney       M
bit              b[it]
timestamp        *

Length is dependent on the data.  For example, an int of 23 has a storage length of 2 bytes plus the prefix and terminator (if any); an int of 5,238,876 has a storage length of 7 bytes plus the prefix and terminator (if any).

Giving a type other than char or a datatype that represents an illegitimate implicit conversion causes bcp to fail. For example, you cannot use smallint for int data (you will get overflow errors), but you can use int for smallint.
When noncharacter datatypes (such as float, money, and numeric) are stored as their database types, the data is written to the file in the operating-system data representation format, rather than in human-readable form.
-----------------------------------------------------------------
Just try to set the data transfer from table to file using the native format, and not the char format, giving the bcp format file the right options....
Let me know about the results...
C ya man
 

Author Comment

by:hawkinsc
ID: 1089486
I'm not going from table to file, but from file to table.  The file going in is definitely in character format.
 

Expert Comment

by:TMS
ID: 1089487
500 points!  Wow!
This is not an answer.  However, for $500 I can write you a tool that imports an NT event log into a SQL ODBC data source... but I guess this kind of tool already exists as $20 shareware :)
 

Author Comment

by:hawkinsc
ID: 1089488
Adjusted points to 520
 

Author Comment

by:hawkinsc
ID: 1089489
So tell me where the shareware that will import an NT event log is and how to make it work.
 
LVL 3

Expert Comment

by:Pegasus100397
ID: 1089490
Hawkinsc, email a sample of your file to tnorris@hti.net and I'll whip up something to do the upload to SQL Server.  It should not be a problem.  An alternative is to use MS Excel: import the file as a .CSV, manually "split" the ninth field into two separate fields, then export (via Excel with an ODBC connection, or MS Access) into the SQL Server table.

How often are you wanting to "upload" the event logs?  Daily?  Or every time a new event is recorded?

Good luck with your project!
Pegasus

BTW: Cymbolic is definitely on the right track, I concur with his conclusions :)
 
LVL 9

Expert Comment

by:cymbolic
ID: 1089491
Preprocessing the file for load with an ASCII file parser is an easy task, and it would run very fast.  I really think that with some examination of the input file, you will see enough regularity in your text field content to assign a number of identified columns for the repeating information types within your long text field, load those as separate columns in your SQL table, and place the remainder of the text into a variable-length text field.

I wouldn't mind looking at a sample either (cymbolic@aol.com), and I really don't care who gets the points.

 
 

Author Comment

by:hawkinsc
ID: 1089492
I have 900 points that we can split between the two of you.  I'll send you a copy of the text file that I sent Pegasus.  
 
LVL 9

Accepted Solution

by:cymbolic (earned 520 total points)
ID: 1089493
After looking at your logs, I'm thinking it would be very easy to enhance the preprocessing step to count and separate items of critical interest, such as failed logon attempts.  We could do this by using an external file of phrases of interest, with one-line entries that are read by the preprocess program and scanned for in all the data that is processed.  That way you could modify what you are looking for as new items of interest appear, without having to make program changes.  You would get an immediate "heads up" on problem entries, and we could even save the summarized item counts for time-series analysis to pick up things like sudden or gradual upswings in failed logon attempts.  If interested, email: cymbolic@aol.com
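
A small sketch of that idea, assuming the phrase list lives in a plain text file with one phrase per line; the file names and the output format are made up for illustration.
-----------------------------------------------------------------
from collections import Counter

def scan(datafile="dumpevt.csv", phrasefile="phrases.txt"):
    """Count how many log lines mention each phrase of interest."""
    with open(phrasefile) as pf:
        phrases = [p.strip() for p in pf if p.strip()]
    counts = Counter()
    with open(datafile) as src:
        for line in src:
            lowered = line.lower()
            for phrase in phrases:
                if phrase.lower() in lowered:
                    counts[phrase] += 1
    return counts

if __name__ == "__main__":
    # e.g. put "failed logon" or "Failure Audit" in phrases.txt and run daily
    for phrase, n in scan().most_common():
        print(f"{n:6d}  {phrase}")
-----------------------------------------------------------------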