
Solved

500,000 inserts from .sql run slowly

Posted on 2005-04-07
Medium Priority
250 Views
Last Modified: 2008-03-10
I'm importing about 500,000 records in total from a .sql text file (into 3 tables), but it is taking forever (too many hours to wait out). Any advice on how I can make it faster?

This is the start of my query, and below is the code for one of the three tables:

SET NOCOUNT ON

Print 'Disabling all Constraints'
exec sp_msforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL'

Print 'Disabling all Triggers'
exec sp_msforeachtable 'ALTER TABLE ? DISABLE TRIGGER ALL'

PRINT 'Deleting from table: [dbo].[Parttype]'
DELETE FROM [dbo].[Parttype]

PRINT 'Deleting from table: [dbo].[OrderItem]'
DELETE FROM [dbo].[OrderItem]

PRINT 'Deleting from table: [dbo].[article]'
DELETE FROM [dbo].[article]

/* Insert scripts for table: [dbo].[article] */
PRINT 'Inserting rows into table: [dbo].[article]'

--SET IDENTITY_INSERT [dbo].[article] ON

INSERT INTO [dbo].[article] ([Id], [Brand], [TypeId], [CategoryId], [InCatalog], [SalesPrice], [CostPrice], [VATCategoryId], [StockLocationId], [Comments], [SupplierId], [ImagePath], [Number]) VALUES (176228, NULL, 3, NULL, 0, 9.83, 5.7997, NULL, NULL, 'BUMPER-BOTTOM YOK', NULL, NULL, '11288001')



table code:

CREATE TABLE [Article] (
  [Id] int NOT NULL,
  [Brand] nvarchar(100) COLLATE SQL_Latin1_General_CP1_CI_AS,
  [TypeId] int,
  [CategoryId] int,
  [InCatalog] bit,
  [SalesPrice] money,
  [CostPrice] money,
  [VATCategoryId] int,
  [StockLocationId] int,
  [Comments] nvarchar(1000) COLLATE SQL_Latin1_General_CP1_CI_AS,
  [SupplierId] int,
  [ImagePath] varchar(500) COLLATE SQL_Latin1_General_CP1_CI_AS,
  [Number] varchar(1000) COLLATE SQL_Latin1_General_CP1_CI_AS,
  CONSTRAINT [PK_Article] PRIMARY KEY ([Id])
)
ON [PRIMARY]
GO
Question by:bertstevens
13 Comments
 
LVL 11

Expert Comment

by:pcsentinel
ID: 13727745
Check out BULK INSERT instead; it's much faster.

regards
 

Author Comment

by:bertstevens
ID: 13727804
For 250 points I hope to get a bit more explanation... I only have the .sql file with inserts... how do I proceed from there?
 
LVL 23

Expert Comment

by:adathelad
ID: 13727818
You can use TRUNCATE TABLE <TableName> instead of DELETE to clear out the tables - this is quicker because individual row deletions are not logged.
You could also remove any indexes on the tables while doing the INSERT so that they aren't constantly being updated.

Are you saying you have 500,000 individual INSERT statements?

If you had all the data stored as a CSV file, for example, the import process would be a lot quicker, as you could use DTS.

 

Author Comment

by:bertstevens
ID: 13727883
Yes, I have 500,000 individual INSERT statements (that's the way I receive the data). I could try temporarily dropping the index, but would that help a lot? Clearing out the tables is not the problem.
 
LVL 23

Accepted Solution

by:
adathelad earned 350 total points
ID: 13728001
It would be worth removing the indexes during the import - SQL then won't have to keep maintaining them. With this number of records, it should make a noticeable difference.

Other than that, I can't see any other way to speed this up as you already have the triggers disabled which would have been my other suggestion.
But if you could get the data supplied in another format (e.g. CSV), that would be 1000 times better. They must be generating this .sql script dynamically, surely?!

Or, for quite a bit of work, you could write an application that takes the .sql script and strips all the data out into CSV format and import the CSV file. Not an ideal solution but thought I'd suggest it anyway.
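A minimal sketch of such a converter in Python. The file names are placeholders, and it assumes every data row is a single-line `INSERT ... VALUES (...)` statement like the ones in the question:

```python
import csv
import re

# Matches a single-line "INSERT INTO ... VALUES (...)" statement and
# captures the value list between the parentheses.
VALUES_RE = re.compile(r"VALUES\s*\((.*)\)\s*$", re.IGNORECASE)

def split_values(raw):
    """Split a VALUES list on commas that sit outside single-quoted strings."""
    fields, current, in_quotes = [], [], False
    i = 0
    while i < len(raw):
        ch = raw[i]
        if ch == "'":
            # A doubled quote inside a string is an escaped quote, not a close.
            if in_quotes and i + 1 < len(raw) and raw[i + 1] == "'":
                current.append("'")
                i += 2
                continue
            in_quotes = not in_quotes  # quote characters themselves are dropped
        elif ch == "," and not in_quotes:
            fields.append("".join(current).strip())
            current = []
        else:
            current.append(ch)
        i += 1
    fields.append("".join(current).strip())
    # Represent SQL NULL as an empty field so the bulk load yields NULL.
    return ["" if f.upper() == "NULL" else f for f in fields]

def sql_to_csv(sql_path, csv_path):
    """Strip the INSERT boilerplate from each line and emit plain CSV."""
    with open(sql_path) as src, open(csv_path, "w", newline="") as dst:
        writer = csv.writer(dst)
        for line in src:
            m = VALUES_RE.search(line)
            if m:
                writer.writerow(split_values(m.group(1)))
```

The fiddly part is the comma split: a naive `split(',')` would break on text values like `'list,of,words'`, so the sketch walks the string and tracks whether it is inside a quoted value.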
 
LVL 11

Expert Comment

by:pcsentinel
ID: 13728063
e.g.

BULK INSERT Northwind.dbo.[Order Details]
   FROM 'f:\orders\lineitem.tbl'
   WITH
      (
         FIELDTERMINATOR = '|',
         ROWTERMINATOR = '|\n'
      )

the full list of options is quite intricate, but fully explained at

http://msdn.microsoft.com/library/default.asp?url=/library/en-us/tsqlref/ts_ba-bz_4fec.asp

regards

 

Author Comment

by:bertstevens
ID: 13728130
I tried removing the index; this improves performance by about 15%. Of course I can't remove the index in the production DB. Can I disable it during the import?


pcsentinel: I can try your solution, but then I need something like FIELDTERMINATOR, plus something to ignore the INSERT part. So the 'INSERT INTO [dbo].[article] ([Id], [Brand], [TypeId], [CategoryId], [InCatalog], [SalesPrice], [CostPrice], [VATCategoryId], [StockLocationId], [Comments], [SupplierId], [ImagePath], [Number]) VALUES (' part needs to be ignored, and also the ')' at the end.
 
LVL 11

Expert Comment

by:pcsentinel
ID: 13728258
Is it possible to modify the input file?
 

Author Comment

by:bertstevens
ID: 13728289
Yes... I modified the input file, and now for the first table the rows look like:

177338, NULL, 3, NULL, 0, 11360.48, 6702.6832, NULL, NULL, 'PH FULL 150 DFI', NULL, NULL, '61-892944A02'

Is this usable? Or do I need to remove the ' signs in the rows?

BULK INSERT Naut.dbo.[Article]
   FROM 'c:\articles.txt'
   WITH
      (
         FIELDTERMINATOR = ',',
         ROWTERMINATOR = '\n'
      )
 

Author Comment

by:bertstevens
ID: 13728332
It complains: 'Bulk insert data conversion error (type mismatch) for row 1, column 4 (CategoryId).' It's expecting an INT, although the value is clearly an int?


BULK INSERT Naut.dbo.[Article]
   FROM 'c:\articles.txt'
   WITH
      (
         FIELDTERMINATOR = ',',
         ROWTERMINATOR = '\n'
      )
 

Author Comment

by:bertstevens
ID: 13728488
OK, problem solved by changing the input file and then using bulk insert as above. The type mismatches were caused by text in fields where numbers should go.
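For anyone hitting the same type-mismatch error, a quick way to locate the offending rows before loading, sketched in Python (the file name and the zero-based column index are assumptions):

```python
import csv

def find_bad_rows(csv_path, column, expect=int):
    """Return (line_number, value) pairs where the column fails to parse.

    Empty fields are skipped because they stand in for NULL.
    """
    bad = []
    with open(csv_path, newline="") as f:
        for lineno, row in enumerate(csv.reader(f), start=1):
            value = row[column].strip()
            if not value:
                continue
            try:
                expect(value)
            except ValueError:
                bad.append((lineno, value))
    return bad
```

Running this over the bulk file with, say, `column=3` for CategoryId prints exactly which lines carry text where a number should go, which beats re-running BULK INSERT until the errors stop.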
 

Author Comment

by:bertstevens
ID: 13728603
I still have 2 problems:

- NULL isn't recognized; is it possible to make BULK INSERT treat it as NULL instead of replacing it with 0?

- Some of the fields contain the ',' sign, but that's also the field delimiter. How can I solve this? Can't BULK INSERT use the ' signs to know where a text field ends?

 

Assisted Solution

by:jlilley
jlilley earned 150 total points
ID: 13730774
Handle NULL by omitting the value, e.g. instead of:
A,B,C,NULL,D
use
A,B,C,,D

Columns containing delimiters must be quoted, for example:
A,B,list,of,words,C,D
becomes:
A,B,'list,of,words',C,D

Note that BULK INSERT has no option to specify a quote character, so to handle quoted fields you may need a bcp format file, or a field terminator that never appears in the data.

Finally, when you quote values containing the quote character, you have to double the quote character wherever it's found in the value, e.g.:
A,B,bob's garage,anytown USA,C,D
becomes
A,B,'bob''s garage,anytown USA',C,D
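Those quoting rules are exactly what a CSV writer applies automatically. A small Python illustration using a single-quote as the quote character (whether the loader will then honour those quotes is a separate question, per the caveat above):

```python
import csv
import io

def to_csv_line(values):
    """Format one row: None becomes an empty field (NULL), fields containing
    the delimiter or quote character are wrapped in single quotes, and
    embedded single quotes are doubled."""
    buf = io.StringIO()
    writer = csv.writer(buf, quotechar="'", quoting=csv.QUOTE_MINIMAL)
    writer.writerow(["" if v is None else v for v in values])
    return buf.getvalue().rstrip("\r\n")
```

So `to_csv_line(["A", "B", None, "D"])` yields `A,B,,D`, and a value like `bob's garage, anytown USA` comes out quoted with its apostrophe doubled, matching the examples above.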
