
bulk insert keeps blowing the log

SQL Server 2000. I bulk insert into a view in order to process some data, then push it to another table. The content of that data file was changed last night, so I'm trying to bulk it into the table, make sure all the formatting is correct, and then approve the same change for production.

It keeps blowing the log file:
Server: Msg 9002, Level 17, State 6, Line 1
The log file for database 'databasename' is full. Back up the transaction log for the database to free up some log space.
Note: Bulk Insert through a view may result in base table default values being ignored for NULL columns in the data file.
The statement has been terminated.

This is just a dev box and just a test file, not close to production capacity, and the box is completely idle except for me. I've got 2 GB data, 2 GB log (ample size for this effort). This particular @sql is part of my production logic, which is in place and has been running for years without failure. Until now.

I didn't have TABLOCK or ROWS_PER_BATCH in there before, so I put them in; no difference, it keeps puking. Curious: is there any way to batch through this differently than I am now?

SELECT @sql =
'BULK INSERT dbo.viewname FROM ''' + @inputfile + '''
 WITH (FIRSTROW = 2, FIELDTERMINATOR = '','', TABLOCK,
       ROWS_PER_BATCH = 50000, ROWTERMINATOR = ''' + NCHAR(10) + ''')'
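(One alternative, as a sketch rather than anything from the original post: ROWS_PER_BATCH is only a hint to the optimizer and does not change how the load is committed, but BATCHSIZE commits every N rows as its own transaction, so with SIMPLE recovery the log space can be reused between batches. The 50000 is illustrative.)

-- sketch: BATCHSIZE commits each batch as its own transaction,
-- letting a SIMPLE-recovery log truncate between batches
SELECT @sql =
'BULK INSERT dbo.viewname FROM ''' + @inputfile + '''
 WITH (FIRSTROW = 2, FIELDTERMINATOR = '','', TABLOCK,
       BATCHSIZE = 50000, ROWTERMINATOR = ''' + NCHAR(10) + ''')'
EXEC (@sql)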
Asked by dbaSQL
2 Solutions
 
chapmandew commented:
Back up the transaction log, then shrink it... OR set your recovery model to SIMPLE:

-- back up (and thereby truncate) the log, then shrink the physical log file
backup log dbname to disk = 'c:\logfile.bak'
dbcc shrinkfile('dbname_log', 0)
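(For the second option, a sketch of the recovery-model switch; dbname is a placeholder, SQL 2000 syntax assumed:)

-- switch to SIMPLE recovery, checkpoint so the inactive log can be
-- reused, then shrink the physical file
ALTER DATABASE dbname SET RECOVERY SIMPLE
USE dbname
CHECKPOINT
DBCC SHRINKFILE('dbname_log', 0)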
 
dbaSQL (author) commented:
Sorry, chap, I should have provided more detail. I have already done both of those, several times. The db is in SIMPLE recovery mode.
 
chapmandew commented:
How much free space do you have on the drive where your transaction log is located? You may have to break your file up into smaller files for insert purposes.
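(Two quick ways to check from T-SQL, as a sketch; xp_fixeddrives is undocumented but present in SQL 2000:)

EXEC master..xp_fixeddrives   -- free MB per fixed drive
DBCC SQLPERF(LOGSPACE)        -- log size and percent used, per database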

 
dbaSQL (author) commented:
Unfortunately, that's what I was thinking, too. I'm down to 9 MB on the data drive (it's dev, no panic :-) ).
With my logic as is, that's pretty much the only way to control the flow on the bulk insert, yes?
 
chapmandew commented:
If you're down to 9 MB, then that is your problem. And yes.
 
Guy Hengel [angelIII / a3] (Billing Engineer) commented:
What recovery model is your db in? If it's FULL, change to SIMPLE.
 
dbaSQL (author) commented:
It is SIMPLE.
 
Guy Hengel [angelIII / a3] (Billing Engineer) commented:
Two questions:
* Does the view have an INSTEAD OF trigger? (One way to check is sketched below.)
* What is the view code?
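(A sketch of one way to answer the first question from the SQL 2000 system tables; dbo.viewname is a placeholder:)

-- list any triggers defined on the view
SELECT name FROM sysobjects
WHERE type = 'TR' AND parent_obj = OBJECT_ID('dbo.viewname')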
 
chapmandew commented:
This is odd... by default, the BULK INSERT statement will not fire triggers unless you use the FIRE_TRIGGERS clause of the statement.
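(For reference, a sketch of where that clause goes; the target name, file path, and other options are illustrative:)

BULK INSERT dbo.tablename FROM 'c:\data.txt'
WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', FIRE_TRIGGERS)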
 
dbaSQL (author) commented:
The view is pretty simple -- I just use it to load the table via bulk insert, because the columns in the file don't match the table.

create view viewname as
select [OrderNo], [MsgSource], [MsgType], [MsgIndex], [TimestampDate], [TimestampMS]
from Database.dbo.tablename
 
dbaSQL (author) commented:
With a smaller file, it works fine. I will keep TABLOCK and ROWS_PER_BATCH in there going forward, but I think it was just a matter of too much data for too small a resource (dev bed).

Thank you both for looking.
Question has a verified solution.
