dbaSQL
bulk insert keeps blowing the log
sql v2k, i bulk insert into a view in order to process some data, then push it to another table. the content of that 'data' was changed last night, so i'm trying to bulk it into the table, ensure all formatting is correct, and then approve the same change for production
it keeps blowing the log file --
Server: Msg 9002, Level 17, State 6, Line 1
The log file for database 'databasename' is full. Back up the transaction log for the database to free up some log space.
Note: Bulk Insert through a view may result in base table default values being ignored for NULL columns in the data file.
The statement has been terminated.
this is just a dev box, and just a test file, not close to production capacity, but it's completely idle except for me, i've got 2G data, 2G log (ample size for this effort). this particular @sql is a part of my production logic which is in place and has been running for years w/out failure. until now.
i didn't have the TABLOCK or ROWS_PER_BATCH in there before, so I put it in, no difference, it keeps puking. curious, is there any way to batch through this differently than I am now?
SELECT @sql =
'BULK INSERT dbo.viewname FROM ''' + @inputfile + ''' WITH (FIRSTROW = 2, FIELDTERMINATOR = '','', TABLOCK, ROWS_PER_BATCH = 50000, ROWTERMINATOR = ''' + NCHAR(10) + ''')'
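On the "batch through this differently" question: ROWS_PER_BATCH is only a hint to the optimizer, while BATCHSIZE actually commits every n rows as its own transaction, so under the simple recovery model the log can be reused between batches instead of holding the whole load. A minimal sketch, reusing the same @sql and @inputfile variables from the code above (the 10000 is just an illustrative batch size):
-- sketch only: BATCHSIZE breaks the load into separately committed batches,
-- limiting how much active log a single BULK INSERT can pin
SELECT @sql =
'BULK INSERT dbo.viewname FROM ''' + @inputfile + ''' WITH (FIRSTROW = 2, FIELDTERMINATOR = '','', ROWTERMINATOR = ''' + NCHAR(10) + ''', TABLOCK, BATCHSIZE = 10000)'
EXEC (@sql)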
ASKER
sorry, chap, i should have provided more detail. i have already done both of those, several times. the db is in simple recovery mode
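For reference, a quick way to confirm that on SQL 2000 ('databasename' below is a placeholder for the actual dev database). Worth noting that even under simple recovery, a BULK INSERT without BATCHSIZE is one transaction, so the log still has to hold it until the statement commits:
-- returns SIMPLE, FULL, or BULK_LOGGED
SELECT DATABASEPROPERTYEX('databasename', 'Recovery')
-- switch it if it turns out not to be simple
ALTER DATABASE databasename SET RECOVERY SIMPLE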
ASKER
unfortunately, that's what i was thinking, too. i'm down to 9mb on the data drive (it's dev, no panic :-) )
my logic as is, that's pretty much the only way to control the flow on the bulk insert, yes?
If you're down to 9mb then that is your problem.
yes
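Since the data drive is nearly full, it may be worth double-checking free space from inside SQL Server as well. This assumes the undocumented xp_fixeddrives extended procedure, which ships with SQL 2000:
-- lists each fixed drive letter and its free space in MB
EXEC master..xp_fixeddrives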
ASKER
it is simple
2 questions:
* does the view have an INSTEAD OF trigger?
* what is the view code?
This is odd... by default, the BULK INSERT statement will not fire triggers unless you use the FIRE_TRIGGERS clause.
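For completeness, if the base table's triggers did need to run during the load, the clause would be added to the WITH list like this (sketch only, reusing the view and file variables from the earlier snippet):
SELECT @sql =
'BULK INSERT dbo.viewname FROM ''' + @inputfile + ''' WITH (FIRSTROW = 2, FIELDTERMINATOR = '','', ROWTERMINATOR = ''' + NCHAR(10) + ''', FIRE_TRIGGERS)'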
ASKER
The view is pretty simple -- i just use it to load the table via bulk insert, w/columns in the file that don't match the table.
create view viewname as
select [OrderNo],[MsgSource],[Msg Type],[Msg Index],[TimestampDate],[TimestampMS] FROM Database.dbo.tablename
ASKER
with a smaller file, it works fine. i will keep TABLOCK and ROWS_PER_BATCH in there going forward
but i think it was just a matter of too much data for too small a resource (dev bed)
thank you both for looking
backup log dbname to disk = 'c:\logfile.bak'
dbcc shrinkfile('dbname_log', 0)
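After the backup and shrink (the database and log file names above are placeholders), log usage across the instance can be verified with:
-- reports log size and percent of the log in use for every database
DBCC SQLPERF(LOGSPACE)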