Solved

Inserts running very slowly from application to database in SQL Server 2012

Posted on 2016-10-11
5
82 Views
Last Modified: 2016-11-09
Hi,

We are facing a slowness issue with our application jobs. The application job inserts records into a SQL Server 2012 table (the table contains 100 million records).

The table has no primary key and no indexes.

Top wait types (LASTWAITTYPE):

WRITELOG: 74%
ASYNC_NETWORK_IO: 15%

Please suggest troubleshooting steps.

Thanks,
Chandra
Question by:Chandra Mohan Kanithi
5 Comments
 
LVL 11

Accepted Solution

by:
Nakul Vachhrajani earned 250 total points
ID: 41839042
Hello!

Why don't you have any primary key and clustered indexes on the table?

Right now you have a heap, which means your SELECTs do full table scans, and your INSERTs and UPDATEs lock up the entire table most of the time due to lock escalations arising from fragmentation.
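You can confirm the state of the heap with a sketch like this (dbo.AppTable is a placeholder for your table name):

```sql
-- Check the heap for forwarded records and fragmentation;
-- SAMPLED mode avoids a full scan of a 100-million-row table
SELECT index_type_desc,
       forwarded_record_count,
       avg_fragmentation_in_percent,
       page_count
FROM sys.dm_db_index_physical_stats(
         DB_ID(), OBJECT_ID(N'dbo.AppTable'), NULL, NULL, 'SAMPLED');
```

A high forwarded_record_count is a heap-specific cost you pay on every read until the table gets a clustered index.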

1. The first step, therefore, would be to determine why you can't have a clustered index on the table. If there is no particular reason, I suggest you define one.
2. Next, check your file growth settings (both for your DB and for tempdb). If the process has to provision additional space from the OS, getting the space provisioned takes time and resources. The solution is a higher initial size and a higher file growth value (in fixed MB/GB, not as a percentage).
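Both steps can be sketched as follows; the table, column, database, and file names are all placeholders you would replace with your own:

```sql
-- 1. Define a clustered index on the heap. On a 100-million-row table this
--    rewrites the whole table and is log-intensive; run it in a maintenance window.
CREATE CLUSTERED INDEX CIX_AppTable_InsertedDate
    ON dbo.AppTable (InsertedDate);

-- 2. Set a larger initial size and a fixed-MB/GB growth increment
--    (percentage growth produces increasingly large, slow grow events).
ALTER DATABASE [YourDb]
    MODIFY FILE (NAME = N'YourDb_data', FILEGROWTH = 1GB);
ALTER DATABASE [YourDb]
    MODIFY FILE (NAME = N'YourDb_log', FILEGROWTH = 512MB);
```

Note that instant file initialization removes the zeroing cost for data-file growth, but log files are always zero-initialized, which is another reason to avoid frequent small log growths.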

Do share your inputs on these thoughts. We can start looking at the code once the design and environment is optimal.
 
LVL 11

Expert Comment

by:Nakul Vachhrajani
ID: 41839061
Also, update the statistics on your table.
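For example (dbo.AppTable is a placeholder; note that a heap with no indexes only carries auto-created column statistics, if any):

```sql
-- Refresh all statistics on the table. WITH FULLSCAN is expensive on
-- 100 million rows; a sample (e.g. WITH SAMPLE 10 PERCENT) may be enough.
UPDATE STATISTICS dbo.AppTable WITH FULLSCAN;
```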
 
LVL 26

Expert Comment

by:Zberteoc
ID: 41839161
1. Make sure that the growth setting for the log file and the data file is set in MB, not percent, and is at least 512 MB or even 1 GB.

2. Make sure that the VLF count in the log file is not greater than 1000. If it is, you will have to change it. You can use the script below to check the VLF count for all the databases on your server. The commented-out bottom part is to be used to change the VLF count:
-- modify size:
-- ALTER DATABASE [CustomerDataStore] MODIFY FILE ( NAME = N'Accounts_log', SIZE = 16777216KB , FILEGROWTH = 524288KB )


-- ##### SQL Server 2012; for versions before 2012, remove the RecoveryUnitId column below
DECLARE @query  varchar(1000),
        @dbname varchar(1000),
        @count  int

SET NOCOUNT ON

DECLARE csr CURSOR FAST_FORWARD READ_ONLY
FOR
SELECT name
FROM sys.databases

CREATE TABLE #loginfo
(
    dbname varchar(100),
    VLFs   int
)

OPEN csr

FETCH NEXT FROM csr INTO @dbname

WHILE (@@fetch_status <> -1)
BEGIN

    CREATE TABLE #log_info
    (
        RecoveryUnitId tinyint, -- column added to DBCC LOGINFO output in SQL Server 2012
        fileid         tinyint,
        file_size      bigint,
        start_offset   bigint,
        FSeqNo         int,
        [status]       tinyint,
        parity         tinyint,
        create_lsn     numeric(25, 0)
    )

    SET @query = 'DBCC loginfo (''' + @dbname + ''')'

    INSERT INTO #log_info
    EXEC (@query)

    SET @count = @@rowcount

    DROP TABLE #log_info

    INSERT #loginfo
    VALUES (@dbname, @count)

    FETCH NEXT FROM csr INTO @dbname

END

CLOSE csr
DEALLOCATE csr

CREATE TABLE #logUse
(
    dbname     varchar(100),
    LogSizeMB  decimal(10, 2),
    LogUsePerc decimal(10, 2),
    Stat       bit
)
INSERT INTO #logUse EXEC ('DBCC sqlperf(logspace)')

SELECT
    i.dbname,
    LogSizeMB,
    LogUsePerc,
    VLFs
FROM
    #loginfo i
    INNER JOIN #logUse u
        ON u.dbname = i.dbname
--WHERE VLFs >= 50 -- My rule of thumb is 50 VLFs. Your mileage may vary.
ORDER BY VLFs DESC

DROP TABLE #loginfo
DROP TABLE #logUse


/*
-- ##### !!! DO NOT RUN DURING PEAK HOURS !!! #####
-- ##### How to change VLF number

USE <db_name> --Set db name before running using drop-down above or this USE statement

DECLARE @file_name sysname,
@file_size int,
@file_growth int,
@shrink_command nvarchar(max),
@alter_command nvarchar(max)

SELECT @file_name = name,
@file_size = (size / 128)
FROM sys.database_files
WHERE type_desc = 'log'

select @file_name,@file_size

SELECT @shrink_command = 'DBCC SHRINKFILE (N''' + @file_name + ''' , 0, TRUNCATEONLY)'
PRINT @shrink_command
EXEC sp_executesql @shrink_command

SELECT @shrink_command = 'DBCC SHRINKFILE (N''' + @file_name + ''' , 0)'
PRINT @shrink_command
EXEC sp_executesql @shrink_command

SELECT @alter_command = 'ALTER DATABASE [' + db_name() + '] MODIFY FILE (NAME = N''' + @file_name + ''', SIZE = ' + CAST(@file_size AS nvarchar) + 'MB)'
PRINT @alter_command
EXEC sp_executesql @alter_command
--*/


To understand VLF numbers, check these two links:

http://www.sqlskills.com/blogs/kimberly/transaction-log-vlfs-too-many-or-too-few/

http://www.sqlskills.com/blogs/paul/important-change-vlf-creation-algorithm-sql-server-2014/

What Nakul mentioned is also important. You will have to create a PK on the table, and make sure you use a column, or a combination of columns, that is ever-increasing, i.e. one that is, or starts with, an ID column or an inserted-date column.
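A minimal sketch of that, assuming the table can take a new identity column (all names here are placeholders):

```sql
-- Add an ever-increasing identity column and make it the clustered
-- primary key, so new rows always land at the end of the table
ALTER TABLE dbo.AppTable
    ADD Id bigint IDENTITY(1, 1) NOT NULL;

ALTER TABLE dbo.AppTable
    ADD CONSTRAINT PK_AppTable
    PRIMARY KEY CLUSTERED (Id);
```

Adding the identity column and building the clustered PK both rewrite the 100-million-row table, so plan for a maintenance window.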
 
LVL 45

Assisted Solution

by:Vitor Montalvão
Vitor Montalvão earned 250 total points
ID: 41846354
We are facing slowness issue with Application Jobs.
(...)
 No Primary key, no indexes on table.
Is the slowness only during inserts?
Mind that indexes are needed for all non-INSERT DML commands. If you only perform INSERTs into the table, then it's acceptable not to have indexes, but then why do you need a table you never query?
 
LVL 45

Expert Comment

by:Vitor Montalvão
ID: 41866663
Chandra, feedback would be appreciated.
