Solved

SQL stored procedure and records "in use"

Posted on 2016-08-25
44 Views
Last Modified: 2016-08-25
I have a VB app that pulls email addresses with [status] = 0 from our subscriber database and sends each as an SMTP email to a third-party API.
When a message is sent successfully, the app updates the SQL table to [status] = 1.

By itself, the app handles about 12,000 records an hour.

I need to double that...

I know I can add separate tasks, or even run another replica of the same app.

Here is my issue:
If one app pulls 10,000 addresses from the SQL table via my stored procedure, how do I make sure the replica app (or second task) doesn't pull the same emails into its own 10,000-address recordset?
0
Question by:lrbrister
10 Comments
 
LVL 47

Expert Comment

by:Vitor Montalvão
ID: 41770272
How do I make sure the replica app or second task doesn't pull the same emails into its own 10,000 addresses recordset?
If you work inside a transaction, the selected records are locked and no other process can access them until you release the lock with a COMMIT or ROLLBACK.
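
A minimal sketch of that approach (dbo.Subscribers and its columns are assumed names here, not the asker's actual schema). Adding READPAST alongside UPDLOCK lets a second concurrent caller skip over the rows the first caller has locked, so each process claims a disjoint batch:

```sql
-- Hypothetical sketch: two concurrent callers each claim a disjoint batch.
-- Assumes dbo.Subscribers(Id, EmailAddress, [Status]) where Status = 0 means "not yet sent".
BEGIN TRANSACTION;

SELECT TOP (10000)
       s.Id,
       s.EmailAddress
FROM dbo.Subscribers AS s WITH (UPDLOCK, ROWLOCK, READPAST)  -- lock the claimed rows; skip rows another caller already holds
WHERE s.[Status] = 0;

-- ... send the emails, then mark them sent ...
-- UPDATE dbo.Subscribers SET [Status] = 1 WHERE Id IN (...);

COMMIT;  -- releases the locks
```

One caveat: holding the transaction open for the duration of 10,000 SMTP sends would block other work, so in practice you would claim the rows quickly and commit before sending.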
0
 

Author Comment

by:lrbrister
ID: 41770276
So... let's say there are 15,000 records, and two processes call the same stored procedure, which gets 10,000 records at a time.
The first app runs and gets the first 10,000 records.

The second app runs 2 seconds later, skips the first 10,000 records, and just pulls the remaining 5,000?
0
 
LVL 33

Expert Comment

by:ste5an
ID: 41770278
12k rows is a small workload for SQL Server. No need for a replica.

You need to look at the bottleneck: can that third-party API handle this? How long does your current program need to send one email? How large is each email? Do you have the necessary bandwidth?

Based on that information and those numbers, my first attempt would be to distribute the load over more SMTP sender processes; my developer instinct says that is exactly where the work is needed. Often a simple round-robin does the trick:

Just select rows by user ID modulo number_of_running_senders, and give each process a unique counter.
0
 

Author Comment

by:lrbrister
ID: 41770285
ste5an
There are usually 300-400k records to run.

The third-party API can handle pretty much whatever we send it... supposedly.

However...
Each of these emails is sent with a unique HTML content body for that particular individual.
As each record is pulled in, there is a "Replace" step on the VB side... and the HTML can be fairly extensive.
Then the record is sent.

That's why the rate is low.
0
 
LVL 12

Expert Comment

by:Nakul Vachhrajani
ID: 41770294
You basically need a look-up/log table where the first process logs the Ids for the addresses being fetched.

When the replica app runs, it should check (by looking at the log table) that it is processing only values that the other app has not processed.

NOTE: Depending upon your implementation, you may want to clear out/truncate the log table once all E-mails have been sent.

Here's a quick sample.

USE tempdb;
GO
--Safety Check
IF OBJECT_ID('dbo.AddressTable','U') IS NOT NULL
BEGIN
    DROP TABLE dbo.AddressTable;
END
GO

--Safety Check
IF OBJECT_ID('dbo.LogTable','U') IS NOT NULL
BEGIN
    DROP TABLE dbo.LogTable;
END
GO

--Safety Check
IF OBJECT_ID('dbo.proc_SendEmails','P') IS NOT NULL
BEGIN
    DROP PROCEDURE dbo.proc_SendEmails;
END
GO

--Create required objects
CREATE TABLE dbo.AddressTable 
       ([PersonId]     INT           NOT NULL,
        [EmailAddress] NVARCHAR(255) NOT NULL,
        [Status]       BIT           NOT NULL
       );
GO

CREATE TABLE dbo.LogTable
      ([PersonId]    INT      NOT NULL,
       [ProcessDate] DATETIME NOT NULL
      );
GO
  
--Assume this SP is sending the E-mails                       
CREATE PROCEDURE dbo.proc_SendEmails 
        @batchSize INT
AS
BEGIN
    SET NOCOUNT ON;

    CREATE TABLE #someTable ([DummyValue] INT NULL,
                             [DummyDate]  DATETIME NULL
                            )

    --When you fetch the E-mails to process,
    --log the Ids into your log table
    INSERT INTO #someTable (DummyValue, DummyDate)
        OUTPUT inserted.DummyValue, 
               inserted.DummyDate  
        INTO dbo.LogTable ([PersonId], [ProcessDate])
    SELECT TOP (@batchSize) 
           at.PersonId AS DummyValue,
           GETDATE()   AS DummyDate
    FROM dbo.AddressTable AS at
    LEFT OUTER JOIN dbo.LogTable AS lt ON at.PersonId = lt.PersonId
    WHERE at.[Status] = 1
      AND at.PersonId IS NOT NULL
      AND lt.PersonId IS NULL;
END
GO

--Generate some test data. NOTE: I have taken test data from the AdventureWorks sample database
USE tempdb;
GO
INSERT INTO dbo.AddressTable (PersonId, EmailAddress, [Status])
SELECT ea.BusinessEntityID,
       ea.EmailAddress,
       CASE (ea.BusinessEntityID % 2) WHEN 1 THEN 1 ELSE 0 END AS [Status]
FROM AdventureWorks2012.Person.EmailAddress AS ea;
GO

--Now, keep running the SP and observe the values logged in dbo.LogTable
USE tempdb;
GO
EXEC dbo.proc_SendEmails @batchSize = 5;
GO


0
 
LVL 26

Expert Comment

by:Shaun Kline
ID: 41770311
If status is an integer data type, you could set the status to a value other than 0 or 1 to indicate the record is being processed when you retrieve the records. After the email is sent you could then set the status to 1.
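
A sketch of that idea (dbo.Subscribers and its columns are assumed names, not the asker's actual schema): an UPDATE with an OUTPUT clause claims the batch and returns it in one atomic statement, so two concurrent callers cannot grab the same rows:

```sql
-- Hypothetical sketch of the "claim by status value" idea.
-- Assumes [Status] is an INT: 0 = unsent, 1 = sent, 2 = being processed.
UPDATE TOP (10000) s
SET s.[Status] = 2                           -- claim the batch atomically
OUTPUT inserted.Id, inserted.EmailAddress    -- hand the claimed rows back to the app
FROM dbo.Subscribers AS s
WHERE s.[Status] = 0;

-- After a successful send, the app marks each row done:
-- UPDATE dbo.Subscribers SET [Status] = 1 WHERE Id = @Id;
```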
0
 
LVL 69

Accepted Solution

by:
Scott Pletcher earned 300 total points
ID: 41770351
Don't "corrupt" the existing status column. Doing so confuses the meaning of that column and is less clear overall. Instead, add another column (or columns) to indicate "row has been pulled for processing but has not been processed yet".

This could be a simple bit flag, and/or a datetime of when it was pulled (NULL = not pulled yet), and/or an identifier for what pulled it, etc.

Then code the pull query to skip any rows that are marked as already pulled.

You could consider adding a filtered index on those pull-marking columns if the base table is very large, to cut down the cost of scanning for non-pulled rows.
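
A hedged sketch of what those extra pull-marking columns and the filtered index might look like (again, dbo.Subscribers and the column names are assumptions for illustration):

```sql
-- Hypothetical sketch: extra columns that mark a row as pulled, plus a filtered index.
ALTER TABLE dbo.Subscribers
    ADD pull_date     DATETIME         NULL,  -- NULL = not pulled yet
        pull_batch_id UNIQUEIDENTIFIER NULL;  -- which process claimed the row

-- The filtered index covers only unpulled, unsent rows, so the scan
-- for the next batch stays cheap even when the base table is large.
CREATE NONCLUSTERED INDEX IX_Subscribers_NotPulled
    ON dbo.Subscribers (Id)
    WHERE pull_date IS NULL AND [Status] = 0;
```

The pull query would then filter on `pull_date IS NULL`, which matches the filtered index exactly.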
1
 
LVL 33

Assisted Solution

by:ste5an
ste5an earned 200 total points
ID: 41770391
How do I make sure the replica app or second task doesn't pull the same emails into its own 10,000 addresses recordset?

Round-robin:

DECLARE @Sample TABLE
    (
      UserID INT ,
      EmailAddress VARCHAR(255)
    );

INSERT  INTO @Sample
VALUES  ( 1, '' ),
        ( 2, '' ),
        ( 3, '' ),
        ( 4, '' ),
        ( 5, '' ),
        ( 7, '' );


DECLARE @NumberOfApps INT = 3;

SELECT  * ,
        S.UserID % @NumberOfApps AS AppID
FROM    @Sample S;


1
 

Author Closing Comment

by:lrbrister
ID: 41770399
A combination of these looks like the best approach, and the easiest to follow.
0
 
LVL 69

Expert Comment

by:Scott Pletcher
ID: 41770466
FYI, you can mark the rows and pull data from them in the same statement:


DECLARE @unique_id_for_this_process_batch uniqueidentifier
SET @unique_id_for_this_process_batch = NEWID()

/* Mark rows as in-process while pulling details from them so that they can actually be processed. */
UPDATE TOP (10000) tn
SET process_start_date = GETDATE() /*, process_id = @unique_id_for_this_process_batch */
OUTPUT @unique_id_for_this_process_batch,
     INSERTED.process_start_date, INSERTED.id, ...<other_columns_needed_for_processing>
    INTO dbo.inprocess_data ( <column_names> )
FROM dbo.table_name tn
WHERE
    tn.process_start_date IS NULL

<code to process rows in "dbo.inprocess_data" table for the current unique id>
1
