Solved

SQL Stored Procedure and records in "use"

Posted on 2016-08-25
37 Views
Last Modified: 2016-08-25
I have a VB app that pulls emails from our subscribed-users database where [status] = 0 and sends each one as an SMTP email to a 3rd-party API.
As each is successfully sent, the app updates the SQL table to [status] = 1.

By itself it handles about 12,000 records an hour.

I need to double that ...

I know I can add separate tasks or even run another replica of the same app.

Here is my issue...
If I have one app pulling 10,000 addresses from the SQL table via my stored procedure,

how do I make sure the replica app or second task doesn't pull the same emails into its own 10,000-address recordset?
Question by:lrbrister
10 Comments
 
LVL 45

Expert Comment

by:Vitor Montalvão
How do I make sure the replica app or second task doesn't pull the same emails into its own 10,000-address recordset?
If you work inside a transaction, the records will be locked and no other process can access them until you release the lock with a COMMIT or ROLLBACK command.
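As a sketch of that idea (table and column names are assumed from the question, not from your schema), the common T-SQL pattern combines UPDLOCK with READPAST so a second caller skips locked rows instead of blocking on them:

```sql
-- Sketch only: dbo.Subscribers and its columns are assumed names.
BEGIN TRANSACTION;

-- UPDLOCK holds the selected rows until COMMIT/ROLLBACK;
-- READPAST makes a concurrent caller skip rows another
-- transaction has already locked, rather than wait for them.
SELECT TOP (10000) EmailAddress
FROM dbo.Subscribers WITH (UPDLOCK, READPAST)
WHERE [status] = 0;

-- ...send the emails, set [status] = 1 on success...

COMMIT TRANSACTION;
```

Note the lock is only held while the transaction is open, so a long-running send loop would keep rows locked for the whole batch.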
 

Author Comment

by:lrbrister
So... let's say there are 15,000 records.
Two processes call the same stored procedure, which gets 10,000 records at a time.
The first app runs and gets the first 10,000 records.

The second app runs two seconds later — will it skip the first 10,000 records and pull only the remaining 5,000?
 
LVL 32

Expert Comment

by:Stefan Hoffmann
12k rows an hour is very little for SQL Server. No need for a replica.

You need to look at the bottleneck: Can that 3rd-party API handle this? How long does your current program need to send one mail? How large is each email? Do you have the necessary bandwidth?

After looking at that information and those numbers, my first attempt would be to distribute the load over more SMTP sender processes, because my developer sense tells me that's where something needs to be done. Often a simple round-robin does the trick:

Just select rows by UserID modulo number_of_running_senders and give each process a unique counter.
 

Author Comment

by:lrbrister
ste5an
There are usually 300-400k records to run.

The third-party API can handle pretty much whatever we send it... supposedly.

However...
Each of these emails is sent with a unique HTML content body for that particular individual.
As each record is pulled in, there is a "REPLACE" on the VB side... the HTML can be fairly extensive.
Then the record is sent.

That's the reason for the low rate.
 
LVL 11

Expert Comment

by:Nakul Vachhrajani
You basically need a look-up/log table where the first process logs the IDs of the addresses being fetched.

When the replica app runs, it should check (by looking at the log table) that it is processing only values the other app has not processed.

NOTE: Depending upon your implementation, you may want to clear out/truncate the log table once all E-mails have been sent.

Here's a quick sample.

USE tempdb;
GO
--Safety Check
IF OBJECT_ID('dbo.AddressTable','U') IS NOT NULL
BEGIN
    DROP TABLE dbo.AddressTable;
END
GO

--Safety Check
IF OBJECT_ID('dbo.LogTable','U') IS NOT NULL
BEGIN
    DROP TABLE dbo.LogTable;
END
GO

--Safety Check
IF OBJECT_ID('dbo.proc_SendEmails','P') IS NOT NULL
BEGIN
    DROP PROCEDURE dbo.proc_SendEmails;
END
GO

--Create required objects
CREATE TABLE dbo.AddressTable 
       ([PersonId]     INT           NOT NULL,
        [EmailAddress] NVARCHAR(255) NOT NULL,
        [Status]       BIT           NOT NULL
       );
GO

CREATE TABLE dbo.LogTable
      ([PersonId]    INT      NOT NULL,
       [ProcessDate] DATETIME NOT NULL
      );
GO
  
--Assume this SP is sending the E-mails                       
CREATE PROCEDURE dbo.proc_SendEmails 
        @batchSize INT
AS
BEGIN
    SET NOCOUNT ON;

    CREATE TABLE #someTable ([DummyValue] INT NULL,
                             [DummyDate]  DATETIME NULL
                            )

    --When you fetch the E-mails to process,
    --log the Ids into your log table
    INSERT INTO #someTable (DummyValue, DummyDate)
        OUTPUT inserted.DummyValue, 
               inserted.DummyDate  
        INTO dbo.LogTable ([PersonId], [ProcessDate])
    SELECT TOP (@batchSize) 
           at.PersonId AS DummyValue,
           GETDATE()   AS DummyDate
    FROM dbo.AddressTable AS at
    LEFT OUTER JOIN dbo.LogTable AS lt ON at.PersonId = lt.PersonId
    WHERE at.[Status] = 1
      AND at.PersonId IS NOT NULL
      AND lt.PersonId IS NULL;
END
GO

--Generate some test data. NOTE: I have taken test data from the AdventureWorks sample database
USE tempdb;
GO
INSERT INTO dbo.AddressTable (PersonId, EmailAddress, [Status])
SELECT ea.BusinessEntityID,
       ea.EmailAddress,
       CASE (ea.BusinessEntityID % 2) WHEN 1 THEN 1 ELSE 0 END AS [Status]
FROM AdventureWorks2012.Person.EmailAddress AS ea;
GO

--Now, keep running the SP and observe the values logged in dbo.LogTable
USE tempdb;
GO
EXEC dbo.proc_SendEmails @batchSize = 5;
GO


 
LVL 25

Expert Comment

by:Shaun Kline
If [status] is an integer data type, you could set it to a value other than 0 or 1 (e.g. 2) to mark the record as being processed when you retrieve it. After the email is sent, you would then set the status to 1.
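A minimal sketch of claiming a batch this way (table and column names assumed from the question). UPDATE with an OUTPUT clause marks the rows and returns the claimed addresses in one atomic statement, so two concurrent callers cannot claim the same rows:

```sql
-- Sketch only: dbo.Subscribers is an assumed table name.
-- Atomically flip status 0 -> 2 ("in process") for one batch
-- and return the claimed addresses to the caller.
UPDATE TOP (10000) s
SET s.[status] = 2
OUTPUT inserted.EmailAddress
FROM dbo.Subscribers AS s
WHERE s.[status] = 0;

-- After each email is sent successfully, the app would then
-- set that row's [status] = 2 -> 1.
```

One caveat with overloading the column: if a sender crashes mid-batch, rows stuck at status 2 need a cleanup step to return them to 0.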
 
LVL 69

Accepted Solution

by:
ScottPletcher earned 300 total points
Don't "corrupt" the existing status column. It confuses the meaning of that column and is less clear overall. Instead, add another column (or columns) to indicate "row has been pulled for processing but has not been processed yet."

This could be a simple bit flag, and/or a datetime of when it was pulled (NULL = not pulled yet), and/or an identifier for what pulled it, etc.

Then code the pull query to skip any rows that are marked as already pulled.

If the base table is very large, you could consider adding a filtered index on those pull-marking columns to cut down the cost of scanning for non-pulled rows.
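For illustration, a filtered index on the pull-marking column might look like this (table, column, and index names are assumed, matching the pull-date idea above). The index contains only the not-yet-pulled rows, so it stays small even when the base table holds millions of processed rows:

```sql
-- Sketch only: dbo.Subscribers, process_start_date, and the
-- index name are assumed for illustration.
CREATE NONCLUSTERED INDEX IX_Subscribers_NotPulled
    ON dbo.Subscribers (PersonId)
    INCLUDE (EmailAddress)
    WHERE process_start_date IS NULL;
```

The pull query's WHERE clause must repeat the same `process_start_date IS NULL` predicate for the optimizer to use the filtered index.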
 
LVL 32

Assisted Solution

by:Stefan Hoffmann
Stefan Hoffmann earned 200 total points
How do I make sure the replica app or second task doesn't pull the same emails into its own 10,000-address recordset?

Round-robin:

DECLARE @Sample TABLE
    (
      UserID INT ,
      EmailAddress VARCHAR(255)
    );

INSERT  INTO @Sample
VALUES  ( 1, '' ),
        ( 2, '' ),
        ( 3, '' ),
        ( 4, '' ),
        ( 5, '' ),
        ( 7, '' );


DECLARE @NumberOfApps INT = 3;

SELECT  * ,
        S.UserID % @NumberOfApps AS AppID
FROM    @Sample S;


 

Author Closing Comment

by:lrbrister
A combination of these looks like the best approach and the easiest to follow.
 
LVL 69

Expert Comment

by:ScottPletcher
FYI, you can mark the rows and pull data from them in the same statement:


DECLARE @unique_id_for_this_process_batch uniqueidentifier;
SET @unique_id_for_this_process_batch = NEWID();

/* Mark rows as in-process while pulling details from them so that they can actually be processed. */
UPDATE TOP (10000) tn
SET process_start_date = GETDATE() /*, process_id = @unique_id_for_this_process_batch */
OUTPUT @unique_id_for_this_process_batch,
     INSERTED.process_start_date, INSERTED.id, ...<other_columns_needed_for_processing>
    INTO dbo.inprocess_data ( <column_names> )
FROM dbo.table_name tn
WHERE
    tn.process_start_date IS NULL

<code to process rows in "dbo.inprocess_data" table for the current unique id>
