SQL stored procedure and records in "use"

I have a VB app that pulls emails with [status] = 0 from our subscribed-users database and sends each one via SMTP to a 3rd-party API.
When an email is sent successfully, the app updates the SQL table to [status] = 1.

By itself it handles about 12,000 records an hour.

I need to double that ...

I know I can add separate tasks, or even run another replica of the same app.

Here is my issue...
If I have one app pulling 10,000 addresses from the SQL table via my stored procedure,

how do I make sure the replica app (or second task) doesn't pull the same emails into its own 10,000-address recordset?
Larry Brister (Sr. Developer) asked:
Vitor Montalvão (MSSQL Senior Engineer) commented:
How do I make sure the replica app or second task doesn't pull the same emails into it's own 10,000 addresses recordset?
If you work inside a transaction the records will be locked and no other process can access them until you release the lock with a COMMIT or ROLLBACK command.
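In T-SQL this locking pattern is often made explicit with the UPDLOCK and READPAST table hints, so a second caller skips rows the first caller has locked instead of blocking on them. A minimal sketch, assuming a hypothetical dbo.Subscribers table with the [Status] column from the question:

```sql
BEGIN TRANSACTION;

-- UPDLOCK: hold update locks on the claimed rows until COMMIT/ROLLBACK.
-- READPAST: skip rows already locked by another process's batch,
--           so two concurrent callers get disjoint sets.
SELECT TOP (10000) EmailAddress
FROM dbo.Subscribers WITH (UPDLOCK, READPAST)
WHERE [Status] = 0;

-- ...send the emails from the app, then set [Status] = 1 on the sent rows...

COMMIT TRANSACTION;
```

Note that holding the locks for the whole send of a 10,000-row batch keeps the transaction open a long time, which is why the marker-column approaches below are often preferred.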
Larry Brister (Sr. Developer, Author) commented:
So... let's say there are 15,000 records.
Two processes call the same stored procedure, which gets 10,000 records at a time.
The first app runs and gets the first 10k records.

The second app runs 2 seconds later; will it skip the first 10,000 records and just pull the remaining 5,000?
ste5an (Senior Developer) commented:
12k rows per hour is very little for SQL Server. No need for a replica.

You need to find the bottleneck: Can that 3rd-party API handle the volume? How long does your current program need to send one email? How large is one email? Do you have the necessary bandwidth?

After looking at this information and those numbers, my first attempt would be to distribute the load over more SMTP sender processes, because my developer sense tells me that this is where something needs to be done. Often a simple round-robin does the trick:

Just select rows by UserID modulo number_of_running_senders, and give each process a unique counter.
Larry Brister (Sr. Developer, Author) commented:
There are usually 300-400k records to run.

The third-party API can handle pretty much whatever we send it... supposedly.

Each email is sent with a unique HTML content body for that particular individual.
As each record is pulled in, there is a "REPLACE" on the VB side... and the HTML can be fairly extensive.
Then the record is sent.

That's why the rate is low.
Nakul Vachhrajani (Technical Architect, Capgemini India) commented:
You basically need a look-up/log table where the first process logs the Ids for the addresses being fetched.

When the replica app runs, it should check (by looking at the log table) that it is processing only values that the other app has not processed.

NOTE: Depending upon your implementation, you may want to clear out/truncate the log table once all E-mails have been sent.

Here's a quick sample.

USE tempdb;
--Safety Check
IF OBJECT_ID('dbo.AddressTable','U') IS NOT NULL
    DROP TABLE dbo.AddressTable;

--Safety Check
IF OBJECT_ID('dbo.LogTable','U') IS NOT NULL
    DROP TABLE dbo.LogTable;

--Safety Check
IF OBJECT_ID('dbo.proc_SendEmails','P') IS NOT NULL
    DROP PROCEDURE dbo.proc_SendEmails;
GO

--Create required objects
CREATE TABLE dbo.AddressTable 
       ([PersonId]     INT           NOT NULL,
        [EmailAddress] NVARCHAR(255) NOT NULL,
        [Status]       BIT           NOT NULL
       );

CREATE TABLE dbo.LogTable
      ([PersonId]    INT      NOT NULL,
       [ProcessDate] DATETIME NOT NULL
      );
GO

--Assume this SP is sending the E-mails                       
CREATE PROCEDURE dbo.proc_SendEmails 
        @batchSize INT
AS
BEGIN
    CREATE TABLE #someTable ([DummyValue] INT NULL,
                             [DummyDate]  DATETIME NULL
                            );

    --When you fetch the E-mails to process,
    --log the Ids into your log table
    INSERT INTO #someTable (DummyValue, DummyDate)
        OUTPUT inserted.DummyValue, 
               inserted.DummyDate
        INTO dbo.LogTable ([PersonId], [ProcessDate])
    SELECT TOP (@batchSize) 
           at.PersonId AS DummyValue,
           GETDATE()   AS DummyDate
    FROM dbo.AddressTable AS at
    LEFT OUTER JOIN dbo.LogTable AS lt ON at.PersonId = lt.PersonId
    WHERE at.[Status] = 1
      AND at.PersonId IS NOT NULL
      AND lt.PersonId IS NULL;
END;
GO

--Generate some test data. NOTE: I have taken test data from the AdventureWorks sample database
USE tempdb;
INSERT INTO dbo.AddressTable (PersonId, EmailAddress, [Status])
SELECT ea.BusinessEntityID,
       ea.EmailAddress,
       CASE (ea.BusinessEntityID % 2) WHEN 1 THEN 1 ELSE 0 END AS [Status]
FROM AdventureWorks2012.Person.EmailAddress AS ea;

--Now, keep running the SP and observe the values logged in dbo.LogTable
USE tempdb;
EXEC dbo.proc_SendEmails @batchSize = 5;


Shaun Kline (Lead Software Engineer) commented:
If status is an integer data type, you could set the status to a value other than 0 or 1 to indicate the record is being processed when you retrieve the records. After the email is sent you could then set the status to 1.
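A hedged sketch of that idea (table and column names are assumptions): claiming the rows and reading them back can happen in a single atomic statement, so two processes can never grab the same batch.

```sql
-- Atomically mark up to 10,000 unprocessed rows as "in progress" (status 2)
-- and return them to the caller in one statement.
UPDATE TOP (10000) dbo.Subscribers
SET [Status] = 2
OUTPUT inserted.PersonId, inserted.EmailAddress
WHERE [Status] = 0;

-- After a successful send, mark each row done:
-- UPDATE dbo.Subscribers SET [Status] = 1 WHERE PersonId = @PersonId;
```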
Scott Pletcher (Senior DBA) commented:
Don't "corrupt" the existing status column.  It confuses the meaning of that column and is less clear overall.  Instead, add another column(s) to indicate: "row has been pulled for processing but has not been processed yet".

This could be a simple bit flag and/or a datetime of when it was pulled (NULL = not pulled yet) and/or an identifier for what pulled it, etc..

Then code the pull query to skip any rows that are marked as already pulled.

You could consider adding a filtered index on those pull-marking columns if the base table is very large, to cut down the cost of scanning for non-pulled rows.
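For example, using the pull-marking column suggested above (names assumed), a filtered index covers only the not-yet-pulled rows and so stays small even on a very large base table:

```sql
-- Index only the rows still waiting to be pulled; the pull query's
-- "process_start_date IS NULL" predicate can then seek this index.
CREATE NONCLUSTERED INDEX IX_table_name_unpulled
    ON dbo.table_name (id)
    WHERE process_start_date IS NULL;
```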

ste5an (Senior Developer) commented:
How do I make sure the replica app or second task doesn't pull the same emails into it's own 10,000 addresses recordset?


DECLARE @Sample TABLE (
      UserID INT ,
      EmailAddress VARCHAR(255)
    );

INSERT INTO @Sample (UserID, EmailAddress)
VALUES  ( 1, '' ),
        ( 2, '' ),
        ( 3, '' ),
        ( 4, '' ),
        ( 5, '' ),
        ( 7, '' );

DECLARE @NumberOfApps INT = 3;

SELECT  S.UserID,
        S.EmailAddress,
        S.UserID % @NumberOfApps AS AppID
FROM    @Sample S;
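Each sender process would then filter on its own slot; for example, sender 0 of 3 (with @AppID assumed to be passed in per process) would run:

```sql
DECLARE @NumberOfApps INT = 3,
        @AppID        INT = 0;  -- this sender's slot, unique per process

SELECT  S.UserID,
        S.EmailAddress
FROM    @Sample S
WHERE   S.UserID % @NumberOfApps = @AppID;
```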


Larry Brister (Sr. Developer, Author) commented:
A combination of these looks like the best approach, and the easiest to "follow".
Scott Pletcher (Senior DBA) commented:
FYI, you can mark the rows and pull data from them in the same statement:

DECLARE @unique_id_for_this_process_batch uniqueidentifier
SET @unique_id_for_this_process_batch = NEWID()

/* Mark rows as in-process while pulling details from them so that they can actually be processed. */
UPDATE TOP (10000) tn
SET process_start_date = GETDATE() /*, process_id = @unique_id_for_this_process_batch */
OUTPUT @unique_id_for_this_process_batch,
     INSERTED.process_start_date, INSERTED.id, ...<other_columns_needed_for_processing>
    INTO dbo.inprocess_data ( <column_names> )
FROM dbo.table_name tn
WHERE tn.process_start_date IS NULL

<code to process rows in "dbo.inprocess_data" table for the current unique id>