Need to design an efficient SQL transaction log for incoming changes

I need to design a transaction log for the records in this table. I have an Azure function that connects to a remote API and downloads these records on an ongoing basis. I need to track changes in incremental order, with the goal of efficiently being able to see those changes from another Azure function. I can use SPs or triggers, or I'm open to any idea for the design. My most important goal is to see new deposits and match them to awaiting transactions without exploring the blockchain.

[attached screenshot: Capture.JPG]
IF  EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[LiveCoinPaymentBalances]') AND type in (N'U'))
DROP TABLE [dbo].[LiveCoinPaymentBalances]
GO
IF NOT EXISTS (select * from INFORMATION_SCHEMA.tables where TABLE_NAME = 'LiveCoinPaymentBalances')
BEGIN
CREATE TABLE [dbo].[LiveCoinPaymentBalances](
	[LiveCoinPaymentBalancesID] [int] IDENTITY(1,1) NOT NULL,	
	[Type]  NVARCHAR(128) NOT NULL,
	[Currency]  NVARCHAR(128) NOT NULL,
	[Value]  DECIMAL(36,18) NOT NULL,		
	[UTCTimeStamp] DATETIME NOT NULL,
	[TimeStamp] DATETIME NOT NULL,					 
	CONSTRAINT [PK_LiveCoinPaymentBalances] PRIMARY KEY CLUSTERED 
	(
		[LiveCoinPaymentBalancesID] ASC
	))	
END
GO


CAMPzxzxDeathzxzx (Author) asked:

Éric Moreau, Senior .Net Consultant, commented:
Mark Wills, Topic Advisor, commented:
Nice article, Eric.

Temporal Tables are very interesting and sound like they would suit your needs.

There is also this example for Azure: https://docs.microsoft.com/en-gb/azure/sql-database/sql-database-temporal-tables
CAMPzxzxDeathzxzx (Author) commented:
Interesting!  I'm rushing over to my Azure VM to check my SQL version right now.

CAMPzxzxDeathzxzx (Author) commented:
It's 2012. I cannot upgrade SQL without a whole lot of advance planning, for which I do not have the time. There are several websites running from there. Your answer is the sort of solution I am looking for. I'm thinking a trigger will do the same thing. I could use the original SP inserting the data from the API as an insert-or-update, and then use a trigger to collect and push to the history table. Got any better ideas?
Mark Wills, Topic Advisor, commented:
2012, that's a pity... Still, the more traditional approach would be a trigger. I am guessing the SP will insert new and update existing? In which case, can the SP insert all the new transactions into History?
Scott Pletcher, Senior DBA, commented:
I'm extra-super busy today, but

IF it's just a question of time frame:

"My most important goal is to see new deposits and match them to awaiting transactions without exploring the blockchain."

The core issue you have there is that the table is not clustered properly.
Indeed, this is the prototypical example of the performance and maintenance issues when clustering on identity by default, which is a terrible, but oh-so-common, practice.

Cluster the table first on either UTCTimeStamp <or> TimeStamp, whichever you use, or could use, when querying the table.  Use the identity as the second key column to ensure uniqueness.  You can leave the ID alone as a PK; it will just be nonclustered.

ALTER TABLE dbo.LiveCoinPaymentBalances DROP CONSTRAINT PK_LiveCoinPaymentBalances;
CREATE UNIQUE CLUSTERED INDEX LiveCoinPaymentBalances__CL ON dbo.LiveCoinPaymentBalances (
    TimeStamp, LiveCoinPaymentBalancesID ) WITH ( FILLFACTOR = 99, SORT_IN_TEMPDB = ON )
    ON [PRIMARY] /* change filegroup name to match what you need */
    ;
ALTER TABLE dbo.LiveCoinPaymentBalances ADD CONSTRAINT PK_LiveCoinPaymentBalances
    PRIMARY KEY NONCLUSTERED ( LiveCoinPaymentBalancesID ) WITH ( FILLFACTOR = 100, SORT_IN_TEMPDB = ON )
    ON [PRIMARY] /* change filegroup name to match what you need */

SELECT ...
FROM dbo.LiveCoinPaymentBalances
WHERE TimeStamp >= DATEADD(DAY, -14, GETDATE()) /* most recent 2 weeks, or whatever you consider as "current" */
...

Once that's in place, you'll limit your current queries to recent data and it won't matter for performance if you have 10, 20 or 100 years of history in the table.  

Archiving is really a separate issue, and more of a business one than a technical one.
Mark Wills, Topic Advisor, commented:
Scott, they are the same 4 balances per currency, and no doubt continually updated. It is the data being used to update them that represents the transactional 'challenge' - well, from what it sounds like...

Ever played with currency / money markets / cryptocurrency? Fluctuations are very much real-time, and having to traverse a lot of history to get balances could become painful. And a whole lot more (security etc.).

I agree with your statements/comments above for the requirement of transaction/history logging the actual transactions. So your comments would be most appropriate for that part of the problem - not so much for the balances, though, in my opinion.
CAMPzxzxDeathzxzx (Author) commented:
I agree. Let's look at this from the desired result. I will have an Azure function constantly looking for changes and matching those to transactions waiting for deposits. I plan to allow 1 hour to make a deposit or the transaction is voided. No two transactions can specify the same quantity of a given type during that 60 minutes. I know that sounds like a stupid way to match deposits to transactions, but I have no other way to do it without going underneath to the blockchain. So the new transaction calls for depositing 100 BTC into the BTC wallet. The second function starts looking for a deposit of 100 BTC. It changes a status field and the trade transaction continues. The date-time and the quantity of change are the keys for the match! That quantity-change lookup needs to be efficient.
CAMPzxzxDeathzxzx (Author) commented:
Mark - "In which case, can the SP insert all the new transactions into History?" Yes, but would that be really efficient? I only need the changes, with summed changes ready to be examined for matches.
Mark Wills, Topic Advisor, commented:
Isn't there the risk of double spending - the reason for blockchain in the first place?
Scott Pletcher, Senior DBA, commented:
"The date time and the quantity of change are the keys for the match!"

Then cluster first on that date and time. The performance will then be very good no matter what you do from there. The single most important performance element is proper clustering of tables. After that, if necessary, you can consider everything else about performance.
Mark Wills, Topic Advisor, commented:
Well, it wouldn't be really efficient, but the transactions will auto-expire - so they just need a background tidy-up. There would need to be multiple async processes - the SP won't be doing it all, just "putting" the transactions. Another process will retire the voided transactions, another does the matching, etc...

Or am I oversimplifying it?
CAMPzxzxDeathzxzx (Author) commented:
There are 8 functions running without errors at this point. This is only a single exchange now, but it will be multiple exchanges in the future. So yes - I will be rolling through the changes looking for matches and voiding 60+ minute transactions. BTW, for anyone interested, creating Azure functions using VS2017 is my new favorite programming. Easy to build, easy to test and easy to publish. Each function has an error trap that ties everything together, and a db solution for turning them on or off or changing the run timing.
CAMPzxzxDeathzxzx (Author) commented:
This is slightly dated but the result of my database work so far.

[attached diagram: DBTradeDiagram.JPG]
Mark Wills, Topic Advisor, commented:
That is the best endorsement I have heard for Azure functions yet :)

Very early in the morning over here, and have to get some sleep - preferably before my day starts in 3 hours.

Will check back in a while.

EDIT: just saw the db diagram as I was typing, will definitely take a closer look - when I can see properly :)
Scott Pletcher, Senior DBA, commented:
I know you all don't want to hear this, but ...
You won't ever be able to design a truly efficient db with identities automatically clustering every table.  I know you don't want to believe that, but it's the truth.  Every tuning contract I do, the first thing I have to do is figure out the best clustering key(s) for every major table.

Good luck with this.
CAMPzxzxDeathzxzx (Author) commented:
Scott - I'm 100% with you, but I have my ways. I use snippets to create new script objects like tables. That template is always clustering the ID. The tables you see here on EE are almost always in the newest stages of development. I'm in the authoring stages, where I do not look for tuning issues at all. I have to prove the model is right by writing code against it. The ends justify the means - I'm here because I'm just like any other DB/C#/etc. developer - I need things :)

You are a superb DB guy and I love your help, of which you have given me plenty - you rock.
Scott Pletcher, Senior DBA, commented:
But proper data modeling has NOTHING to do with performance. That comes later.

"That template is always clustering the ID."

I know. That's the single most damaging myth in db "design" (it's overly generous to label simply adding an identity column to every table as being part of a "design" process). But I guess because it's so simple to do, everybody does it.

It prevents determining the proper logical structures, enormously complicates any attempt at normalization, and causes horrendous overall performance issues, including wasted indexes, i.e., forced building of "covering" indexes for virtually every major query because the clustered index is so useless for the table's actual needs and uses.  But, yes, it is extremely simple.
CAMPzxzxDeathzxzx (Author) commented:
Scott - do you have advice on the best way for me to accomplish my task? Maybe an insert/update SP with a trigger summing changes into another table?
Mark Wills, Topic Advisor, commented:
*laughing* Or, at the very least Scott, even a suggestion as to what a transaction / history table might look like :)

The current SP maintains the balances table - is that correct ?

Can you describe what your current SP is doing ?
Scott Pletcher, Senior DBA, commented:
Yes.

First, step back from the details of the db-related elements and focus on just the data requirements.

Is the data listed above all of the data being received?  If so, the design won't be too complicated.

You'll still want an identity column, just as a unique row marker, but you don't want to cluster on it first (the ident will be last, just to ensure a unique key).  After examining the processing requirements, we can decide on the leading clus key(s).

All our currencies are 3-char codes, period.  Verify whether your data source is using standard 3-char currency codes exclusively, then change the data type to char(3).  This provides better efficiency for search.

Also for efficiency, you'll want to encode Type, i.e., assign a numeric value to represent each distinct textual value.  I would think a smallint would be plenty (~64K values).  That does mean that when new data comes in, you'll have to check for new types and add them to the "master" types table before fully processing the input.
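A sketch of what that encoding could look like. The table and procedure names here are illustrative (not from this thread), and the check-then-insert pattern would need extra locking to be safe under heavy concurrency:

```sql
-- Hypothetical master table for the textual Type values ('available', 'trade', ...).
CREATE TABLE dbo.BalanceTypes (
    BalanceTypeID SMALLINT IDENTITY(1,1) NOT NULL CONSTRAINT PK_BalanceTypes PRIMARY KEY,
    TypeName      NVARCHAR(128) NOT NULL CONSTRAINT UQ_BalanceTypes_TypeName UNIQUE
);
GO
-- Resolve an incoming type name to its code, adding it to the master table if new.
CREATE PROC dbo.usp_GetBalanceTypeID
    @TypeName      NVARCHAR(128),
    @BalanceTypeID SMALLINT OUTPUT
AS
BEGIN
    SELECT @BalanceTypeID = BalanceTypeID
    FROM dbo.BalanceTypes
    WHERE TypeName = @TypeName;

    IF @BalanceTypeID IS NULL
    BEGIN
        INSERT dbo.BalanceTypes (TypeName) VALUES (@TypeName);
        SET @BalanceTypeID = SCOPE_IDENTITY();
    END
END
GO
```

The balances table would then store the smallint BalanceTypeID instead of the 128-character Type string.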


Now, can you give details of the specific processing you need?

"My most important goal is to see new deposits and match them to awaiting transactions without exploring the blockchain."

What indicates a new deposit? What indicates a waiting transaction? Where is the model/structure of that data?
CAMPzxzxDeathzxzx (Author) commented:
Mark

IF EXISTS (SELECT * FROM SYSOBJECTS WHERE ID = OBJECT_ID(N'[dbo].[usp_AddOrUpdateLiveCoinPaymentBalances]') AND OBJECTPROPERTY(id, N'IsProcedure') = 1)
DROP PROCEDURE [dbo].[usp_AddOrUpdateLiveCoinPaymentBalances];
GO
CREATE PROC [dbo].[usp_AddOrUpdateLiveCoinPaymentBalances]
(			
		@Type  NVARCHAR(128),
		@Currency  NVARCHAR(128),
		@Value  DECIMAL(36,18)			
)		
AS 
BEGIN	
IF NOT EXISTS (SELECT * FROM [dbo].[LiveCoinPaymentBalances] WHERE [Currency] = @Currency)
INSERT INTO [dbo].[LiveCoinPaymentBalances]
     (
			[Type],
			[Currency],
			[Value],
			[UTCTimeStamp],
			[TimeStamp]
	)
	VALUES
	(		
			@Type,
			@Currency,
			@Value,
			GETUTCDATE(),
			GETDATE()	
	)
ELSE
UPDATE 	[dbo].[LiveCoinPaymentBalances]
	SET			
			[Value] = @Value, -- fixed: the original had @Type here (an nvarchar assigned to a decimal column)
			[UTCTimeStamp] = GETUTCDATE(),
			[TimeStamp] = GETDATE()
	WHERE [Currency] = @Currency 
END
GO


CAMPzxzxDeathzxzx (Author) commented:
Scott

"My most important goal is to see new deposits and match them to awaiting transactions without exploring the blockchain."


Here is the code that kicks off the trade.

I will look for SellLiveCoinMarketOrder rows with a TimeStamp older than 60 minutes where SellLiveCoinMarketOrder.DBStatus = "Waiting for Deposit", and I'll set SellLiveCoinMarketOrder.DBStatus = "Cancelled No Deposit".
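That voiding step could be sketched as one set-based statement, assuming the column names and status strings quoted in this thread; TimeStamp is stored as UTC per the C# staging code below, so the comparison uses GETUTCDATE():

```sql
-- Void any order still waiting for its deposit after 60 minutes.
UPDATE dbo.LiveCoinMarketOrder
SET    DBStatus = 'Cancelled No Deposit'
WHERE  DBStatus = 'Waiting for Deposit'
  AND  [TimeStamp] < DATEADD(MINUTE, -60, GETUTCDATE());
```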

I will test to make sure that there is no matching quantity / fiat already waiting for deposit, or I'll flag the end user to change the quantity.

Then I will stage the LiveCoinMarketOrder:

LiveCoinMarketOrder SellLiveCoinMarketOrder = new LiveCoinMarketOrder();
                SellLiveCoinMarketOrder.LiveCoinTradeID = LiveCoinTradeID;
                SellLiveCoinMarketOrder.OrderType = 1;
                SellLiveCoinMarketOrder.ExchangeID = ExchangeID;
                SellLiveCoinMarketOrder.Symbol = InCrypto.Symbol;
                SellLiveCoinMarketOrder.Quantity = decimal.Parse(Quantity);
                SellLiveCoinMarketOrder.TimeStamp = System.DateTime.Now.ToUniversalTime();
                SellLiveCoinMarketOrder.DBStatus = "Waiting for Deposit";
                db.LiveCoinMarketOrder.Add(SellLiveCoinMarketOrder);
                db.SaveChanges();



Next I'm going to create a function that looks for SellLiveCoinMarketOrder.DBStatus = "Waiting for Deposit". Using the new table, I will quickly find any matches of quantity and currency and set SellLiveCoinMarketOrder.DBStatus = "Deposit complete".

Now another function is looking for the DBStatus = "Deposit complete" and the process will proceed.

About the only other thing I can add to this is the DB diagram above where I have everything connected to LiveCoinInfoCluster
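The matching step itself might be sketched like this, using the proposed changes table. The join columns are assumptions - LiveCoinMarketOrder stores a Symbol while the changes table stores a Currency, so those codes would need to be comparable:

```sql
-- Match unconsumed balance changes to orders still waiting for a deposit.
UPDATE o
SET    o.DBStatus = 'Deposit complete'
FROM   dbo.LiveCoinMarketOrder AS o
       INNER JOIN dbo.LiveCoinPaymentBalanceChanges AS c
           ON  c.Currency = o.Symbol    -- assumes Symbol and Currency use the same codes
           AND c.Change   = o.Quantity  -- deposit amount must equal the ordered quantity
           AND c.Matched  = 0
WHERE  o.DBStatus = 'Waiting for Deposit'
  AND  o.[TimeStamp] >= DATEADD(MINUTE, -60, GETUTCDATE()); -- only orders still inside the 1-hour window
```

A second statement (or an OUTPUT clause) would then set Matched = 1 on the consumed change rows so that one deposit cannot satisfy two orders.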
Mark Wills, Topic Advisor, commented:
This is the crucial step: "Using the new table I will quickly find any matches of quantity and currency."

And the details are pretty much the MarketOrder?
CAMPzxzxDeathzxzx (Author) commented:
I'll need to take SellLiveCoinMarketOrder.Symbol = InCrypto.Symbol and SellLiveCoinMarketOrder.Quantity = decimal.Parse(Quantity), get the fiat, and start looking for the quantity deposit.
Mark Wills, Topic Advisor, commented:
Because Quantity is being placed for a nominated symbol, there must also be an associated value, because fiat is physical currency. So the transaction is for quantity and value - i.e. "I want 10 at $100" - so we can't just match on quantity because the value is unknown, is it not? And if not, then we will need to capture that value anyway as part of the transaction (history).

Attributes of the transaction would need to include:
TransactionType
ExchangeID (only 1 at the moment, with plans to expand)
Symbol / Crypto Symbol
Quantity
Value
TimeStamp / UTC (Azure) timestamp
along with sufficient identifiers as to the origins of the transaction (i.e. user/wallet/order).

Are there any other attributes of Transaction?

e.g. do we need to include DBStatus?
Mark Wills, Topic Advisor, commented:
Starting to feel more like the blockchain :)

And it is that time again....
CAMPzxzxDeathzxzx (Author) commented:
OK, here is my first try at the trigger, which got me a lot of errors:

Error converting data type nvarchar to numeric

DROP TRIGGER [dbo].[LiveCoinPaymentBalanceChangesTrigger]
GO
CREATE TRIGGER [dbo].[LiveCoinPaymentBalanceChangesTrigger] ON [dbo].[LiveCoinPaymentBalances]
AFTER INSERT, UPDATE
AS	
	DECLARE @Type  NVARCHAR(128)
	DECLARE @Currency  NVARCHAR(128)
	DECLARE @Value  DECIMAL(36,18)
	
	SELECT @Type = (SELECT Type FROM INSERTED)
	SELECT @Currency = (SELECT Currency FROM INSERTED)
	SELECT @Value = (SELECT Value FROM INSERTED)
	
	DECLARE @OldValue  DECIMAL(36,18)
	SELECT @OldValue = (SELECT [Value] FROM [dbo].[LiveCoinPaymentBalanceChanges] WHERE [Currency] = @Currency)
		
BEGIN
	IF @Type = 'available'
	BEGIN
	IF EXISTS (SELECT * FROM [dbo].[LiveCoinPaymentBalanceChanges] WHERE [Currency] = @Currency AND [Value] <> @Value)
	
	UPDATE 	[dbo].[LiveCoinPaymentBalanceChanges]
	SET			
			[Value] = @Value,
			[Change] = (@Value - @OldValue),			
			[TimeStamp] = GETDATE()
	WHERE [Currency] = @Currency AND [Value] <> @Value 
	END

	BEGIN
	IF NOT EXISTS (SELECT * FROM [dbo].[LiveCoinPaymentBalanceChanges] WHERE [Currency] = @Currency)
	
	INSERT INTO [dbo].[LiveCoinPaymentBalanceChanges]
(			
			[Currency],
			[Value],
			[Change],
			[TimeStamp]
)
VALUES
(		
			@Currency,
			@Value,
			0,
			GETDATE()	
) 
	END
	
END
GO


Anthony Perkins commented:
You are making the wrong assumption that a TRIGGER executes per row; it does not, it executes per statement.
This may be a tad closer to reality (totally untested):
DROP TRIGGER dbo.LiveCoinPaymentBalanceChangesTrigger;
GO
CREATE TRIGGER dbo.LiveCoinPaymentBalanceChangesTrigger
ON dbo.LiveCoinPaymentBalances
AFTER INSERT, UPDATE
AS

BEGIN
        IF EXISTS (
            SELECT 1
            FROM	dbo.LiveCoinPaymentBalanceChanges lcpbc
			INNER JOIN Inserted i ON lcpbc.Currency = i.Currency AND lcpbc.Value <> i.[Value]
		WHERE	[Type] = 'available'
			)
            UPDATE	lcpbc
            SET	[Value] = i.[Value],
			Change = (i.[Value] - lcpbc.Value),
			[TimeStamp] = GETDATE()
            FROM  dbo.LiveCoinPaymentBalanceChanges lcpbc 
			INNER JOIN Inserted i ON lcpbc.Currency = i.Currency AND lcpbc.Value <> i.[Value]
            WHERE	[Type] = 'available'


        IF NOT EXISTS (
            SELECT	1
            FROM     dbo.LiveCoinPaymentBalanceChanges lcpbc -- alias added; the join below references it
	                    INNER JOIN Inserted i ON lcpbc.Currency = i.Currency
        )
            INSERT    dbo.LiveCoinPaymentBalanceChanges (Currency, [Value], Change, [TimeStamp])
            SELECT   Currency, [Value], 0, GETDATE()
            FROM       Inserted

END;
GO


CAMPzxzxDeathzxzx (Author) commented:
I'm getting an error whose cause I can't find - "Error converting data type nvarchar to numeric"


IF  EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[LiveCoinPaymentBalances]') AND type in (N'U'))
DROP TABLE [dbo].[LiveCoinPaymentBalances]
GO
IF NOT EXISTS (select * from INFORMATION_SCHEMA.tables where TABLE_NAME = 'LiveCoinPaymentBalances')
BEGIN
CREATE TABLE [dbo].[LiveCoinPaymentBalances](
	[LiveCoinPaymentBalancesID] [int] IDENTITY(1,1) NOT NULL,	
	[Type]  NVARCHAR(128) NOT NULL,
	[Currency]  NVARCHAR(128) NOT NULL,
	[Value]  DECIMAL(36,18) NOT NULL,		
	[UTCTimeStamp] DATETIME NOT NULL,
	[TimeStamp] DATETIME NOT NULL,					 
	CONSTRAINT [PK_LiveCoinPaymentBalances] PRIMARY KEY CLUSTERED 
	(
		[LiveCoinPaymentBalancesID] ASC
	))	
END
GO


IF  EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[LiveCoinPaymentBalanceChanges]') AND type in (N'U'))
DROP TABLE [dbo].[LiveCoinPaymentBalanceChanges]
GO
IF NOT EXISTS (select * from INFORMATION_SCHEMA.tables where TABLE_NAME = 'LiveCoinPaymentBalanceChanges')
BEGIN
CREATE TABLE [dbo].[LiveCoinPaymentBalanceChanges](
	[LiveCoinPaymentBalanceChangesID] [int] IDENTITY(1,1) NOT NULL,	
	[Currency]  NVARCHAR(128) NOT NULL,
	[Value]  DECIMAL(36,18) NOT NULL,
	[Change]  DECIMAL(36,18) NOT NULL,		
	[TimeStamp] DATETIME NOT NULL,
	[Matched] BIT NOT NULL,					 
	CONSTRAINT [PK_LiveCoinPaymentBalanceChanges] PRIMARY KEY CLUSTERED 
	(
		[LiveCoinPaymentBalanceChangesID] ASC
	))	
END
GO


DROP TRIGGER [dbo].[LiveCoinPaymentBalanceChangesTrigger]
GO

CREATE TRIGGER [dbo].[LiveCoinPaymentBalanceChangesTrigger] ON [dbo].[LiveCoinPaymentBalances]
AFTER INSERT, UPDATE
AS	

BEGIN
        IF EXISTS (
            SELECT 1
            FROM	dbo.LiveCoinPaymentBalanceChanges lcpbc
			INNER JOIN Inserted i ON lcpbc.Currency = i.Currency AND lcpbc.Value <> i.[Value]
		WHERE	[Type] = 'available'
			)
            UPDATE	lcpbc
            SET	[Value] = i.[Value],
			Change = (i.[Value] - lcpbc.Value),
			[TimeStamp] = GETDATE()
            FROM  dbo.LiveCoinPaymentBalanceChanges lcpbc 
			INNER JOIN Inserted i ON lcpbc.Currency = i.Currency AND lcpbc.Value <> i.[Value]
            WHERE	[Type] = 'available'


        IF NOT EXISTS (
            SELECT	1
            FROM     dbo.LiveCoinPaymentBalanceChanges l
	                    INNER JOIN Inserted i ON l.Currency = i.Currency
        )
            INSERT    dbo.LiveCoinPaymentBalanceChanges (Currency, [Value], Change, [TimeStamp], [Matched])
            SELECT   [Currency], [Value], 0, GETDATE(), 0
            FROM       Inserted

END;
GO


Anthony Perkins commented:
It worked fine for me, after I corrected your script and did the following dumb insert:
insert LiveCoinPaymentBalances ([Type], Currency, [Value], UTCTimeStamp, [TimeStamp])
values ('abc', 'xyz', 123.456, getdate(), getdate())

select *
from LiveCoinPaymentBalances

select *
from LiveCoinPaymentBalanceChanges


Mark Wills, Topic Advisor, commented:
Be careful with EXISTS - it will bail from any other checking as soon as it finds a match.

So if your transaction is 'Blah1' and 'Blah2', then if 'Blah1' exists, 'Blah2' doesn't get checked.

Not so much of a problem with only single transactions, but then it might be better to incorporate the check into the where clause (i.e. WHERE NOT EXISTS) rather than checking first and then repeating the join - or, having established the existence (or not), just do the action.

Again, the approach is fine for just a single entry in INSERTED or DELETED, not so good for multiple entries within a single statement.

Does that make sense?

Have a read of the discussion in the thread : https://www.experts-exchange.com/questions/28394418/Getting-data-for-edited-values-in-update-trigger-in-sql-server.html#a39946681
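Mark's "incorporate into the where clause" suggestion, applied to the insert branch of the trigger above, might look like this (an untested sketch; it handles multi-row inserts because the NOT EXISTS is evaluated per row of INSERTED rather than once for the whole statement):

```sql
-- Insert a changes row for each inserted currency that doesn't have one yet,
-- instead of a single table-level IF NOT EXISTS guarding a blanket insert.
INSERT dbo.LiveCoinPaymentBalanceChanges (Currency, [Value], Change, [TimeStamp], Matched)
SELECT i.Currency, i.[Value], 0, GETDATE(), 0
FROM   Inserted i
WHERE  NOT EXISTS (
           SELECT 1
           FROM dbo.LiveCoinPaymentBalanceChanges c
           WHERE c.Currency = i.Currency
       );
```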
Anthony Perkins commented:
The comment is not a solution.
Mark Wills, Topic Advisor, commented:
And it doesn't address the title of the question - a transaction file design :)

just saying....
CAMPzxzxDeathzxzx (Author) commented:
I closed the question because nobody has entered any better solution than an insert/update trigger. The temporal capabilities of SQL 2016 sound like a good way to go, but I can't upgrade my SQL version right now.

Mark Wills made an excellent point: "Because Quantity is being placed for a nominated symbol, there must also be an associated value, because fiat is physical currency. So the transaction is for quantity and value - i.e. 'I want 10 at $100' - so we can't just match on quantity because the value is unknown, is it not? And if not, then we will need to capture that value anyway as part of the transaction (history)."

This has made me add another entire Azure function to capture transaction records and implement a checks and balances structure for accounting.  I was going to add transactions anyway but Mark made me realize that I needed it sooner than later.
Mark Wills, Topic Advisor, commented:
Hi CAMPzxzxDeathzxzx,

Seems your question is still open....

Are you waiting on us for anything more to add to the discussion ?