
Solved

Odd insert/update problem

Posted on 2006-05-13
Medium Priority
190 Views
Last Modified: 2012-06-21
Hello Experts:

I have a table that I insert/update to every hour. I pull my information from a CSV file that is uploaded to the SQL box. For months the CSV has had 8,000+ rows. Recently we changed the data pull to monthly, so the CSV now has around 800 rows; just a smaller pull. Everything should be OK on SQL's end, because my DTS is an insert/update based on one column.

For some reason, though, I lose everything in the table that ISN'T in the CSV. I've gone over my DTS so many times and I can't see any errors at all. What could cause the loss? I have a table named finance_load; this is the table the CSV is loaded to. Then upon the insert/update I do a lot of converting from varchar to money and datetime and so on.

Here is the insert/update I use:

insert into finance(dealnum, type, status, dealdte, custnum, lstname, deskmgr, dealtype, s1num, s2num, finum, bankname, amtfin, term, entrate, docfee, fee1name, fee2, fee2name, fee3, fee3name, fee4, fee4name, finres, ins2comm, ins3comm, weowestktot, weowestkbck, scom1, scom2, scom3, tr1stkno, tr2stkno, payoff1, payoff2, acvtrade1, acvtrade2, trade1, trade2, gapreserve, servcontres, buyrate, totcashdown, rebate, lifereserve, ahreserve, grossprofit, cashdown, s1totcom, s2totcom, weowesalefrt, weowesalebck, weowecombck, weowecomfrt, vehgross, grosspybl, [new/used], stknum, surpluscash, CATEGORY)
select finance_load.dealnum,
finance_load.type,
finance_load.status,
finance_load.dealdte,
finance_load.custnum,
finance_load.lstname,
finance_load.deskmgr,
finance_load.dealtype,
finance_load.s1num,
finance_load.s2num,
finance_load.finum,
finance_load.bankname,
convert(money,finance_load.amtfin),
finance_load.term,
finance_load.entrate,
convert(money,finance_load.docfee),
finance_load.fee1name,
convert(money,finance_load.fee2),
finance_load.fee2name,
convert(money,finance_load.fee3),
finance_load.fee3name,
convert(money,finance_load.fee4),
finance_load.fee4name,
convert(money,finance_load.finres),
convert(money,finance_load.ins2comm),
convert(money,finance_load.ins3comm),
finance_load.weowestktot,
finance_load.weowestkbck,
convert(money,finance_load.scom1),
convert(money,finance_load.scom2),
convert(money,finance_load.scom3),
finance_load.tr1stkno,
finance_load.tr2stkno,
finance_load.payoff1,
finance_load.payoff2,
finance_load.acvtrade1,
finance_load.acvtrade2,
finance_load.trade1,
finance_load.trade2,
convert(money,finance_load.gapreserve),
convert(money,finance_load.servcontres),
finance_load.buyrate,
convert(money,finance_load.totcashdown),
convert(money,finance_load.rebate),
convert(money,finance_load.lifereserve),
convert(money,finance_load.ahreserve),
convert(money,finance_load.grossprofit),
convert(money,finance_load.cashdown),
convert(money,finance_load.s1totcom),
convert(money,finance_load.s2totcom),
convert(money,finance_load.weowesalefrt),
convert(money,finance_load.weowesalebck),
convert(money,finance_load.weowecombck),
convert(money,finance_load.weowecomfrt),
convert(money,finance_load.vehgross),
convert(money,finance_load.grosspybl),
finance_load.[new/used],
finance_load.stknum,
convert(money,finance_load.surpluscash),
FINANCE_LOAD.CATEGORY
from finance_load left outer join finance on finance_load.dealnum = finance.dealnum
where finance.dealnum is null

update finance
set dealnum = finance_load.dealnum,
type = finance_load.type,
status = finance_load.status,
dealdte = finance_load.dealdte,
custnum = finance_load.custnum,
lstname = finance_load.lstname,
deskmgr = finance_load.deskmgr,
dealtype = finance_load.dealtype,
s1num = finance_load.s1num,
s2num = finance_load.s2num,
finum = finance_load.finum,
bankname = finance_load.bankname,
amtfin = convert(money,finance_load.amtfin),
term = finance_load.term,
entrate = finance_load.entrate,
docfee = convert(money,finance_load.docfee),
fee1name = finance_load.fee1name,
fee2 = convert(money,finance_load.fee2),
fee2name = finance_load.fee2name,
fee3 = convert(money,finance_load.fee3),
fee3name = finance_load.fee3name,
fee4 = convert(money,finance_load.fee4),
fee4name = finance_load.fee4name,
finres = convert(money,finance_load.finres),
ins2comm = convert(money,finance_load.ins2comm),
ins3comm = convert(money,finance_load.ins3comm),
weowestktot = finance_load.weowestktot,
weowestkbck = finance_load.weowestkbck,
scom1 = convert(money,finance_load.scom1),
scom2 = convert(money,finance_load.scom2),
scom3 = convert(money,finance_load.scom3),
tr1stkno = finance_load.tr1stkno,
tr2stkno = finance_load.tr2stkno,
payoff1 = finance_load.payoff1,
payoff2 = finance_load.payoff2,
acvtrade1 = finance_load.acvtrade1,
acvtrade2 = finance_load.acvtrade2,
trade1 = finance_load.trade1,
trade2 = finance_load.trade2,
gapreserve = convert(money,finance_load.gapreserve),
servcontres = convert(money,finance_load.servcontres),
buyrate = finance_load.buyrate,
totcashdown = convert(money,finance_load.totcashdown),
rebate = convert(money,finance_load.rebate),
lifereserve = convert(money,finance_load.lifereserve),
ahreserve = convert(money,finance_load.ahreserve),
grossprofit = convert(money,finance_load.grossprofit),
cashdown = convert(money,finance_load.cashdown),
s1totcom = convert(money,finance_load.s1totcom),
s2totcom = convert(money,finance_load.s2totcom),
weowesalefrt = convert(money,finance_load.weowesalefrt),
weowesalebck = convert(money,finance_load.weowesalebck),
weowecombck = convert(money,finance_load.weowecombck),
weowecomfrt = convert(money,finance_load.weowecomfrt),
vehgross = convert(money,finance_load.vehgross),
grosspybl = convert(money,finance_load.grosspybl),
[new/used] = finance_load.[new/used],
stknum = finance_load.stknum,
surpluscash = convert(money,finance_load.surpluscash),
CATEGORY = FINANCE_LOAD.CATEGORY
from finance_load left outer join finance on finance_load.dealnum = finance.dealnum
where finance_load.dealnum = finance.dealnum

Sorry about the length of these... lots of columns! Anyway, I perform a few other steps on this table, but the one I think is causing problems is this one:

delete finance_load
from finance_load where dealnum is null

I do this because I kept running into rows that were just NULL, so I was keeping the table clean by deleting any row with a NULL dealnum. Then once our smaller CSV came in, all the older data vanished and my final table simply matches the load table. Why? Help!

Question by:jay-are
37 Comments
 
LVL 2

Expert Comment

by:Omnibuzz
ID: 16676490
Try deleting the rows with a NULL dealnum from finance_load first, and then do the insert/update.
It's just a guess.
 
LVL 27

Expert Comment

by:MikeToole
ID: 16677019
I can't see anything that would cause the problem; your delete should only hit the rows with a NULL dealnum.
Try another approach: put a NOT NULL constraint on the dealnum column in the finance table, so there's no need for the delete step. You might then find out where all the NULL rows have been coming from, which could be a pointer to the cause of the problem.
Hope this helps.
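That constraint could look something like this (a sketch; the varchar(20) length is an assumption, so match it to the column's actual type):

```sql
-- Hypothetical: once dealnum rejects NULLs, any step that produces
-- NULL rows fails loudly instead of silently polluting the table.
ALTER TABLE finance ALTER COLUMN dealnum varchar(20) NOT NULL
```

With that in place, the delete-NULLs step becomes unnecessary, and the first load that tries to insert a NULL dealnum points straight at the culprit.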
 

Author Comment

by:jay-are
ID: 16677346
Could indexing be causing this problem? I just recently re-indexed this entire DB...

I removed the delete-NULL step and still reproduced the problem. I have some additional steps that convert blank fields to NULL, but none of them touch dealnum. I just don't see how it deletes rows that don't exist in the finance_load table...
 
LVL 10

Expert Comment

by:StephenCairns
ID: 16677362
What is your primary key? Dealnum? If it's not, what is?
You are inserting records with a NULL deal number into the finance table, then you are updating the table matching on dealnum, so the last NULL record in the load will populate all the records in finance that have a NULL dealnum.
 
LVL 10

Expert Comment

by:StephenCairns
ID: 16677366
Are you sure it's deleting them, and not just overwriting them repeatedly with other values?
 

Author Comment

by:jay-are
ID: 16677373
Well, I took a look at the CSV that has around 800 rows. There isn't a single NULL dealnum in it. I've also removed that step for now.

I haven't set a primary key for the finance table as far as I know. It is indexed: status and dealnum are clustered, and stknum is also indexed.

The table goes from 8,800 rows (2 years of data) to 800 rows (the past 2 months of data). Yes, it is actually removing all rows that aren't in the finance_load table.
 
LVL 10

Expert Comment

by:StephenCairns
ID: 16677410
Are you sure there are no triggers or other SQL statements being run as well? And what is the key on the table?
 

Author Comment

by:jay-are
ID: 16677415
Tell me where to look. I don't know of any PKs for that table.
 
LVL 10

Expert Comment

by:StephenCairns
ID: 16677426
Look at the design of the table. Do you have access to the design?
 

Author Comment

by:jay-are
ID: 16677433
Yes, and I don't see dealnum or any other column defined as the PK.
 
LVL 10

Expert Comment

by:StephenCairns
ID: 16677442
Is dealnum a candidate for a primary key?
Did you find any triggers?
 

Author Comment

by:jay-are
ID: 16677446
Well, technically speaking, there shouldn't ever be a duplicate dealnum.

I've never messed with triggers before, but when I go to "manage triggers" on the finance table I see it has <new> and this text:
CREATE TRIGGER [TRIGGER NAME] ON [dbo].[FINANCE]
FOR INSERT, UPDATE, DELETE
AS
 
LVL 10

Expert Comment

by:StephenCairns
ID: 16677449
If all you see is <new> then you don't have any; that's good.
Are you sure it's this import process that's removing the records? Could it be another process?
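You can also check for triggers programmatically (a sketch using the SQL Server 2000-era system tables, an assumption based on the DTS/Query Analyzer setup described here):

```sql
-- List any triggers defined on the finance table.
SELECT name
FROM sysobjects
WHERE type = 'TR'
  AND parent_obj = OBJECT_ID('finance')
```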
 

Author Comment

by:jay-are
ID: 16677469
Yeah, it has to be. I'm sitting here switching between the current CSV and the one that contains all the history:

fimast.csv (792 rows)
fimastcur.csv (8,816 rows)

I'll import the big CSV and the table contains 8,816 rows. Then I'll pull in the current CSV and the table drops to 792 rows.
 

Author Comment

by:jay-are
ID: 16677472
The only other steps I do are:

update finance_load
set dealdte = convert(datetime,dealdte)

update finance_load
set s1num = null
where s1num = ''

update finance_load
set s2num = null
where s2num = ''

I do a lot of updating for almost every column except dealnum; I change blank fields to NULL like you see above. The final step of the DTS involves pulling in a different CSV that updates about 6 columns in the finance table.
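As an aside, those blank-to-NULL steps could be collapsed with NULLIF, which returns NULL when its two arguments are equal (a sketch covering just the two columns shown above):

```sql
-- NULLIF(x, '') yields NULL when x is an empty string, otherwise x.
UPDATE finance_load
SET s1num = NULLIF(s1num, ''),
    s2num = NULLIF(s2num, '')
```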
 
LVL 10

Expert Comment

by:StephenCairns
ID: 16677498
OK, just an idea:
Create a new table with just the dealnum column from your finance table, so you have your 8,816 records.
Do your insert of just the dealnum field from your load.
Check the number of records.
Do your update.
Check the number of records.
If this works, slowly (a few at a time) add your real fields back in.

Because your new table is not called the same as your finance table, any triggers on the original won't fire.
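The first step of that test can be sketched as follows (SELECT INTO creates the scratch table; financetest is the table name used later in the thread):

```sql
-- Copy only the key column into a scratch table.  Because the name
-- differs from finance, no trigger on the original table can fire.
SELECT dealnum
INTO financetest
FROM finance
```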
 

Author Comment

by:jay-are
ID: 16677522
OK, I created 2 new tables:
financetest
financetest_load

The CSVs are first imported to the load table and then insert/updated to the financetest table based on dealnum, just like the other package. When switching between CSVs (history and current) I retain all 8,000+ rows.
 
LVL 10

Expert Comment

by:StephenCairns
ID: 16677523
OK, so build up your new table and process until it matches the structure of the original, and test again.
 
LVL 10

Expert Comment

by:StephenCairns
ID: 16677528
Where are you running these queries from?
 

Author Comment

by:jay-are
ID: 16677530
Query Analyzer.

I'll start adding columns, but that's a lot of columns...
 
LVL 10

Expert Comment

by:StephenCairns
ID: 16677551
Add them, say, 10 at a time.
 

Author Comment

by:jay-are
ID: 16677576
OK, it must have something to do with stknum.

I added that column to my insert/update. I pulled from the current 2-month CSV and I lost every stknum above row 792. I still have 8,816 rows of dealnums, but I lost all the stknums...

I opened the actual CSV in Excel and it does indeed have 8,816 stknums. Well, some are missing, but I still have stknums down to the last row.
 

Author Comment

by:jay-are
ID: 16677580
Wait... never mind. I reran the DTS again with both CSVs and I have all the stknums this time...
 
LVL 10

Expert Comment

by:StephenCairns
ID: 16677585
What is the datatype in your destination, and what is the datatype (and size) in your import?
 
LVL 10

Expert Comment

by:StephenCairns
ID: 16677586
So all is good?
 

Author Comment

by:jay-are
ID: 16677592
Import = varchar
Destination = varchar

The only columns that differ in the finance table are the money/date fields.
 

Author Comment

by:jay-are
ID: 16677594
Well, all is good in the financetest table.

Not in my live finance table, though.
 
LVL 10

Expert Comment

by:StephenCairns
ID: 16677599
So the old table name fails and the new table name succeeds? Change the table names over and try again.
 

Author Comment

by:jay-are
ID: 16677601
Well, these test tables I built seem to be working with the few columns I've added. They retain the history.
 
LVL 10

Expert Comment

by:StephenCairns
ID: 16677605
Keep adding the fields until the new structure and code match the old.
 

Author Comment

by:jay-are
ID: 16677607
OK, so I started removing steps from my DTS to see if one in particular is causing the loss of rows.

Here it is:

Delete finance
from finance
where finance.status <> 'F' or finance.status <> 'I'

Why is this causing the problem?
 

Author Comment

by:jay-are
ID: 16677614
That's the very first step of this DTS package.

I want to remove all deals that are NOT in F or I status. Is this not the proper way to do that?
 

Author Comment

by:jay-are
ID: 16677792
Any ideas?
 

Author Comment

by:jay-are
ID: 16677936
Is this because I'm trying to delete dealnum and it's a clustered index?
 
LVL 27

Accepted Solution

by:
MikeToole earned 2000 total points
ID: 16678647
This is the problem!
One of those two conditions always has to be true: if the status is 'F' it isn't 'I', and vice versa, so when you OR them together the result is always true and every row is deleted.
The correct logic is:
Where finance.status <> 'F' AND finance.status <> 'I'

Another way to put it that is easier to read is:
Where finance.status not in ('F', 'I')
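Walking the OR version through a few status values shows why every row qualifies (illustrative values):

```sql
-- status = 'F':  ('F' <> 'F') OR ('F' <> 'I')  ->  false OR true  ->  true
-- status = 'I':  ('I' <> 'F') OR ('I' <> 'I')  ->  true OR false  ->  true
-- status = 'X':  ('X' <> 'F') OR ('X' <> 'I')  ->  true OR true   ->  true
-- No status can make both comparisons false, so the OR deletes everything.
-- The corrected delete keeps F and I rows:
DELETE finance
FROM finance
WHERE finance.status NOT IN ('F', 'I')
```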
 

Author Comment

by:jay-are
ID: 16678946
Correct. I really thought I had to use OR there. I knew it had to be something simple. Thank you for your help!

StephenCairns: thank you for sticking in there and helping!
 
LVL 10

Expert Comment

by:StephenCairns
ID: 16679706
Not a problem.

If you had mentioned the delete statement earlier, we would have got there much faster.
Cheers
