Martin Griffiths
 asked on

Problem deleting duplicates using a CTE when there are table joins

I've used CTEs to remove duplicates successfully before, but this time I wanted to delete them from one table after joining it to other tables, and it comes back with the error:

View or function 'dupnotesCTE' is not updatable because the modification affects multiple base tables.

See the code below for my SQL statement.

Not sure where I've gone wrong here...can someone help please?
;WITH dupnotesCTE ([FileNo], [CommentDate], [CommentTime], [CommentUser],
                   [Cat], [Pri], [Line], [Note], Ranking)
AS
(
    SELECT p.[FileNo], p.[CommentDate], p.[CommentTime], p.[CommentUser],
           p.[Cat], p.[Pri], p.[Line], p.[Note],
           Ranking = ROW_NUMBER() OVER(
               PARTITION BY p.[FileNo], p.[CommentDate], p.[CommentTime], p.[CommentUser],
                            p.[Cat], p.[Pri], p.[Line], p.[Note]
               ORDER BY p.[FileNo], p.[CommentDate], p.[CommentTime], p.[CommentUser],
                        p.[Cat], p.[Pri], p.[Line], p.[Note])
    FROM filenotes p
    JOIN detailtable d ON LTRIM(RTRIM(p.FileNo)) = LTRIM(RTRIM(d.FileNo))
    JOIN country c ON d.country_no = c.country_no
    WHERE c.continent IN ('Asia', 'Europe')
)

DELETE FROM dupnotesCTE
WHERE Ranking > 1

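For reference, a single-table version of this pattern (only filenotes in the CTE, no joins) deletes without complaint; a rough sketch:

;WITH dupnotesCTE AS
(
    SELECT [FileNo], [CommentDate], [CommentTime], [CommentUser],
           [Cat], [Pri], [Line], [Note],
           Ranking = ROW_NUMBER() OVER(
               PARTITION BY [FileNo], [CommentDate], [CommentTime], [CommentUser],
                            [Cat], [Pri], [Line], [Note]
               ORDER BY [FileNo])
    FROM filenotes
)
-- The CTE references a single base table, so SQL Server allows the DELETE.
DELETE FROM dupnotesCTE
WHERE Ranking > 1

It's the joins to detailtable and country that make dupnotesCTE reference multiple base tables and so block the DELETE.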

Microsoft SQL Server 2005

Erick37



...

delete p
FROM filenotes p
JOIN dupnotesCTE c ON c.FileNo = p.fileNo --??
where c.ranking > 1
Martin Griffiths

ASKER
This won't work though, will it? Say you have a pair of duplicates, A and B. A is in dupnotesCTE with rank 1 and B with rank 2. When you join filenotes to dupnotesCTE on FileNo you get four combinations:

filenotes.A and dupnotes.A (rank 1)
filenotes.A and dupnotes.B (rank 2)
filenotes.B and dupnotes.A (rank 1)
filenotes.B and dupnotes.B (rank 2)

Your "where c.ranking > 1" matches the 2nd and 4th combinations, but between them those cover both filenotes.A and filenotes.B, so the join on FileNo alone can't tell which physical row should be deleted.

May be mistaken, but that's how it looks against my data.
Erick37

Does the filenotes table have an identity column or primary key that you could use to join to the CTE?

Martin Griffiths

ASKER
No it doesn't sorry.
Erick37

So for the rows with a duplicate FileNo, all of the other columns are duplicated as well?
Martin Griffiths

ASKER
Yes, all the fields for each duplicated FileNo are identical, i.e. a true duplicated record with the same data in each column:

[FileNo],[CommentDate],[CommentTime],[CommentUser],[Cat],[Pri],[Line],[Note]
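A quick way to double-check that (a sketch; it assumes [Note] is a type that can appear in a GROUP BY, i.e. not text/ntext):

-- List groups of exact duplicates across all eight columns.
SELECT [FileNo], [CommentDate], [CommentTime], [CommentUser],
       [Cat], [Pri], [Line], [Note], COUNT(*) AS copies
FROM filenotes
GROUP BY [FileNo], [CommentDate], [CommentTime], [CommentUser],
         [Cat], [Pri], [Line], [Note]
HAVING COUNT(*) > 1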
Erick37

Having an identity column would greatly simplify identifying the duplicates.

One possibility, sketched in the code further down, would be to:
1. Copy the affected records over to a temp table that has an identity column.
2. Remove the duplicates from the temp table.
3. Delete from filenotes all the records you originally copied to the temp table.
4. Insert the cleaned records back into the filenotes table.

Here are some other alternatives:
"How to delete duplicate records or rows among identical rows in a table where no primary key exists"
http://www.kodyaz.com/articles/delete-duplicate-records-rows-in-a-table.aspx
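A sketch of that temp-table route, reusing the joins and filter from the original query. The #stage table name is illustrative, and it assumes the eight listed columns make up a filenotes row and that [Note] can be used in a GROUP BY:

-- 1. Copy the affected rows to a temp table, adding an identity column.
SELECT IDENTITY(int, 1, 1) AS id,
       p.[FileNo], p.[CommentDate], p.[CommentTime], p.[CommentUser],
       p.[Cat], p.[Pri], p.[Line], p.[Note]
INTO #stage
FROM filenotes p
JOIN detailtable d ON LTRIM(RTRIM(p.FileNo)) = LTRIM(RTRIM(d.FileNo))
JOIN country c ON d.country_no = c.country_no
WHERE c.continent IN ('Asia', 'Europe')

-- 2. Remove duplicates from the temp table, keeping the lowest id in each group.
DELETE s
FROM #stage s
WHERE s.id NOT IN (SELECT MIN(id)
                   FROM #stage
                   GROUP BY [FileNo], [CommentDate], [CommentTime], [CommentUser],
                            [Cat], [Pri], [Line], [Note])

-- 3. Delete the rows you originally copied from filenotes.
DELETE p
FROM filenotes p
JOIN detailtable d ON LTRIM(RTRIM(p.FileNo)) = LTRIM(RTRIM(d.FileNo))
JOIN country c ON d.country_no = c.country_no
WHERE c.continent IN ('Asia', 'Europe')

-- 4. Insert the cleaned rows back into filenotes.
INSERT INTO filenotes ([FileNo], [CommentDate], [CommentTime], [CommentUser],
                       [Cat], [Pri], [Line], [Note])
SELECT [FileNo], [CommentDate], [CommentTime], [CommentUser],
       [Cat], [Pri], [Line], [Note]
FROM #stage

DROP TABLE #stage

Running the four steps inside a single transaction would keep filenotes consistent if anything fails part-way through.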
Martin Griffiths

ASKER
I've added an identity column to the filenotes table now. How should the delete look? Do I do something like method 3 at the bottom of your last link?
ASKER CERTIFIED SOLUTION
Erick37

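The accepted post itself isn't reproduced here, but with the identity column now on filenotes (assumed below to be called NoteID), one way the delete could look is to rank the rows within each duplicate group and join back on the identity, so each physical row is identified unambiguously. Moving the Asia/Europe filter into an EXISTS keeps every filenotes row in the CTE exactly once. This is only a sketch, not necessarily what was posted:

;WITH dupnotesCTE AS
(
    -- NoteID is the assumed name of the new identity column on filenotes.
    SELECT p.NoteID,
           Ranking = ROW_NUMBER() OVER(
               PARTITION BY p.[FileNo], p.[CommentDate], p.[CommentTime], p.[CommentUser],
                            p.[Cat], p.[Pri], p.[Line], p.[Note]
               ORDER BY p.NoteID)
    FROM filenotes p
    WHERE EXISTS (SELECT 1
                  FROM detailtable d
                  JOIN country c ON d.country_no = c.country_no
                  WHERE LTRIM(RTRIM(d.FileNo)) = LTRIM(RTRIM(p.FileNo))
                    AND c.continent IN ('Asia', 'Europe'))
)
-- Joining back on NoteID pins down the exact row, so only the rank > 1 copies go.
DELETE p
FROM filenotes p
JOIN dupnotesCTE x ON x.NoteID = p.NoteID
WHERE x.Ranking > 1

Because the join back to filenotes is on the identity rather than FileNo, the rank 1 row in each group is left untouched, which addresses the earlier concern about the FileNo-only join.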
Martin Griffiths

ASKER
That's worked a dream, thanks Erick.