SQL query help: removing duplicate records

Hello there,

I have a query that returns columns from several tables. Some of the columns are not duplicated, but I want distinct records based on one of the columns (dbo.supplierOrder.ID). How can I do that? My query is below; I have also attached a screenshot of the result.

SELECT
        dbo.supplierOrder.ID  as SoID,
        dbo.supplierOrder.sectionID,
        dbo.supplierOrder.branchID,
        dbo.supplierOrder.supplierID,
        dbo.Supplier.engName as supplierengName,
        dbo.Supplier.perName as supplierperName,
        dbo.supplierOrder.supplierOrderNo,
        dbo.supplierOrder.supplierOrderCreated,
        dbo.supplierOrder.supplierOrderConfirmStatus,
        dbo.supplierOrder.orderCompleteStatus,
        dbo.supplierOrder.confirmDate,
        dbo.supplierOrderDetails.id  as SodID,
        dbo.supplierOrderDetails.productID,
        dbo.supplierOrderDetails.orderQty,
        dbo.supplierOrderDetails.isApprove,
        dbo.supplierOrderDetails.isComplete  
    FROM
        dbo.supplierOrderDetails  
    INNER JOIN
        dbo.supplierOrder  
            ON      (
                dbo.supplierOrderDetails.supplierOrderID = dbo.supplierOrder.ID
            )  
    INNER JOIN
        dbo.Supplier  
            ON      (
                dbo.supplierOrder.supplierID = dbo.Supplier.id
            )  
    WHERE
        dbo.supplierOrder.supplierOrderConfirmStatus = 1  
        AND dbo.supplierOrder.orderCompleteStatus = 0  
        AND dbo.supplierOrderDetails.isComplete = 1 ;   


zolf asked:

PortletPaul (freelancer) commented:
OK, but consider: DISTINCT works across the whole row. If any part of a row differs from the other rows, that row counts as distinct.

If you want something other than that, then SELECT DISTINCT is not the technique to help. At this point I don't know what you want as the result.

Perhaps you could provide the expected result?
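For instance, here is a minimal self-contained sketch (hypothetical values, SQL Server syntax) showing that DISTINCT compares whole rows:

```sql
-- Two rows share SoID = 1 but differ in SodID, so both survive
-- SELECT DISTINCT; only the exact duplicate row (1, 10) is removed.
SELECT DISTINCT SoID, SodID
FROM (VALUES (1, 10),
             (1, 11),
             (1, 10)) AS t(SoID, SodID);
-- result: (1, 10) and (1, 11) -- still two rows with SoID = 1
```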
 
PortletPaul (freelancer) commented:
You forgot the intended image, but I'm not sure that will be enough.

Chances are you need to use a GROUP BY clause, or you need to use ROW_NUMBER() OVER (), so that you can select, for example, the most recent record.

See "Select Distinct is returning duplicates ..."
and "DISTINCT and GROUP BY... and why does it not work for my query?"
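As a sketch of the ROW_NUMBER() approach against the query in the question (column names come from the question; which detail row to keep per order is an assumption here):

```sql
-- Wrap the original query so ROW_NUMBER() can number the detail
-- rows within each supplier order; keep only the first per order.
WITH ranked AS (
    SELECT
        so.ID  AS SoID,
        sod.id AS SodID,
        sod.productID,
        sod.orderQty,
        ROW_NUMBER() OVER (
            PARTITION BY so.ID
            ORDER BY sod.id      -- arbitrary but deterministic pick
        ) AS rn
    FROM dbo.supplierOrderDetails AS sod
    INNER JOIN dbo.supplierOrder  AS so
        ON sod.supplierOrderID = so.ID
    WHERE so.supplierOrderConfirmStatus = 1
      AND so.orderCompleteStatus = 0
      AND sod.isComplete = 1
)
SELECT SoID, SodID, productID, orderQty
FROM ranked
WHERE rn = 1;   -- exactly one row per SoID
```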
 
zolf (author) commented:
Sorry about that. I have attached the screenshot now.

 
zolf (author) commented:
PortletPaul,

Thanks a lot for pointing that out. I understand my mistake now and need to redesign my code. Cheers!
 
skullnobrains commented:
group by SoID ?
 
zolf (author) commented:
@skullnobrains

Thanks for your comment, but that won't work. I get this error: "---- is invalid in the select list because it is not contained in either an aggregate function or the GROUP BY clause."
 
skullnobrains commented:
You need to apply an aggregate function to each of the fields in the select list (except SoID).

For example:

SELECT
        dbo.supplierOrder.ID  as SoID,
        max(dbo.supplierOrder.sectionID),
...
group by dbo.supplierOrder.ID  -- note: SQL Server does not accept the alias SoID here

Which aggregate function fits your needs depends on the data;
min, max, sum, count, and avg should be available, and perhaps a few others.
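Put together against the original query, the GROUP BY form might look like this (a sketch only; MAX() is an arbitrary choice of aggregate, and the column subset is illustrative):

```sql
SELECT
    so.ID                   AS SoID,
    MAX(so.supplierOrderNo) AS supplierOrderNo,
    MAX(sod.id)             AS SodID,
    MAX(sod.orderQty)       AS orderQty
FROM dbo.supplierOrderDetails AS sod
INNER JOIN dbo.supplierOrder AS so
    ON sod.supplierOrderID = so.ID
WHERE so.supplierOrderConfirmStatus = 1
  AND so.orderCompleteStatus = 0
  AND sod.isComplete = 1
GROUP BY so.ID;   -- grouping by the real column, not the alias
```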

--

If you want a complete specific row rather than aggregates, you can use something like:

select *
from (
    select
        ROW_NUMBER() over (partition by dbo.supplierOrder.ID
                           order by supplierOrderCreated) as num,
    ...
) as t
where num = 1

i.e. wrap your existing query (with no GROUP BY clause) in a derived table, since a window function cannot be referenced directly in a WHERE clause.
The above would select the row containing the oldest order for each SoID.
Question has a verified solution.

Are you are experiencing a similar issue? Get a personalized answer when you ask a related question.

Have a better answer? Share it in a comment.

All Courses

From novice to tech pro — start learning today.