Solved

postgresql data push speed improvement

Posted on 2012-03-20
424 Views
Last Modified: 2012-04-11
I have a large data-push query that I've inherited. It runs a couple of things every minute. It's highly probable that I don't fully understand everything here, so I hope you can take a look and let me know.

The first file creates temporary tables so as not to lock up the main order-entry server. Once those tables are created, a comparison is performed to determine whether any changes have been made to past orders, i.e. which records should be updated and which should be added.

This happens across two different machines, which is why I believe the script executes so slowly. I have read that joins between databases on different machines are inherently inefficient; I can understand why, and that is basically what I am trying to avoid.

At the very end, another query is called that gets some shipping / tracking information, but I don't think that part is too important.

My main idea is that it would be simpler to push all this data over to the second server and do the join on that machine. The data is all just text, and there isn't much of it, so it must be the comparison that slows things down. My plan: build the temporary tables, push them over to the other server into temporary tables there, join them with the tables already on that server, and then run the check for what has been updated and what has been added, so that only those rows need to be written to the final (non-temporary) tables.

Does this seem like a good plan, or have I misinterpreted the queries somehow? Do you think this will measurably speed up my (very) slow data push? The code is below; if logs of how long the push takes to complete would be helpful, I'd be glad to post some.
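For what it's worth, the staged approach described above might look roughly like this on the receiving server. This is only a sketch: the staging table `#stage_SOTransHeader` and the reduced column list are illustrative assumptions, not the actual schema.

```sql
-- On the destination server: load the pushed rows into a local staging
-- table first, so every subsequent join is server-local.
CREATE TABLE #stage_SOTransHeader (
    TransID     int         NOT NULL PRIMARY KEY,
    CustPONum   varchar(50) NULL,
    ts          int         NOT NULL
);

-- (rows arrive here via bcp / SSIS / INSERT ... SELECT from the source)

-- Update rows whose timestamp has moved forward...
UPDATE t SET
    CustPONum = s.CustPONum,
    ts        = s.ts
FROM tblSOTransHeaderEXPORT t
INNER JOIN #stage_SOTransHeader s ON s.TransID = t.TransID
WHERE s.ts > t.ts;

-- ...then insert rows that do not exist yet.
INSERT INTO tblSOTransHeaderEXPORT (TransID, CustPONum, ts)
SELECT s.TransID, s.CustPONum, s.ts
FROM #stage_SOTransHeader s
LEFT JOIN tblSOTransHeaderEXPORT t ON t.TransID = s.TransID
WHERE t.TransID IS NULL;
```

Both statements join local tables only, so the cross-server traffic is limited to the one bulk copy into the staging table.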

Thanks a million


1st query to execute:
IncrementalDataExport
USE [ISCExport]
GO
/****** Object:  StoredProcedure [dbo].[a_IncrementalDataExport]    Script Date: 01/16/2012 15:30:41 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO

-- =============================================
-- Author:		<Author,,Name>
-- Create date: <Create Date,,>
-- Description:	<Description,,>
-- =============================================
ALTER PROCEDURE [dbo].[a_IncrementalDataExport]
	-- Add the parameters for the stored procedure here

AS
BEGIN
	-- SET NOCOUNT ON added to prevent extra result sets from
	-- interfering with SELECT statements.
	SET NOCOUNT ON;
	
--THIS ROUTINE EXPORTS DATA FROM ISC DATABASE TO ISCExport database for processing

--*************************************
--insert new customers
--*************************************
--build local table
insert into customers
(customer,
name)
select CustID,CustName from isc..tblarcust c
LEFT JOIN customers cu ON cu.customer = c.CustId
WHERE cu.customer is null

--*************************************
--ADD NEW SALES ORDERS
--*************************************
declare @Message varchar(1500)
declare @Count int
declare @Time varchar(50)

Set @Message = 'New Sales Orders: '
select @Count=count(*)
FROM isc..tblSOTransHeader H
LEFT JOIN tblSOTransHeaderExport X on X.TransId = H.TransId
WHERE X.transid is null
Set @Message = @Message + cast(@Count as varchar(10))
select @Time = cast(getdate() as varchar(20))
Set @Time = 'EXPORT STARTED AT ' + @Time
execute spWriteStringToFile 'E:\Flow ISC Backup\ISCExportLog.txt','*****************************************','APPEND'
execute spWriteStringToFile 'E:\Flow ISC Backup\ISCExportLog.txt',@Time,'APPEND'
execute spWriteStringToFile 'E:\Flow ISC Backup\ISCExportLog.txt',@Message,'APPEND'

--add new records to temp table
insert into tblSOTransHeaderEXPORT
(TransID
,CustPONum
,shipvia
,transdate
,reqshipdate
,actshipdate
,ts
,SalesTax
,Freight
,Misc
,TaxableSales
,NonTaxableSales
,KSI_JobName
,KSI_OrderDescription
,ShipToName
,ShiptoAddr1
,ShipToAddr2
,ShipToCity
,ShipToRegion
,ShipToPostalCode
,ShipToID
,CustID
,name
,address1
,address2
,city
,region
,postalcode
,KSISoldBy)

select 
H.TransID
,H.CustPONum
,H.shipvia
,H.transdate
,H.reqshipdate
,H.actshipdate
,convert(int,H.ts)
,H.SalesTax
,H.Freight
,H.Misc
,H.TaxableSales
,H.NonTaxableSales
,H.KSI_JobName
,H.KSI_OrderDescription
,H.ShipToName
,H.ShiptoAddr1
,H.ShipToAddr2
,H.ShipToCity
,H.ShipToRegion
,H.ShipToPostalCode
,H.ShipToID
,H.CustID
,C.CustName
,C.Addr1
,C.Addr2
,C.City
,C.Region
,C.PostalCode
,h.KSI_SoldBy


FROM isc..tblSOTransHeader H
LEFT JOIN  isc..tblARCust C ON C.custid = H.custId
LEFT JOIN tblSOTransHeaderExport X on X.TransId = H.TransId
WHERE X.transid is null




--update temp table with changes
update tblSOTransHeaderEXPORT set
CustPoNum = H.CustPONum
,shipvia = H.shipvia
,transdate = H.transdate
,reqshipdate = H.reqshipdate
,actshipdate = H.actshipdate
,ts = convert(int,H.ts)
,SalesTax = H.SalesTax
,Freight = H.Freight
,Misc = H.Misc
,TaxableSales = H.TaxableSales
,NonTaxableSales = H.NonTaxableSales
,KSI_JobName = H.KSI_JobName
,KSI_OrderDescription = H.KSI_OrderDescription
,ShipToName = H.ShipToName
,ShiptoAddr1 = H.ShiptoAddr1
,ShipToAddr2 = H.ShipToAddr2
,ShipToCity = H.ShipToCity
,ShipToRegion = H.ShipToRegion
,ShipToPostalCode = H.ShipToPostalCode
,ShipToID = H.ShipToID
,CustID = H.CustID
,name = c.custname
,address1 = c.addr1
,address2 = c.addr2
,city = c.city
,region = c.region
,postalcode = c.postalcode
FROM isc..tblSOTransHeader H
LEFT JOIN  isc..tblARCust C ON C.custid = H.custId
INNER JOIN tblSOTransHeaderEXPORT E ON E.transid = H.transid
WHERE E.ts < convert(int,H.ts)

--*************************************
--ADD SALES ORDER LINES
--*************************************
--insert into temp table
insert into tblSOTransDetailEXPORT
(TransID
,EntryNum
,ItemId
,JobID
,JobCompleteYN
,Descr
,AddnlDescr
,QtyOrdSell
,UnitsSell
,ConversionFactor
,UnitPriceSell
,UnitCostSell
,ts)
select
d.TransID
,d.EntryNum
,d.ItemId
,d.JobID
,d.JobCompleteYN
,d.Descr
,d.AddnlDescr
,d.QtyOrdSell
,d.UnitsSell
,d.ConversionFactor
,d.UnitPriceSell
,d.UnitCostSell
,convert(int,d.ts)
from isc..tblSOTransDetail d
left join tblSOTransDetailEXPORT x on x.transid = d.transid and x.entrynum = d.entrynum
WHERE x.transid is null and d.itemid not like '.%'

 --update temp table with changes
 UPDATE x SET
ItemId = d.itemid
,JobID = d.jobid
,JobCompleteYN = d.jobcompleteyn
,Descr = d.descr
,AddnlDescr = d.addnldescr
,QtyOrdSell = d.qtyordsell
,UnitsSell = d.unitssell
,ConversionFactor = d.conversionfactor
,UnitPriceSell = d.unitpricesell
,UnitCostSell = d.unitcostsell
,ts = convert(int,d.ts)
FROM isc..tblSOTransDetail d
LEFT JOIN tblSOTransDetailEXPORT x on x.transid = d.transid and x.entrynum = d.entrynum
WHERE convert(int,d.ts) > x.ts

--*************************************
--ADD PRODUCTS
--*************************************
--insert into temp table
insert into tblInItemEXPORT
(itemid,
descr,
itemtype,
uombase,
ts)
select
i.itemid,
i.descr,
i.itemtype,
i.uombase,
convert(int,i.ts)
from isc..tblInItem i
left join tblInItemEXPORT x on x.itemid = i.itemid
WHERE x.itemid is null and i.itemid not like '.%'

--update changes in temp table
UPDATE x SET
descr = i.descr,
itemtype = i.itemtype,
productline = i.productline,
uombase = i.uombase
FROM isc..tblinitem i
LEFT JOIN tblinitemexport x on x.itemid = i.itemid
WHERE i.ts > convert(int,x.ts)

END




2nd query to execute:
incrementalProcessing
USE [ISCExport]
GO
/****** Object:  StoredProcedure [dbo].[a_IncrementalProcessingChiHost]    Script Date: 03/14/2012 08:24:29 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO



-- =============================================
-- Author:		<Author,,Name>
-- Create date: <Create Date,,>
-- Description:	<Description,,>
-- =============================================
ALTER PROCEDURE [dbo].[a_IncrementalProcessingChiHost] 
	-- Add the parameters for the stored procedure here

AS
BEGIN
	-- SET NOCOUNT ON added to prevent extra result sets from
	-- interfering with SELECT statements.
	SET NOCOUNT ON;

--*************************************
--insert new customers
--*************************************
--insert into web table
insert into iscchihost...customers
(active,
customer,
name)
select 'Active',CustID,custname from isc..tblarcust c
LEFT JOIN iscchihost...customers cu ON cu.customer = c.CustId
WHERE cu.customer is null

--*************************************
--insert addresses
--*************************************
--NOTE - DATABASE HAS BEEN CHANGED TO SAVE SALES ORDER SHIPTO ADDRESSES IN SALES ORDERS
--DON'T THINK IT WILL BE NECESSARY TO SAVE CUSTOMER ADDRESS DATA ANYMORE

  --*************************************
--ADD PRODUCTS
--*************************************
insert into iscchihost...products
(product,
active,
description,
type,
category_id,
uom_id1,
trackbyserial,
trackbylot,
trackbydate,
trackbybulk)

select i.itemid,
'Active',
i.descr,
i.itemtype,
c.id as categoryid,
uom.id as uomid,
0,
0,
0,
0
from isc..tblinitem i
left join iscchihost...unit_of_measures uom on uom.unit_of_measure = i.uombase
left join iscchihost...categories c on c.category = i.productline
left join iscchihost...products p on p.product = i.itemid
left join tblinitemexport x on x.itemid = i.itemid
where p.product is null

--update changes to web table
UPDATE p SET
description = i.descr,
type = i.itemtype,
category_id = c.id,
uom_id1 = uom.id,
ts = i.ts
from tblinitemexport i
left join iscchihost...unit_of_measures uom on uom.unit_of_measure = i.uombase
left join iscchihost...categories c on c.category = i.productline
left join iscchihost...products p on p.product = i.itemid
left join tblinitemexport x on x.itemid = i.itemid
WHERE x.ts > i.ts

declare @Message varchar(100)

set @Message = 'Products added/updated AT '
select @Message = @Message + convert(varchar, getdate(),9) 
---select @Message = @Message + cast(getdate() as varchar(40))
execute spWriteStringToFile 'E:\Flow ISC Backup\ISCExportLogtest.txt',@Message,'APPEND'


--*************************************
--ADD PRODUCT SITES
--*************************************
insert into iscchihost...product_sites
(product_id,
site_id,
uom_id1,
factor1,
created,
created_by,
created_with,
edited,
edited_by,
edited_with)
SELECT
p.id,
1,
p.uom_id1,
p.factor1,
GETDATE(),
'admin',
'SQL manager',
GETDATE(),
'admin',
'SQL manager'
from iscchihost...products p
left join iscchihost...product_sites ps on ps.product_id = p.id
where ps.id is null

--*************************************
--ADD NEW SALES ORDERS
--*************************************
--insert from temp table to web table
insert into iscchihost...sales_orders
(status,
customer_id,
site_id,
shipping_address_id,
order_number,
order_generation,
reference,
ship_via_id,
order_date,
due_date,
created,
ts,
sales_tax,
freight,
misc,
taxable,
nontaxable,
job_name,
order_description,
ship_to_name,
ship_to_address1,
ship_to_address2,
ship_to_city,
ship_to_state,
ship_to_zipcode,
name,
address1,
address2,
city,
state,
zipcode,
KSISoldBy)

select 'Open',
c.id as customer_id,
1 as site_id,
null,--a.id as shipping_address_id,
so.transid,
1,
custponum as reference,
s.id as shipviaid,
so.transdate as order_date,
so.reqshipdate as due_date,
so.transdate,
so.ts,
so.SalesTax,
so.Freight,
so.Misc,
so.TaxableSales,
so.NonTaxableSales,
so.KSI_JobName,
so.KSI_OrderDescription,
so.ShipToName,
so.ShipToAddr1,
so.ShipToAddr2,
so.ShipToCity,
so.ShipToRegion,
so.ShipToPostalCode,
so.Name,
so.Address1,
so.Address2,
so.City,
so.Region,
so.PostalCode,
so.KSISoldBy

from tblsotransheaderexport so
left join iscchihost...customers c on c.customer = so.custid
--left join iscchihost...addresses a on a.table_name = 'customers' and a.table_id = c.id
--and a.address_code = so.shiptoid
left join iscchihost...ship_vias s on s.ship_via =  so.shipvia
LEFT JOIN iscchihost...sales_orders wso on wso.order_number = so.transid
where wso.order_number is null



set @Message = 'New Sales Orders added AT '
select @Message = @Message + convert(varchar, getdate(),9) 
execute spWriteStringToFile 'E:\Flow ISC Backup\ISCExportLogtest.txt',@Message,'APPEND'


--UPDATE WEB TABLE WITH CHANGES
UPDATE iscchihost...sales_orders SET
status = 'Open',
reference = x.custponum,
ship_via_id = s.id,
order_date = x.transdate,
due_date = x.reqshipdate,
ts = x.ts,
sales_tax = x.salestax,
freight = x.freight,
misc = x.misc,
taxable = x.taxablesales,
nontaxable = x.nontaxablesales,
job_name = x.KSI_JobName,
order_description = x.KSI_OrderDescription,
ship_to_name = x.shiptoname,
ship_to_address1 = x.shiptoaddr1,
ship_to_address2 = x.shiptoaddr2,
ship_to_city = x.shiptocity,
ship_to_state = x.shiptoregion,
ship_to_zipcode = x.shiptopostalcode,
name = x.name,
address1 = x.address1,
address2 = x.address2,
city = x.city,
state = x.region,
zipcode = x.postalcode,
KSISoldBy = x.KSISoldBy
--select *
FROM tblSOTransHeaderEXPORT x
LEFT JOIN iscchihost...sales_orders so on so.order_number = x.transid
left join iscchihost...ship_vias s on s.id =  so.ship_via_id
WHERE x.ts > so.ts



set @Message = 'Sales Orders updated AT '
select @Message = @Message + convert(varchar, getdate(),9) 
execute spWriteStringToFile 'E:\Flow ISC Backup\ISCExportLogtest.txt',@Message,'APPEND'

--*************************************
--ADD SALES ORDER LINES
--*************************************

--insert from temp table to web table
insert into iscchihost...sales_order_lines
(sales_order_id,
status,
line_number,
product_site_id,
description,
description2,
order_quantity,
order_uom_id,
order_factor,
sell_price,
unit_cost,
created,
created_by,
created_with,
edited,
edited_by,
edited_with,
ts)

select 
so.id,
'Open',
t.entrynum,
ps.id,
descr,
convert(varchar(1500),addnldescr),
qtyordsell,
uom.id,
conversionfactor,
unitpricesell,
unitcostsell,
getdate(),
'admin',
'import process',
getdate(),
'admin',
'import process',
t.ts


--select t.*,so.id,sol.id
 from tblsotransdetailEXPORT t
 left join iscchihost...sales_orders so on t.transid = so.order_number
 left join iscchihost...products p on t.itemid = p.product
 left join iscchihost...product_sites ps on p.id = ps.product_id and ps.site_id = 1
 left join iscchihost...unit_of_measures uom on uom.unit_of_measure = unitssell
 left join iscchihost...sales_order_lines sol on sol.sales_order_id = so.id
						and sol.line_number = t.entrynum
 where t.itemid  not like '.%' and sol.id is null and ps.id is not null
 
 
set @Message = 'SOLs added AT '
select @Message = @Message + convert(varchar, getdate(),9) 
execute spWriteStringToFile 'E:\Flow ISC Backup\ISCExportLogtest.txt',@Message,'APPEND'


--update web table with changes
UPDATE sol SET
product_site_id = ps.id,
status = CASE WHEN t.jobcompleteyn = 1 THEN 'Updated' ELSE 'Open' END,
description = t.descr,
description2 = convert(varchar(1500),t.addnldescr),
order_quantity = t.qtyordsell,
order_uom_id = uom.id,
order_factor = t.conversionfactor,
sell_price = t.unitpricesell,
unit_cost = t.unitcostsell,
edited = getdate(),
edited_by = 'admin',
edited_with = 'Import Process',
ts = t.ts
--select *
from tblsotransdetailEXPORT t
left join iscchihost...sales_orders so on t.transid = so.order_number
left join iscchihost...products p on t.itemid = p.product
left join iscchihost...product_sites ps on p.id = ps.product_id and ps.site_id = 1
left join iscchihost...unit_of_measures uom on uom.unit_of_measure = unitssell
left join iscchihost...sales_order_lines sol on sol.sales_order_id = so.id
and sol.line_number = t.entrynum and ps.id is not null
WHERE t.ts > sol.ts and t.transid < 1137805


set @Message = 'SOLs updated AT '
select @Message = @Message + convert(varchar, getdate(),9) 
execute spWriteStringToFile 'E:\Flow ISC Backup\ISCExportLogtest.txt',@Message,'APPEND'



--*************************************
--ADD PRODUCTION ORDERS FROM SALES ORDER LINES (MULIT QUERY OPERATION)
--*************************************
declare @temp table(id int identity(1,1),
new_order_num int,
status varchar(10),
site_id int,
product_id int,
order_quantity int,
order_date datetime,
due_date datetime,
created datetime,
created_by varchar(50),
created_with varchar(50),
edited datetime,
edited_by varchar(50),
edited_with varchar(50),
sales_order_line_id int,
sales_order_number int,
sales_order_generation int,
max_production_order_generation int)

insert into @temp (
status,
site_id,
product_id,
order_quantity,
order_date,
due_date,
created,
created_by,
created_with,
edited,
edited_by,
edited_with,
sales_order_line_id,
sales_order_number,
sales_order_generation,
max_production_order_generation)

select
'Open',
so.site_id,
ps.product_id,
sol.order_quantity,
so.order_date,
so.due_date,
getdate(),
'admin',
'import process',
getdate(),
'admin',
'import process',
sol.id,
so.order_number,
1,
m.maxgen

from iscchihost...sales_orders so
left join iscchihost...sales_order_lines sol on sol.sales_order_id = so.id
left join iscchihost...product_sites ps on ps.id = sol.product_site_id
--left join iscchihost...production_orders po on po.order_number = so.order_number
left join iscchihost...production_orders po on po.sales_order_line_id = sol.id
LEFT JOIN (select order_number, max(order_generation) as maxgen from iscchihost...production_orders
group by order_number) m on m.order_number = so.order_number
WHERE po.sales_order_line_id is null and sol.id is not null
order by so.order_number,sol.id

declare @maxgen int
declare @maxid int
declare @currid int
declare @currsalesorder int
declare @lastsalesorder int
declare @gen int
set @lastsalesorder = 0
set @currid = 1


select @maxid = max(id) from @temp

while @currid <= @maxid
	BEGIN
		select @currsalesorder = sales_order_number, @maxgen = max_production_order_generation from @temp where id = @currid
		IF @lastsalesorder <> @currsalesorder
			BEGIN
				if (@maxgen >0)
				begin
					set @gen = @maxgen + 1
				end
				else
				begin
					set @gen = 1
				end
				update @temp set sales_order_generation = @gen where id = @currid
				set @lastsalesorder = @currsalesorder
			END
		ELSE
			BEGIN
				set @gen = @gen + 1
				update @temp set sales_order_generation = @gen  where id = @currid
			END
		set @currid = @currid + 1
	END 
	select * from @temp
	
	
set @Message = 'Production orders created AT '
select @Message = @Message + convert(varchar, getdate(),9) 
execute spWriteStringToFile 'E:\Flow ISC Backup\ISCExportLogtest.txt',@Message,'APPEND'

--INSERT PRODUCTION ORDERS FROM TEMP TABLE
insert into iscchihost...production_orders
(status,
order_number,
order_generation,
site_id,
product_id,
quantity,
order_date,
due_date,
created,
created_by,
created_with,
edited,
edited_by,
edited_with,
sales_order_line_id,
processes_created)

select
[status],
--new_order_num,
sales_order_number,
sales_order_generation,
site_id,
product_id,
order_quantity,
order_date,
due_date,
created,
created_by,
created_with,
edited,
edited_by,
edited_with,
sales_order_line_id,
'No'
from @temp



set @Message = 'Production orders added ->Chihost AT '
select @Message = @Message + convert(varchar, getdate(),9) 
execute spWriteStringToFile 'E:\Flow ISC Backup\ISCExportLogtest.txt',@Message,'APPEND'

--ADD PRODUCTION ORDER PROCESSES FOR NEW ORDERS
insert into iscchihost...production_order_processes (
production_order_id,
status,
process_id,
sequence,
description,
created,
created_by,
created_with,
edited,
edited_by,
edited_with)

select po.id,'Open',p.id,p.id*10,p.process,getdate(),'admin','import routine',getdate(),'admin','import routine' from iscchihost...production_orders po 
cross join iscchihost...processes p
where po.id not in (select distinct production_order_id from iscchihost...production_order_processes group by production_order_id)


update iscchihost...production_orders set processes_created = 'Yes' WHERE processes_created = 'No'


set @Message = 'Processes added AT '
select @Message = @Message + convert(varchar, getdate(),9) 
execute spWriteStringToFile 'E:\Flow ISC Backup\ISCExportLogtest.txt',@Message,'APPEND'

---declare @Message varchar(100)  comment out sah 3/11/12

--*************************************
--CHECK FOR NEW SALES ORDERS IN AR AND SET TO SHIPPED
--*************************************
update so set status = 'Invoiced'
FROM tblsotransheaderexport x
inner join ISC..TBLARhistheader h on x.transid = h.transid
inner join iscchihost...sales_orders so on so.order_number = x.transid
WHERE so.status <> 'Invoiced'


set @Message = 'Invoiced Status updated AT '
select @Message = @Message + convert(varchar, getdate(),9) 
execute spWriteStringToFile 'E:\Flow ISC Backup\ISCExportLogtest.txt',@Message,'APPEND'


--*************************************
--IMPORT SHIPPING DATA FROM FEDEX
--*************************************
EXEC a_IncrementalShippingExport

set @Message = 'EXPORT COMPLETE AT '
select @Message = @Message + convert(varchar, getdate(),9) 
execute spWriteStringToFile 'E:\Flow ISC Backup\ISCExportLogtest.txt',@Message,'APPEND'
execute spWriteStringToFile 'E:\Flow ISC Backup\ISCExportLogtest.txt','*****************************************','APPEND'
execute spWriteStringToFile 'E:\Flow ISC Backup\ISCExportLogtest.txt','','APPEND'

END




3rd query to execute:
IncrementalShippingExport

USE [ISCExport]
GO
/****** Object:  StoredProcedure [dbo].[a_IncrementalShippingExport]    Script Date: 01/16/2012 16:23:24 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
-- =============================================
-- Author:		<Author,,Name>
-- Create date: <Create Date,,>
-- Description:	<Description,,>
-- =============================================
ALTER PROCEDURE [dbo].[a_IncrementalShippingExport] 
	-- Add the parameters for the stored procedure here
AS
BEGIN
	-- SET NOCOUNT ON added to prevent extra result sets from
	-- interfering with SELECT statements.
	SET NOCOUNT ON;

    -- Insert statements for procedure here
--drop table #Temp
	SELECT 
	s.id
	,s.reference
	,s.ship_name
	,s.ship_address1
	,s.ship_address2
	,s.ship_address3
	,s.ship_city
	,s.ship_state
	,s.ship_postal_code
	,s.ship_country
	,s.tracking_number
	,s.weight
	,s.height
	,s.width
	,s.length
	,s.ship_via
	,s.freight_charges
	,s.discounted_charges
	,s.package_contents
	,s.design_description
	,s.action
	,s.created_date
	into #Temp
	from shippingexport s
	LEFT JOIN (select max(shippingexportid)as shippingexportid from shippingexportpending group by shippingexportid) p
	on p.shippingexportid = s.id
	WHERE p.shippingexportid is null and len(s.reference) > 0 
	

	--ADD NEW RECORDS INTO WEB TABLE
	INSERT INTO iscCHIHOST...shipping_export
	(import_id
	,reference
	,ship_name
	,ship_address1
	,ship_address2
	,ship_address3
	,ship_city
	,ship_state
	,ship_postal_code
	,ship_country
	,tracking_number
	,weight
	,height
	,width
	,length
	,ship_via
	,freight_charges
	,discounted_charges
	,package_contents
	,design_description
	,action
	,created_date
	,status)

	SELECT 
	s.id
	,s.reference
	,s.ship_name
	,s.ship_address1
	,s.ship_address2
	,s.ship_address3
	,s.ship_city
	,s.ship_state
	,s.ship_postal_code
	,s.ship_country
	,s.tracking_number
	,s.weight
	,s.height
	,s.width
	,s.length
	,s.ship_via
	,s.freight_charges
	,s.discounted_charges
	,s.package_contents
	,s.design_description
	,s.action
	,s.created_date
	,'Active'
	from shippingexport s
	LEFT JOIN iscCHIHOST...shipping_export x
	on x.import_id = s.id
	WHERE x.import_id is null


	declare @MaxId as int
	declare @counter as int
	declare @REF as varchar(50)
	declare @POS as int
	declare @currid as int
	declare @workref as varchar(50)
	declare @insertref as varchar(50)
	declare @error as varchar(50)

	select @MaxId = max(id) from #Temp
	set @counter = 1
	set @currid = 0
	set @ERROR = ''
--select @Maxid, @currid
	WHILE (@currid < @MaxId)
		BEGIN  --L2
			set @ERROR = ''
			set @POS = 1
			SELECT top 1 @currid = id, @REF = reference FROM #Temp WHERE id > @currid ORDER BY id
			SET @WORKREF = @REF
			SET @WORKREF = REPLACE(@WORKREF,'&',',')

			WHILE (@POS <= LEN(@WORKREF))
				BEGIN
					IF(SUBSTRING(@WORKREF,@POS,1) IN ('1','2','3','4','5','6','7','8','9','0',' ',',','&'))
						BEGIN
							SET @POS = @POS + 1

						END
					ELSE
						BEGIN
							SET @POS = @POS + 1
							SET @ERROR = 'FAIL'
						END
				END
				--select @WORKREF as workref,@ERROR as error,@currid as currid

			IF (@ERROR ='')
				BEGIN  --L3

					IF (CHARINDEX(',',@WORKREF)>0)
					BEGIN  --L4
						SET @POS = CHARINDEX(',',@WORKREF)
						SET @INSERTREF = LTRIM(RTRIM(SUBSTRING(@WORKREF,0,@POS)))
											--select @insertref,@workref,@POS
						INSERT INTO shippingexportPENDING
							(ShippingExportID
							,reference
							,original_reference
							,ship_name
							,ship_address1
							,ship_address2
							,ship_address3
							,ship_city
							,ship_state
							,ship_postal_code
							,ship_country
							,tracking_number
							,weight
							,height
							,width
							,length
							,ship_via
							,freight_charges
							,discounted_charges
							,package_contents
							,design_description
							,action
							,created_date)
							SELECT
							#Temp.id
							,@INSERTREF AS reference
							,@REF
							,#Temp.ship_name
							,#Temp.ship_address1
							,#Temp.ship_address2
							,#Temp.ship_address3
							,#Temp.ship_city
							,#Temp.ship_state
							,#Temp.ship_postal_code
							,#Temp.ship_country
							,#Temp.tracking_number
							,#Temp.weight
							,#Temp.height
							,#Temp.width
							,#Temp.length
							,#Temp.ship_via
							,#Temp.freight_charges
							,#Temp.discounted_charges
							,#Temp.package_contents
							,#Temp.design_description
							,#Temp.action
							,#Temp.created_date
							FROM #Temp WHERE id = @currid
							SET @WORKREF = SUBSTRING(@WORKREF,@POS + 1,100)
					END  --L4
					
					IF (CHARINDEX(',',@WORKREF)>0)
					BEGIN  --L4
						SET @POS = CHARINDEX(',',@WORKREF)
						SET @INSERTREF = LTRIM(RTRIM(SUBSTRING(@WORKREF,0,@POS)))

						INSERT INTO shippingexportPENDING
							(ShippingExportID
							,reference
							,original_reference
							,ship_name
							,ship_address1
							,ship_address2
							,ship_address3
							,ship_city
							,ship_state
							,ship_postal_code
							,ship_country
							,tracking_number
							,weight
							,height
							,width
							,length
							,ship_via
							,freight_charges
							,discounted_charges
							,package_contents
							,design_description
							,action
							,created_date)
							SELECT
							#Temp.id
							,@INSERTREF AS reference
							,@REF
							,#Temp.ship_name
							,#Temp.ship_address1
							,#Temp.ship_address2
							,#Temp.ship_address3
							,#Temp.ship_city
							,#Temp.ship_state
							,#Temp.ship_postal_code
							,#Temp.ship_country
							,#Temp.tracking_number
							,#Temp.weight
							,#Temp.height
							,#Temp.width
							,#Temp.length
							,#Temp.ship_via
							,#Temp.freight_charges
							,#Temp.discounted_charges
							,#Temp.package_contents
							,#Temp.design_description
							,#Temp.action
							,#Temp.created_date
							FROM #Temp WHERE id = @currid
							SET @WORKREF = SUBSTRING(@WORKREF,@POS + 1,100)
					END  --L4
					
					IF (CHARINDEX(',',@WORKREF)>0)
					BEGIN  --L4
						SET @POS = CHARINDEX(',',@WORKREF)
						SET @INSERTREF = LTRIM(RTRIM(SUBSTRING(@WORKREF,0,@POS)))
						INSERT INTO shippingexportPENDING
							(ShippingExportID
							,reference
							,original_reference
							,ship_name
							,ship_address1
							,ship_address2
							,ship_address3
							,ship_city
							,ship_state
							,ship_postal_code
							,ship_country
							,tracking_number
							,weight
							,height
							,width
							,length
							,ship_via
							,freight_charges
							,discounted_charges
							,package_contents
							,design_description
							,action
							,created_date)
							SELECT
							#Temp.id
							,@INSERTREF AS reference
							,@REF
							,#Temp.ship_name
							,#Temp.ship_address1
							,#Temp.ship_address2
							,#Temp.ship_address3
							,#Temp.ship_city
							,#Temp.ship_state
							,#Temp.ship_postal_code
							,#Temp.ship_country
							,#Temp.tracking_number
							,#Temp.weight
							,#Temp.height
							,#Temp.width
							,#Temp.length
							,#Temp.ship_via
							,#Temp.freight_charges
							,#Temp.discounted_charges
							,#Temp.package_contents
							,#Temp.design_description
							,#Temp.action
							,#Temp.created_date
							FROM #Temp WHERE id = @currid
							SET @WORKREF = SUBSTRING(@WORKREF,@POS + 1,100)
					END --L4
					
					IF (CHARINDEX(',',@WORKREF)=0 and len(@WORKREF)>0 and @ERROR = '')
					BEGIN --L4
						SET @INSERTREF = LTRIM(RTRIM(@WORKREF))
						INSERT INTO shippingexportPENDING
							(ShippingExportID
							,reference
							,original_reference
							,ship_name
							,ship_address1
							,ship_address2
							,ship_address3
							,ship_city
							,ship_state
							,ship_postal_code
							,ship_country
							,tracking_number
							,weight
							,height
							,width
							,length
							,ship_via
							,freight_charges
							,discounted_charges
							,package_contents
							,design_description
							,action
							,created_date)
							SELECT
							#Temp.id
							,@INSERTREF AS reference
							,@REF
							,#Temp.ship_name
							,#Temp.ship_address1
							,#Temp.ship_address2
							,#Temp.ship_address3
							,#Temp.ship_city
							,#Temp.ship_state
							,#Temp.ship_postal_code
							,#Temp.ship_country
							,#Temp.tracking_number
							,#Temp.weight
							,#Temp.height
							,#Temp.width
							,#Temp.length
							,#Temp.ship_via
							,#Temp.freight_charges
							,#Temp.discounted_charges
							,#Temp.package_contents
							,#Temp.design_description
							,#Temp.action
							,#Temp.created_date
							FROM #Temp WHERE id = @currid

					END  --L4
				END --L3 
	END	--L2
		
	INSERT INTO iscchihost...shipping
	(import_id
	,sales_order_id
	,reference
	,original_reference
	,ship_name
	,ship_address1
	,ship_address2
	,ship_address3
	,ship_city
	,ship_state
	,ship_postal_code
	,ship_country
	,tracking_number
	,weight
	,height
	,width
	,length
	,ship_via
	,freight_charges
	,discounted_charges
	,package_contents
	,design_description
	,action
	,created_date
	,status)

	SELECT 
	X.shippingexportid
	,so.id as sales_order_id
	,X.reference
	,X.original_reference
	,X.ship_name
	,X.ship_address1
	,X.ship_address2
	,X.ship_address3
	,X.ship_city
	,X.ship_state
	,X.ship_postal_code
	,X.ship_country
	,X.tracking_number
	,X.weight
	,X.height
	,X.width
	,X.length
	,X.ship_via
	,X.freight_charges
	,X.discounted_charges
	,X.package_contents
	,X.design_description
	,X.action
	,X.created_date
	,'Active'
	FROM ISCEXPORT..SHIPPINGEXPORTPENDING X
	LEFT JOIN iscchihost...sales_orders so on so.order_number = ltrim(rtrim(X.reference))
	LEFT JOIN iscchihost...shipping s on s.import_id = X.shippingexportid
	WHERE X.action is null and s.import_id is null
	ORDER BY X.id
	 
	UPDATE s
	SET status = 'Deleted'
	from iscchihost...shipping s
	LEFT JOIN shippingexport X on X.id = s.import_id
	WHERE s.status = 'Active'  and X.Action = 'DELETED'
	
	UPDATE so
	SET act_ship_date = s.shipdate
	FROM iscchihost...sales_orders so
	INNER JOIN (select sales_order_id,min(created_date) as shipdate from iscchihost...shipping group by sales_order_id) s
	ON s.sales_order_id = so.id
	WHERE so.act_ship_date is null

	
END

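As an aside, the three copy-pasted comma-handling blocks in the WHILE loop of the shipping procedure only cover references with up to four parts. A set-based splitter handles any number of parts in one pass; below is a sketch using a recursive CTE (workable on SQL Server 2005 and later, since STRING_SPLIT only arrived in 2016). Table and column names follow the procedure above; treat it as an illustration rather than a drop-in replacement.

```sql
-- Split each comma/&-separated reference into one row per part.
;WITH src AS (
    SELECT id, REPLACE(reference, '&', ',') AS ref
    FROM #Temp
),
parts AS (
    -- anchor: the text before the first comma (or the whole string)
    SELECT id,
           LTRIM(RTRIM(LEFT(ref, CHARINDEX(',', ref + ',') - 1))) AS part,
           STUFF(ref, 1, CHARINDEX(',', ref + ','), '') AS rest
    FROM src
    UNION ALL
    -- recurse on the remainder until it is consumed
    SELECT id,
           LTRIM(RTRIM(LEFT(rest, CHARINDEX(',', rest + ',') - 1))),
           STUFF(rest, 1, CHARINDEX(',', rest + ','), '')
    FROM parts
    WHERE LEN(rest) > 0
)
SELECT id, part          -- one row per order number in the reference
FROM parts
WHERE LEN(part) > 0;
```

The result set could feed a single INSERT into shippingexportPENDING, replacing the four near-identical INSERT blocks and the character-by-character validation loop.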


Thanks again for looking!
Question by:hibbsusan
11 Comments
 
LVL 39

Expert Comment

by:lcohan
ID: 37747496
I think you are right, and one other reason it may be slow, besides the joins across different servers/databases, is that data is sent out to .txt files and then imported again. We have a somewhat similar scenario where Ruby code does the sync from SQL Server to Postgres and vice versa (for an ecommerce website), and we are in the process of rewriting everything using SQL SSIS and either the native Postgres ODBC drivers or the PGNP OLEDB driver.
http://www.pgoledb.com/

So far we are getting much better results in terms of speed, flow control, error trapping/logging, history, etc. by doing this, and as much as it looks like a lot of code changes and testing, it is well worth it.

One comment though - "At the very end, another query is called and the gets some shipping / tracking information, but I think that isn't too important." - everything is important in an online ordering system, and shipping/tracking is one of those things, so it all must be done regardless.
 

Author Comment

by:hibbsusan
ID: 37748055
Thank you for your reply. Do you just run a Ruby script as a cron job to do the transfer?

Also, I meant that the shipping/tracking info is less important in terms of pinpointing where the query slows down. But I may be wrong about that as well.

thanks again
 
LVL 39

Expert Comment

by:lcohan
ID: 37749067
Yeah, basically the Ruby code runs every 15 seconds and checks a sync queue; if there's something in it, it processes it by running source code (SQL sp's or PG functions) to get the source data set(s) and then passing that as a parameter to destination code (SQL sp's or PG functions) to update the destination 1 row at a time!!! - which is insane in a busy e-commerce website. Well, when it all started it wasn't that busy, but now, with over 1000 transactions per second, it is no longer keeping up.
 

Author Comment

by:hibbsusan
ID: 37749119
the fact that data is sent out to txt files then imported again

Can you tell me in which lines these things (sent out and imported) are happening?

Thanks!
 
LVL 39

Expert Comment

by:lcohan
ID: 37749282
My bad - after another, more careful look at your scripts, the execute spWriteStringToFile is just writing .TXT log files, not data, so as you mentioned, most likely the performance issues are around the JOINs between different servers/DBs like ISCEXPORT and iscchihost.

 
LVL 39

Expert Comment

by:lcohan
ID: 37749359
Still, the scenario you described for the change is the best solution by far.
I suggest having the "staging" tables that will host this raw data from one side in the same DB as the target tables, so the code objects can take advantage of potential indexes added on both the staging tables and the destination.
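
A minimal sketch of that staging approach, assuming a linked server named iscchihost as in the stored procedure above (the staging table and index names are placeholders):

```sql
-- Sketch only: pull the remote rows into a local staging table in one
-- bulk read across the link, then run the comparison join entirely on
-- this server, where indexes can actually be used.
IF OBJECT_ID('dbo.stg_sales_orders') IS NOT NULL
    DROP TABLE dbo.stg_sales_orders;

SELECT id, order_number
INTO dbo.stg_sales_orders
FROM iscchihost...sales_orders;      -- single cross-server pull

CREATE INDEX ix_stg_order_number ON dbo.stg_sales_orders (order_number);

-- The join now runs locally instead of row-by-row across servers:
SELECT X.shippingexportid, so.id AS sales_order_id
FROM ISCEXPORT..SHIPPINGEXPORTPENDING X
LEFT JOIN dbo.stg_sales_orders so
       ON so.order_number = LTRIM(RTRIM(X.reference));
```

The point of the index on the staging table is that the optimizer can seek into it, which it cannot do efficiently when the table sits behind a linked server.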
 

Author Comment

by:hibbsusan
ID: 37749435
Without having to re-write hundreds of lines of queries that are far above my head, what do you think is the quickest way to do this?

Could I do a sort of

SELECT * FROM (several tables)

And then create those tables identically on the other computer, then recycle the script that checks to see what needs to update, by just putting it on the other computer?
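
That idea - pushing the pending rows to the other server and joining there - might look something like this sketch (REMOTESRV and the staging table name are placeholders, and the column list is abbreviated):

```sql
-- Sketch only: push the local pending rows into a staging table that
-- already exists on the remote server, then run the existing
-- update/insert comparison over there against its local tables.
DELETE FROM REMOTESRV...stg_shippingexportpending;

INSERT INTO REMOTESRV...stg_shippingexportpending
    (shippingexportid, reference, tracking_number, created_date)
SELECT shippingexportid, reference, tracking_number, created_date
FROM ISCEXPORT..SHIPPINGEXPORTPENDING
WHERE action IS NULL;
```

One caveat: an INSERT over a linked server can still be sent row by row depending on the provider, so it is worth timing this before committing to the approach.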
 
LVL 39

Accepted Solution

by:
lcohan earned 500 total points
ID: 37749743
I don't think that is possible without any code changes.
You could indeed bring across staging data from all those tables like "ISCHIHOST...table_name" to do the joins on one side, but you will still have to change some code. I think it may be best to re-write it, possibly using SSIS; if you want just the minimal work described, you must be sure to have a good test environment and a thorough test plan before making any changes.
 

Author Comment

by:hibbsusan
ID: 37754410
Hello,
I wonder if you could come back to this for just a moment. I have another question about SSIS:

We make the connection between the local SQL Server databases & the remote PostgreSQL DB (which is defined in SQL Server as a linked server) using the native PostgreSQL ODBC driver.   Our job currently executes as a SQL Server Agent scheduled job (which calls the three SPs) in SQL Server Management Studio.  How does that relate to SSIS?  Would setting it up as an SSIS package make it run significantly more efficiently?  Or would that just be a different way to do the same thing?

Thanks again
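
For reference, a PostgreSQL linked server over ODBC is typically defined in SQL Server along these lines ('PGLINK', the DSN name, and the credentials are placeholders; a system DSN for the PostgreSQL ODBC driver must already exist on the SQL Server machine):

```sql
-- Sketch only: register a linked server that goes through the
-- MSDASQL (OLE DB over ODBC) provider to a PostgreSQL DSN.
EXEC sp_addlinkedserver
    @server     = N'PGLINK',
    @srvproduct = N'',
    @provider   = N'MSDASQL',       -- OLE DB provider for ODBC
    @datasrc    = N'PostgresDSN';   -- ODBC system DSN name

-- Map SQL Server logins to a remote PostgreSQL login:
EXEC sp_addlinkedsrvlogin
    @rmtsrvname  = N'PGLINK',
    @useself     = N'False',
    @rmtuser     = N'pg_user',
    @rmtpassword = N'secret';
```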
 
LVL 39

Expert Comment

by:lcohan
ID: 37777435
"Our job currently executes as a SQLServer Agent scheduled job (which calls the three SPs) in SQLServer Management Studio."
Your job does not run the three SPs in MAnagement studio right? Only a USER (usualy a human) can run them in Management Studio  and the SQL Job will run them as T-SQL commands in a Step right?

"How does that relate to SSIS?  Would setting it up as an SSIS package make it run significantly more efficiently?  Or would that just be a different way to do the same thing?"

In SSIS you should be able to set up a workflow on success/failure - have more control, in other words - add more logging, etc. And somewhat yes - it is a different way to do the same thing.
