  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 557

DTS - Package fails

I have a package that fails. Interestingly, there is no record of it in the SQL logs in Enterprise Manager, yet I see it in the event logs:

"SQL SERVER SCHEDULED JOB 'INDI_AT_CASH" (0XFDB5E...08EEE) - STATUS FAILED - INVOKED ON: 2010-02-25 08:00:02 - MESSAGE: THE JOB FAILED. THE JOB WAS INVOKED BY SCHEDULE 46 (INDI_AT_CASH). THE LAST STEP TO RUN WAS STEP 1 (INDI_AT_CASH).

I am using SQL Server 2000.
0
GlobaLevel
Asked:
1 Solution
 
BodestoneCommented:
If you navigate to the SQL Server Agent in EM and expand to the jobs, right-click on the job and choose Job History.
In the window that opens, expand the job until you see step 1.

The error for step 1 should give more information about the failure.
If you are still at a loss, paste the error here so we can have a look.
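If digging through EM gets awkward, the same step-level history can also be pulled straight out of msdb with a query along these lines (a rough sketch only; the job name is taken from the event-log message above):

-- SQL Server 2000: Agent history lives in msdb.dbo.sysjobhistory.
-- step_id = 0 is the overall job outcome; run_status 0 = failed, 1 = succeeded.
SELECT h.step_id, h.step_name, h.run_date, h.run_time, h.run_status, h.message
FROM msdb.dbo.sysjobs j
JOIN msdb.dbo.sysjobhistory h ON h.job_id = j.job_id
WHERE j.name = 'INDI_AT_CASH'
ORDER BY h.instance_id DESC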
0
 
GlobaLevelAuthor Commented:

Bodestone, I did that. In the job history I pretty much get the same thing:

The job failed.  The Job was invoked by Schedule 48 (INDI_ATPM_MOVEMENTS).  The last step to run was step 1 (INDI_ATPM_MOVEMENTS).

...it has failed for the past week...
0
 
BodestoneCommented:
That sounds like the top-level job failure message, though.
You can expand that to view the individual messages for each step within the job.
The one for step 1 should be more detailed.
0

 
BodestoneCommented:
My apologies. It's the tick box near the top right: Show Step Details.
2k5 SSMS makes it a tad easier and I forgot.
0
 
GlobaLevelAuthor Commented:
wow...good stuff...

Executed as user: MONTANA\LogShippingUserSCC. ...Step_DTSDataPumpTask_1
DTSRun OnError: DTSStep_DTSDataPumpTask_1, Error = -2147217873 (80040E2F)
   Error string: Error at Destination for Row number 1. Errors encountered so far in this task: 1.
   Error source: DTS Data Pump
   Help file:
   Help context: 0
   Error Detail Records:
   Error: -2147217873 (80040E2F); Provider Error: 1 (1)
   Error string: [Microsoft][ODBC driver for Oracle][Oracle]ORA-00001: unique constraint (BOS_ADMINIS.PK_AMS_ATPM_MOVEMENT) violated
   Error source: Microsoft OLE DB Provider for ODBC Drivers
   Help file:
   Help context: 0
DTSRun OnError: DTSStep_DTSDataPumpTask_1, Error = -2147213206 (8004206A)
   Error string: The number of failing rows exceeds the maximum specified.
   Error source: Microsoft Data Transformation Services (DTS) Data Pump
   Help file: sqldts80.hlp
   Help context: 0
   Error Detail Records:
   Error: -21...
Process Exit Code 1. The step failed.
0
 
BodestoneCommented:
I take it this is a job to export data from an MSSQL DB to an Oracle DB, or vice versa?

It looks like there is a primary key or unique constraint on the Oracle DB that does not exist on the SQL Server side.
You need to find the table in the source database that inserts into the BOS_ADMINIS table in the destination, and the specific column (probably AMS_ATPM_MOVEMENT).

Assuming the table names are the same on each side (change the names to protect the innocent), a script similar to the following, run on the source database, should help identify the issue.

SELECT AMS_ATPM_MOVEMENT, COUNT(*)
FROM BOS_ADMINIS
GROUP BY AMS_ATPM_MOVEMENT
HAVING COUNT(*) > 1

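To confirm exactly which column(s) that violated key covers before hunting for duplicates, the constraint can also be looked up in the Oracle data dictionary on the destination side (a sketch, assuming the account used can see ALL_CONS_COLUMNS):

-- Run on the Oracle side: list the columns behind the violated primary key
SELECT owner, constraint_name, table_name, column_name, position
FROM all_cons_columns
WHERE owner = 'BOS_ADMINIS'
  AND constraint_name = 'PK_AMS_ATPM_MOVEMENT'
ORDER BY position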

0
 
GlobaLevelAuthor Commented:
We have this recurring problem: the network goes down, which it does often, and if the package is in the middle of sending data from SQL to Oracle the transfer is not always complete, even though some of the data does get through. So the next day the package runs again, and the column on the SQL side that says the data has been successfully sent is set to 100 instead of 101 (101 means successful transmission). The data is half sent and half not sent, and on the SQL side the column does not contain any 101s, which would tell us the transfer was complete. So we run a query to find out which data has been sent and which has not, and then do an update against Oracle. But we need a programmatic way to resolve this, as this manual process is not always practical. If you have any ideas or code, please let me know. Thanks.
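As a rough illustration of the manual check described above (table and column names taken from the code later in this thread; the flag semantics of duplicado are assumed from the description, with 101 meaning a confirmed transfer):

-- Rows that were never confirmed as transferred (duplicado below 101)
SELECT id_lugar, identificativo_caja, tipo_movimiento, n_transito,
       tipo_moneda, orden_moneda, fecha_hora, fecha_hora_atpm, duplicado
FROM movimientos_ATPM WITH (NOLOCK)
WHERE duplicado < 101
ORDER BY fecha_hora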
0
 
BodestoneCommented:
OK, what you want to do is try and modify the query the package uses to add data to the destination table.

Instead of
INSERT BOS_ADMINIS
SELECT *
FROM sourceData

You need something like

INSERT BOS_ADMINIS
SELECT * 
FROM sourceData
WHERE NOT EXISTS(SELECT 1 FROM BOS_ADMINIS WHERE AMS_ATPM_MOVEMENT = sourceData.AMS_ATPM_MOVEMENT)


0
 
BodestoneCommented:
If changing the DTS import script is a hassle, you could instead point its destination at a staging table and have the package delete all rows from the staging table before population.

Then you could add a job step afterwards similar to:
INSERT BOS_ADMINIS
SELECT * 
FROM BOS_ADMINIS_STAGING
WHERE NOT EXISTS(SELECT 1 FROM BOS_ADMINIS WHERE AMS_ATPM_MOVEMENT = BOS_ADMINIS_STAGING.AMS_ATPM_MOVEMENT)

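For the "delete before population" part, an extra T-SQL job step ahead of the DTS transfer could be as simple as the following (a sketch; BOS_ADMINIS_STAGING is just the staging table name assumed above):

-- Job step to run before the DTS data pump: empty the staging table
TRUNCATE TABLE BOS_ADMINIS_STAGING
-- (or DELETE FROM BOS_ADMINIS_STAGING if TRUNCATE permissions are an issue)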

0
 
GlobaLevelAuthor Commented:
I now have two tasks...


1) I need to write a general package that will identify whether other packages have failed in the SQL Server Agent (a sketch for this follows the list)
  a) send notification via FTP or email
  b) auto-correct the failed packages and relaunch them


2) after number one has launched, I need to come back to this project and launch a program inside a DTS package that will go and correct the failed ATPM_MOVEMENT package. I believe I need to write a stored proc to accomplish this.
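For task 1, something along these lines could serve as a scheduled T-SQL job step (a rough sketch only; the recipient address is a placeholder, and xp_sendmail requires SQL Mail to be configured on the server):

-- Find jobs whose most recent outcome in the Agent history is a failure.
-- In msdb.dbo.sysjobhistory, step_id = 0 is the overall job outcome and
-- run_status 0 = failed, 1 = succeeded.
SELECT j.name, h.run_date, h.run_time, h.message
FROM msdb.dbo.sysjobs j
JOIN msdb.dbo.sysjobhistory h ON h.job_id = j.job_id
WHERE h.step_id = 0
  AND h.run_status = 0
  AND h.instance_id = (SELECT MAX(h2.instance_id)
                       FROM msdb.dbo.sysjobhistory h2
                       WHERE h2.job_id = j.job_id
                         AND h2.step_id = 0)

-- For each failed job found, notify and relaunch, for example:
-- EXEC master.dbo.xp_sendmail @recipients = 'dba@example.com', @subject = 'SQL Agent job failed'
-- EXEC msdb.dbo.sp_start_job @job_name = 'INDI_AT_CASH'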

For task 2, please see the attached code...
-- Gather the rows that have not yet been confirmed as transferred
SELECT *,
  1                 AS ID_COMPANY,
  'DTS'             AS USUARIO,
  CURRENT_TIMESTAMP AS CREATE_DATE,
  1                 AS OPTIMIST_LOCK
INTO #tempTable
FROM TR_CIERRE_RECAUDACION_ATPM_MONEDA WITH (NOLOCK)
WHERE DUPLICADO < 100
  AND fecha_hora > '2007-12-10 00:00:00'

-- -=-=-=-=-=
-- If the temp table isn't empty then run the update statement
-- to set the flag values in SQL so that the load to the back office can close
-- and transactions can finish.
-- We don't want any uncommitted transactions.
-- -=-=-=-=-=-

IF EXISTS (SELECT 1 FROM #tempTable)
BEGIN
    -- Flag local rows that already exist on the Oracle side as duplicates
    UPDATE b
    SET duplicado = 102
    FROM movimientos_ATPM b
    WHERE b.duplicado < 100
      AND EXISTS
      (
        SELECT 1
        FROM AMS_INDIANA..BOS_ADMINIS.AMS_CASH_ATPM_MOVEMENTS a
        WHERE a.id_places             = b.id_lugar
          AND a.box_id                = b.identificativo_caja
          AND a.cd_atpm_movement_type = b.tipo_movimiento
          AND a.id_transit            = b.n_transito
          AND a.cd_atpm_cash_type     = b.tipo_moneda
          AND a.sequential_coin       = b.orden_moneda
          AND a.dt_trans_date_ts      = b.fecha_hora
          AND a.dt_atpm_date_ts       = b.fecha_hora_atpm
      )
END

DROP TABLE #tempTable


0
