Solved

Issue with SSIS using Timberline Datasource

Posted on 2008-11-17
Medium Priority
2,726 Views
Last Modified: 2013-11-30
I'm trying to schedule an SSIS job to run nightly using Timberline as a datasource. Here's my issue:

The Timberline ODBC driver has a constraint which will only allow the use of drive letters when pointing to its source. The driver will not accept a UNC path. Because of this constraint, the package has to be run while a user is logged in. When run as either a scheduled system task or a SQL Agent job, the package fails because it does not have access to the mapped drives (N and M). The login script that maps the drives is only executed when a user logs in.

Here are a couple of things I've already tried:

1. Hardcoded the connection string (Driver=Timberline Data;DBQ=//UNC Path/;UID=XX;PWD=XXXX) within the package so it uses the UNC path to the Timberline data.  --> Something within Timberline still points to od.scr in M:\Accounting\Screen (this is also true if I copy the data down locally).

2. Created a Windows service in my test environment to map the drive when Windows loads.
--> Same issue. The script the service uses to map the drive will not run unless someone is logged in.

3. Tried creating a DSN using the Pervasive driver, but because Timberline does not use DDF files (from what I can tell), this was not possible.
Question by:hchan_resolve
11 Comments
 
LVL 3

Expert Comment

by:dro_law
ID: 22979248
Have you tried using a credential & proxy for the SSIS task that does have access to the mapped drives?
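
For reference, the credential/proxy setup looks roughly like the T-SQL below. The account, credential, proxy, and password are placeholders, not anything from this thread, and keep in mind that a proxy only changes the account the job step runs as; it does not run login scripts or map drives on its own.

USE master;
GO
-- Credential tied to a Windows account that can reach the Timberline shares (hypothetical account).
CREATE CREDENTIAL TimberlineRunAs
    WITH IDENTITY = 'DOMAIN\svc_timberline', SECRET = 'StrongPasswordHere';
GO
USE msdb;
GO
-- Proxy that SQL Agent job steps can run under.
EXEC dbo.sp_add_proxy
     @proxy_name = N'TimberlineSSISProxy',
     @credential_name = N'TimberlineRunAs',
     @enabled = 1;
-- Allow SSIS package execution steps to use the proxy (subsystem_id 11 = SSIS on SQL 2005/2008).
EXEC dbo.sp_grant_proxy_to_subsystem
     @proxy_name = N'TimberlineSSISProxy',
     @subsystem_id = 11;
GO

Once that exists, the proxy shows up in the "Run as" list on the SSIS job step.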
 
LVL 17

Expert Comment

by:HoggZilla
ID: 22979281
Can you map the drives on the SQL Server where the Agent is running?
 

Author Comment

by:hchan_resolve
ID: 22979284
Permissions are not the issue. When the job is scheduled, the account the SQL Agent service runs under does not have access to the mapped drives because the login script is not run.
 

Author Comment

by:hchan_resolve
ID: 22979303
Sure... the drives are mapped when I'm logged in, and the SSIS package runs successfully. The problem is when the job is scheduled to run and no user is logged in to run it manually.
 

Author Comment

by:hchan_resolve
ID: 22979323
This is the error I get if I try to run the job when the connection string is hardcoded to either a local drive or a UNC path:

TITLE: Package Validation Error
------------------------------

Package Validation Error

------------------------------
ADDITIONAL INFORMATION:

Error at COMPANY_INFORMATION [DataReader Source [150]]: System.Data.Odbc.OdbcException: ERROR [S1000] [TimberlineODBC][TimberlineODBCEngine ODBC Driver][DRM File Library]Drive or directory unavailable [IO-WIN 3]
M:\Accounting\Screen\OD.scr
ERROR [IM006] [Microsoft][ODBC Driver Manager] Driver's SQLSetConnectAttr failed
ERROR [01000] [Microsoft][ODBC Driver Manager] The driver doesn't support the version of ODBC behavior that the application requested (see SQLSetEnvAttr).
   at System.Data.Odbc.OdbcConnection.HandleError(OdbcHandle hrHandle, RetCode retcode)
   at System.Data.Odbc.OdbcConnectionHandle..ctor(OdbcConnection connection, OdbcConnectionString constr, OdbcEnvironmentHandle environmentHandle)
   at System.Data.Odbc.OdbcConnectionFactory.CreateConnection(DbConnectionOptions options, Object poolGroupProviderInfo, DbConnectionPool pool, DbConnection owningObject)
   at System.Data.ProviderBase.DbConnectionFactory.CreateNonPooledConnection(DbConnection owningConnection, DbConnectionPoolGroup poolGroup)
   at System.Data.ProviderBase.DbConnectionFactory.GetConnection(DbConnection owningConnection)
   at System.Data.ProviderBase.DbConnectionClosed.OpenConnection(DbConnection outerConnection, DbConnectionFactory connectionFactory)
   at System.Data.Odbc.OdbcConnection.Open()
   at Microsoft.SqlServer.Dts.Runtime.ManagedHelper.GetManagedConnection(String assemblyQualifiedName, String connStr, Object transaction)
   at Microsoft.SqlServer.Dts.Runtime.Wrapper.IDTSConnectionManager90.AcquireConnection(Object pTransaction)
   at Microsoft.SqlServer.Dts.Pipeline.DataReaderSourceAdapter.AcquireConnections(Object transaction)
   at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostAcquireConnections(IDTSManagedComponentWrapper90 wrapper, Object transaction)
Error at COMPANY_INFORMATION [DTS.Pipeline]: component "DataReader Source" (150) failed validation and returned error code 0x80131937.

Error at COMPANY_INFORMATION [DTS.Pipeline]: One or more component failed validation.

Error at COMPANY_INFORMATION: There were errors during task validation.

 (Microsoft.DataTransformationServices.VsIntegration)

------------------------------
BUTTONS:

OK
------------------------------
 
 
LVL 17

Expert Comment

by:HoggZilla
ID: 22979326
The SQL Agent user, can you log in to the server with this account and map the drives? But then what you are saying is that the login does not occur to map the drives when the job runs. Hmmm.
 
LVL 17

Expert Comment

by:HoggZilla
ID: 22981770
From what I read, connecting via Pervasive is the way to go. But you tried that? Can we walk through it?
 

Author Comment

by:hchan_resolve
ID: 22985065
Sure...

I created a system DSN using the Pervasive ODBC Client Interface driver. The driver points to the Timberline server. From there, I entered credentials to create a Pervasive engine DSN.

Here's the problem... Within the engine DSN, I have to create a new database pointing to the Timberline data. During part of the creation, I have to specify a dictionary location. Timberline, from what I can tell, does not use DDF files. There are 2 shares that contain Timberline data. The first share has the data I want the DB to point to. The second share does have DDF files, but everything combined is less than 2 MB. I verified that the DDF files on the second share do not contain the structure for the Timberline tables by binding to them when I initially created the database.
 
LVL 17

Expert Comment

by:HoggZilla
ID: 22990614
I know you have probably searched everywhere, but just to be sure ... did you read this one?
http://www.experts-exchange.com/Microsoft/Development/MS-SQL-Server/Q_21426283.html?sfQueryTermInfo=1+timberlin 
 
LVL 75

Expert Comment

by:Anthony Perkins
ID: 23000933
>>The Timberline ODBC driver has a constraint which will only allow the use of drive letters when pointing to its source.  The driver will not accept a UNC path. <<
Then you are out of luck, at least as far as using SQL Server Agent is concerned. SQL Server Agent cannot use mapped drives. Consider using some other scheduler, perhaps even Windows Scheduled Tasks.
 

Accepted Solution

by:
hchan_resolve earned 0 total points
ID: 23014766
The issue has been resolved. The trick was to map the drives and execute the SSIS package in one process. We were trying to do this in multiple processes, which would not work.
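
For anyone hitting the same thing: one way to keep the drive mapping and the package execution in a single process is one CmdExec job step (or the same two commands in a batch file run by Task Scheduler). A rough sketch, assuming a hypothetical server, share, account, and package path:

USE msdb;
GO
DECLARE @cmd nvarchar(4000);
-- net use and dtexec run in the same cmd.exe session, so the package can see the M: mapping.
SET @cmd = N'net use M: \\TimberlineServer\Accounting P@ssw0rd /user:DOMAIN\svc_timberline && dtexec /F "D:\SSIS\TimberlineImport.dtsx"';

EXEC dbo.sp_add_jobstep
     @job_name = N'Nightly Timberline Import',   -- hypothetical existing job
     @step_name = N'Map drive and run SSIS package',
     @subsystem = N'CmdExec',
     @command = @cmd,
     @on_success_action = 1;                      -- quit the job reporting success
GO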
