SQL Server data import problem

Bill Doherty
Hello

I have run into an issue I cannot figure out. I am running a SQL Server 2014 Express instance that connects to and imports data from a SQL Server 2008 R2 instance. Both servers are hosted on the same virtual machine host, using the same data storage, but on individual VMs. When I run the import from the Express machine, the script starts well but then hangs because some kind of temporary disk usage fills the drive with over 40 GB. On reboot the disk space returns to normal. The script pulls only about 6,000 records from the other database, and when it completes the database file is less than 50 MB.

We copied the data from the remote server onto the 2014 Express server, changed the script to look locally instead of remotely, and the import ran as it should.

My questions are: why and how does this failure happen? Why would a large temp file of some type be written when importing between VMs? How do I overcome this?

I will provide any other information you may need and thank you for your help.

Raja Jegan R, SQL Server DBA & Architect, EE Solution Guide
Awarded 2009
Distinguished Expert 2018

Commented:
>> The script pulls only about 6,000 records from the other database, and when it completes the database file is less than 50 MB.

Before going to other pointers, I hope you are aware of the 10 GB data file limit of SQL Server 2014 Express edition. This limit includes your current data across all tables plus index space usage.
Kindly check the current free space in your database using either sp_spaceused or the shrink file option, to see the exact space currently used in your database.
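As a minimal sketch of that check (the database and table names below are placeholders, not from your environment):

```sql
-- Database-level totals: reserved, data, index, and unused space.
USE MyDatabase;
EXEC sp_spaceused;

-- Per-table breakdown for the table being loaded (hypothetical name):
EXEC sp_spaceused N'dbo.MyImportTable';
```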

If you have sufficient data space available, then check the items below:
1. Does the table into which the 6,000 records are pulled have any varchar(max), nvarchar(max), varbinary(max), or other BLOB data types? If so, 6,000 records can easily fill the remaining space.
2. Does the table you are loading already have a lot of indexes? If so, the new data being added will increase the size of those indexes as well.
3. In your package, have you configured the BLOBTempStoragePath and BufferTempStoragePath options? These options improve data-load performance by creating temp files, which can explain why the temp file grows so large.
4. What is your current page file size? During heavy operations like this, the page file can grow from its minimal value to a much larger one, filling the drive faster. That would also explain why you get the free space back after a server reboot ("On reboot the disk space returns to normal").
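Items 1 and 2 can be checked with catalog queries like the following sketch (dbo.MyImportTable is a placeholder for the table being loaded):

```sql
-- List MAX-length or legacy BLOB columns on the target table.
SELECT c.name AS column_name, t.name AS type_name, c.max_length
FROM sys.columns AS c
JOIN sys.types   AS t ON t.user_type_id = c.user_type_id
WHERE c.object_id = OBJECT_ID(N'dbo.MyImportTable')
  AND (c.max_length = -1 OR t.name IN ('text', 'ntext', 'image'));

-- Count the indexes that every inserted row must also update.
SELECT COUNT(*) AS index_count
FROM sys.indexes
WHERE object_id = OBJECT_ID(N'dbo.MyImportTable')
  AND index_id > 0;   -- skip the heap entry
```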
Bill Doherty, Network Administrator

Author

Commented:
Thank you Raja Jegan,

The page file is set to about 16 GB, and the file size used by the DB is 330 MB. As for the other checks, I am still trying to figure them out.

However, I am still trying to wrap my head around why, if both databases (the target and the source) are on the same SQL Server, everything works as it should in less than thirty seconds. It is when the source sits on its own server and the import is pulled over the connection that the problem arises. If left alone, it seems to finish after about two hours, and the temp files do not get cleared. Other than pointing to another server for the source, what is different that causes this problem?

I will look into the other suggestions and get back to you, again thank you for your reply.
Raja Jegan R, SQL Server DBA & Architect, EE Solution Guide

Commented:
>> It is when the source sits on its own server and the import is pulled over the connection that the problem arises.

When an SSIS package executes, it tries to bring all of the required data into its cache for better performance.
So, if the source is on another server, the server where the package is executed will pull all the data over your network bandwidth and then use your BLOBTempStoragePath/BufferTempStoragePath files as required.
Kindly try importing your source table directly into a temp table on the destination server, and check whether it is the SELECT or the INSERT operation that takes longer, so that we can focus on the root cause of the issue.
To identify the above, you can run a Profiler trace while running that test.
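One way to split the test, as a sketch: stage the remote rows first, then time the local insert separately. The linked server, database, and table names below are placeholders, not from your setup:

```sql
-- Pull the remote rows into a local staging table, assuming a linked
-- server named SRC2008R2 pointing at the 2008 R2 instance.
SELECT *
INTO dbo.Staging_Import                        -- placeholder staging table
FROM SRC2008R2.SourceDb.dbo.SourceTable;       -- placeholder four-part name

-- Then run and time the local insert on its own:
-- INSERT INTO dbo.MyImportTable (...)
-- SELECT ... FROM dbo.Staging_Import;
```

If the SELECT INTO is slow but the local INSERT is fast, the bottleneck is pulling data across the connection rather than the load itself.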
Bill Doherty, Network Administrator

Author

Commented:
Raja,

Thank you, I will try this, as we have set up a copy of the database for testing. Unfortunately I will not get to it until later next week due to the holidays. Thank you for your continued assistance.
Raja Jegan R, SQL Server DBA & Architect, EE Solution Guide

Commented:
Sure, I'll wait for your update.
Happy Holidays!
Bill Doherty, Network Administrator

Author

Commented:
Just to let you know, I am awaiting support from the vendor to undo tests they made with scripts so I can test the disk space theory.
Raja Jegan R, SQL Server DBA & Architect, EE Solution Guide

Commented:
Sure, keep us posted..
Bill Doherty, Network Administrator

Author

Commented:
Hello,

This problem still persists. I have not been successful with some of the testing. Increasing the disk space allowed the script to finish; however, it did not do what it was supposed to (import new data), nor did it release the temp file and restore the 50 GB of disk space it used. And, as always, rebooting the server restored the disk space.

I am wrestling with vendor support, who think it's a server issue, whereas I think it's a script issue. In either case I am still researching possibilities.
Raja Jegan R, SQL Server DBA & Architect, EE Solution Guide

Commented:
>> And as always rebooting the server restored the disk space.

Kindly let me know which file is growing in size and getting cleared during the server reboot:
1. Paging file - if it is the paging file, then I would recommend keeping it on a dedicated drive, sized at around 1.5 times the RAM available in your server.
2. tempdb database files - if it is tempdb, then it should be moved to a dedicated drive, and ensure your tempdb data files are split into 4 or 8 files depending on the number of server cores available.
3. Any other log files - if it is any other file, then I would need more info to suggest better.
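To tell which of these is the culprit, you could check the current tempdb file sizes and page file headroom with DMV queries along these lines (a sketch, assuming sysadmin-level access on the Express instance):

```sql
-- Where the tempdb files live and how large they currently are.
SELECT name, physical_name, size * 8 / 1024 AS size_mb
FROM tempdb.sys.database_files;

-- Current page file total and free space, in MB.
SELECT total_page_file_kb / 1024     AS page_file_mb,
       available_page_file_kb / 1024 AS page_file_free_mb
FROM sys.dm_os_sys_memory;
```

Running these while the import is hung, and comparing against a reboot baseline, should show which file is eating the 40-50 GB.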

>> I have not been successful with some of the testing.

Kindly let me know whether you observed any difference with any of the approaches during testing, and if so please share the details so we can identify your issue better.
