Troubleshooting Question

Memory Issues When Repeatedly Querying Table with Image Column

Asked by Curto123
Topics: .NET Programming, Microsoft SQL Server 2008
I have C# code that I am using to extract documents from a SQL Server table and save them to the file system. Under normal circumstances this works fine. It is not a production application, just something I use occasionally to convert data from a legacy system to my product.

I've run into a legacy system that has roughly 150,000 rows in its document table. What I see is SQL Server's memory consumption inching up until it is using 93%-98% of the available memory, at which point an error is thrown that SQL Server is not responding ("No process is on the other end of the pipe"). Meanwhile, the actual conversion executable is chewing up 30 or so megabytes, then dropping to 15, climbing back to 40, and so on. So it is not the actual BinaryWriter action that is consuming resources; it is the selection of these records.
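For what it's worth, SQL Server's own view of its memory can be checked with a DMV rather than Task Manager. A minimal sketch, assuming the login has VIEW SERVER STATE (sys.dm_os_process_memory shipped with SQL Server 2008); the EntLib calls mirror the ones in my code below:

// Rough monitoring sketch: read SQL Server's own view of its memory
// (requires VIEW SERVER STATE permission on the instance).
Database db = DatabaseFactory.CreateDatabase("DACMSConnectionString");
DataSet dsMem = db.ExecuteDataSet(db.GetSqlStringCommand(
    @"Select physical_memory_in_use_kb, memory_utilization_percentage
        From sys.dm_os_process_memory"));
Console.WriteLine("In use: {0} KB ({1}%)",
    dsMem.Tables[0].Rows[0]["physical_memory_in_use_kb"],
    dsMem.Tables[0].Rows[0]["memory_utilization_percentage"]);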

Each set of document records is keyed on a Case Number and the Document Name. I am selecting the Case Numbers with a GROUP BY, and for each Case Number I am using a DataTableReader to extract the documents.

Is there another way to do this? Or are there any hacks to get SQL Server to free up memory? I understand that allowing SQL Server to use all the available memory is generally a good thing, and that it will release memory when the OS requests it, but if it chews it up to the point where there is none available even for itself, well....
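One thing I haven't tried yet is simply capping how much memory SQL Server is allowed to take, so the box keeps some headroom. A sketch of what I have in mind (the 8192 MB figure is only an example, and I'm assuming the login has sysadmin rights):

// Sketch only: cap the instance's memory so it leaves headroom for the OS
// and this process. Assumes sysadmin; 8192 MB is an arbitrary example.
Database db = DatabaseFactory.CreateDatabase("DACMSConnectionString");
db.ExecuteNonQuery(db.GetSqlStringCommand(
    @"Exec sp_configure 'show advanced options', 1; Reconfigure;
      Exec sp_configure 'max server memory (MB)', 8192; Reconfigure;"));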

I am running this on SQL Server 2008 R2 (64-bit) on Windows 7 with 12 GB of RAM.

Any help is most appreciated.  Code is attached (hopefully it is readable).
public static void ProcessDocuments(ref ProgressBar prg, ref Label lbl, ref Label lblGeneralProgress)
        {
            prg.Value = 0;
            DateTime startTime = DateTime.Now;
            DateTime endTime = DateTime.Now;
            TimeSpan duration = endTime - startTime;

            Database db = DatabaseFactory.CreateDatabase("DACMSConnectionString");
            Database pbkDb = DatabaseFactory.CreateDatabase("PbKConnectionString");

            // Start clean: empty the destination document tables.
            DbCommand dbCommand = pbkDb.GetSqlStringCommand(@"Delete tblCsDocument");
            pbkDb.ExecuteNonQuery(dbCommand);
            dbCommand = pbkDb.GetSqlStringCommand(@"Delete tblRfDocument");
            pbkDb.ExecuteNonQuery(dbCommand);

            dbCommand = db.GetSqlStringCommand(@"Select doc.CaseNum, cxref.ReferralId, cxref.FileNumber
                                                              From DACaseDocument doc
                                                                Join TEMP_DACaseCaseXRef cxref on cxref.CaseNum = doc.CaseNum
                                                              --Where cxref.ReferralId <= 20000  
                                                              --Where cxref.ReferralId > 20000
                                                              Group By doc.CaseNum, cxref.ReferralId, cxref.FileNumber");
            DataSet dsDocs = db.ExecuteDataSet(dbCommand);
            prg.Maximum = dsDocs.Tables[0].Rows.Count;
            foreach (DataRow drDoc in dsDocs.Tables[0].Rows)
            {
                // Every 1,000 cases, refresh the throughput / ETA readout.
                if ((prg.Value % 1000) == 0)
                {
                    endTime = DateTime.Now;
                    duration = endTime - startTime;
                    // Divide by 1000.0 to avoid integer division truncating the estimate.
                    double per1000Time = duration.TotalSeconds * ((prg.Maximum - prg.Value) / 1000.0);
                    string estCompletionTime = Convert.ToString(endTime.AddSeconds(per1000Time));
                    lblGeneralProgress.Text = "Document Load - " + duration.ToString() + " per 1000 (" + prg.Value.ToString() + " - Est. Completion " + estCompletionTime + ")";
                    Application.DoEvents();
                    startTime = DateTime.Now;
                    // This was no help...
                    //dbCommand = db.GetSqlStringCommand(@"DBCC FREESYSTEMCACHE ('All') WITH MARK_IN_USE_FOR_REMOVAL");
                    //db.ExecuteNonQuery(dbCommand);
                }

                lbl.Text = "Processing " + prg.Value.ToString() + " of " + prg.Maximum.ToString() + " - " + drDoc["CaseNum"].ToString();
                Application.DoEvents();

                if (!String.IsNullOrEmpty(drDoc["FileNumber"].ToString().Trim()))
                    WriteDocuments(drDoc["CaseNum"].ToString(), drDoc["FileNumber"].ToString().Trim(), drDoc["ReferralId"].ToString().Trim(), "C");
                else
                    WriteDocuments(drDoc["CaseNum"].ToString(), drDoc["FileNumber"].ToString().Trim(), drDoc["ReferralId"].ToString().Trim(), "R");

                prg.Value++;
            }
        }

        public static void WriteDocuments(string caseNum, string fileNumber, string referralId, string caseOrReferralFlag)
        {
            Database db = DatabaseFactory.CreateDatabase("DACMSConnectionString");
            DbCommand dbCommand = db.GetSqlStringCommand(@"Select DocumentText, Document, InsertDtTm From DACaseDocument Where CaseNum = '" + caseNum + "'");
            DataTable dtDocuments = db.ExecuteDataSet(dbCommand).Tables[0];

            if (dtDocuments.Rows.Count > 0)
            {
                string filePath;

                if (caseOrReferralFlag == "C")
                    filePath = ConfigurationManager.AppSettings["CaseDocDir"] + fileNumber + "\\";
                else
                    filePath = ConfigurationManager.AppSettings["RefDocDir"] + referralId + "\\";

                DirectoryInfo di = new DirectoryInfo(filePath);
                //if (di.Exists)
                //{
                //    foreach (FileInfo fi in di.GetFiles())
                //    {
                //        fi.Delete();
                //    }
                //    di.Delete();
                //}
                di.Create();

                long len = 0;
                Byte[] buffer = new Byte[0];
                using (DataTableReader reader = new DataTableReader(dtDocuments))
                {
                    while (reader.Read())
                    {
                        try
                        {
                            len = reader.GetBytes(0, 0, null, 0, 0);
                            buffer = new Byte[len];
                            reader.GetBytes(0, 0, buffer, 0, (int)len);

                            //len = reader.GetBytes(10, 0, null, 0, 0);
                            //buffer = new Byte[len];
                            //reader.GetBytes(10, 0, buffer, 0, (int)len);
                        }
                        catch
                        {
                            // A NULL or unreadable image column lands here;
                            // treat it as an empty document and skip it below.
                            buffer = new Byte[0];
                        }

                        if (buffer.Length > 0)
                        {
                            DateTime docCreateDate = Convert.ToDateTime(reader["InsertDtTm"]);
                            string dateTimeSuffix = docCreateDate.ToString(" dddd, MMM dd yyyy") + " at " + docCreateDate.ToString("hhmmss tt") + ".doc";
                            string fileName = RemoveUnsafeChars(reader["Document"].ToString().Trim()) + dateTimeSuffix;
                            using (BinaryWriter bw = new BinaryWriter(File.Open(filePath + fileName, FileMode.Create)))
                            {
                                bw.Write(buffer);
                            }

                            if (caseOrReferralFlag == "C")
                                TblCsDocument.Insert(fileNumber, fileName, clsGeneral.FormatDateTimeToStore(reader["InsertDtTm"]),
                                                            clsGeneral.FormatDateTimeToStore(reader["InsertDtTm"]), null);
                            else
                                TblRfDocument.Insert(Convert.ToDecimal(referralId), fileName, clsGeneral.FormatDateTimeToStore(reader["InsertDtTm"]),
                                                            clsGeneral.FormatDateTimeToStore(reader["InsertDtTm"]), null);

                        }
                    }
                    
                }
                
                buffer = null;
            }
            dtDocuments.Dispose();
        }
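For what it's worth, the alternative I keep coming back to (untested, just a sketch) is to cut the DataSet/DataTable buffering out of the inner loop entirely and stream each document straight to disk with a SqlDataReader opened with CommandBehavior.SequentialAccess. The table and column names below match my code; I've moved the image column to the end of the Select because SequentialAccess requires reading columns in ordinal order, and I'm assuming "DACMSConnectionString" is also defined under <connectionStrings>, as DatabaseFactory.CreateDatabase implies:

using System;
using System.Configuration;
using System.Data;
using System.Data.SqlClient;
using System.IO;

// Sketch only: stream one case's documents to disk without buffering whole
// DataTables, so only one 8 KB chunk of image data is in memory at a time.
public static void StreamDocuments(string caseNum, string filePath)
{
    string connStr = ConfigurationManager
        .ConnectionStrings["DACMSConnectionString"].ConnectionString;
    using (SqlConnection conn = new SqlConnection(connStr))
    using (SqlCommand cmd = new SqlCommand(
        @"Select Document, InsertDtTm, DocumentText
            From DACaseDocument Where CaseNum = @caseNum", conn))
    {
        cmd.Parameters.AddWithValue("@caseNum", caseNum);
        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
        {
            byte[] chunk = new byte[8192];
            while (reader.Read())
            {
                // Read the small columns first; SequentialAccess only allows
                // moving forward through the ordinals.
                string docName = reader.GetString(0).Trim();    // run through RemoveUnsafeChars in real code
                DateTime created = reader.GetDateTime(1);
                string fileName = docName + created.ToString(" dddd, MMM dd yyyy 'at' hhmmss tt") + ".doc";

                if (reader.IsDBNull(2))
                    continue;   // no image data for this row

                using (FileStream fs = File.Create(Path.Combine(filePath, fileName)))
                {
                    long offset = 0;
                    long read;
                    // Pull the blob through GetBytes in fixed-size chunks.
                    while ((read = reader.GetBytes(2, offset, chunk, 0, chunk.Length)) > 0)
                    {
                        fs.Write(chunk, 0, (int)read);
                        offset += read;
                    }
                }
            }
        }
    }
}

The idea is that the client's working set stays at one chunk per document regardless of document size, and SQL Server never has to materialize a whole case's documents for a DataSet fill.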