In SqlBulkCopy, how do I avoid inserting duplicate records into the table?

Hi,

I am using SqlBulkCopy to insert millions of records from table1 to table2 (SQL Server 2008, C#). I have two questions:

1. While inserting into table2, how can I avoid inserting duplicate records?
2. If insertion fails on, say, record 100, how can I continue inserting the remaining records? Each failed record should be saved to a log file along with the error message.
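For question 1, one common server-side option is to let SQL Server discard the duplicates itself: if the target table has a unique index created with `IGNORE_DUP_KEY = ON` (available in SQL Server 2008), duplicate rows arriving via SqlBulkCopy are dropped with a warning instead of failing the batch. This is only a sketch; the table name `table2` and the key columns are assumptions based on the code later in this thread.

```sql
-- Sketch (assumed names): rows whose (IP, DateTime, QueryTerm) already
-- exists are silently skipped instead of aborting the bulk insert.
CREATE UNIQUE INDEX UX_table2_IP_DateTime_QueryTerm
    ON table2 (IP, [DateTime], QueryTerm)
    WITH (IGNORE_DUP_KEY = ON);
```

Note that this also deduplicates rows within a single bulk-copied batch, not just against rows already in the table.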

thanks
Saroj13 asked:

Saroj13 (Author) commented:
Hi,

Here is my code for the bulk insert. I need a composite key (IP, DateTime, QueryTerm) that is unique for each row. Please change the code so that:
1. duplicate rows are not inserted;
2. if some records fail to insert, the process continues for the remaining records, and the failed records are saved to the log file.
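One pattern that addresses both requirements at once (a sketch only; the staging table `StagingTable2` and the column types are hypothetical, while `table2` and the column names come from the code below) is to bulk copy into an empty staging table and then move the rows server-side, skipping any row whose composite key already exists:

```sql
-- Hypothetical staging table with the same shape as table2.
CREATE TABLE StagingTable2 (
    IP         varchar(50)  NOT NULL,
    [DateTime] datetime     NOT NULL,
    QueryTerm  varchar(200) NOT NULL,
    UserRole   varchar(50)  NULL,
    Results    int          NULL
);

-- After SqlBulkCopy has loaded StagingTable2, insert only rows whose
-- composite key (IP, DateTime, QueryTerm) is not already in table2.
INSERT INTO table2 (IP, [DateTime], QueryTerm, UserRole, Results)
SELECT s.IP, s.[DateTime], s.QueryTerm, s.UserRole, s.Results
FROM StagingTable2 AS s
WHERE NOT EXISTS (
    SELECT 1 FROM table2 AS t
    WHERE t.IP = s.IP
      AND t.[DateTime] = s.[DateTime]
      AND t.QueryTerm = s.QueryTerm
);

-- Empty the staging table for the next batch.
TRUNCATE TABLE StagingTable2;
```

Because the deduplication happens in T-SQL, the C# side can keep using SqlBulkCopy unchanged, pointed at the staging table instead of table2.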

        // Requires: using System; using System.Data;
        //           using System.Data.SqlClient; using System.IO;
        // (Method header and opening try reconstructed; the snippet began mid-method.)
        bool InsertRecords()
        {
            try
            {
                SqlConnection conn = new SqlConnection(sDBConn);
                conn.Open();

                DataTable table = new DataTable("ParsedDataTable");
                DataRow row;
                int blockSize = 1000;

                table.Columns.Add("IP", typeof(String));
                table.Columns.Add("DateTime", typeof(DateTime));
                table.Columns.Add("QueryTerm", typeof(String));
                table.Columns.Add("UserRole", typeof(String));
                table.Columns.Add("Results", typeof(int));

                foreach (Record item in recordList)
                {
                    row = table.NewRow();
                    row["IP"] = item.IP;
                    row["DateTime"] = item.DateTime;
                    row["QueryTerm"] = item.sQueryTerm;
                    row["UserRole"] = item.UserRole;
                    row["Results"] = item.Results;
                    table.Rows.Add(row);

                    // Flush a full block; BulkInsert clears the table on success.
                    if (table.Rows.Count >= blockSize)
                    {
                        IsSuccess = BulkInsert(table, conn);
                    }//if
                }//foreach

                // Flush the last, partial block.
                IsSuccess = BulkInsert(table, conn);
                conn.Close();
                return IsSuccess;
            }//try
            catch (Exception e)
            {
                logEntry = "------------------Error # ----------------";
                logEntry = logEntry + e.Message + "\n";
                //add more error details before this line
                logEntry = logEntry + "\n\n";
                File.AppendAllText(logFilePath, logEntry);
                return false;
            }//catch
        }//end InsertRecords

        bool BulkInsert(DataTable source, SqlConnection cn)
        {
            try
            {
                using (var bulk = new SqlBulkCopy(cn))
                {
                    bulk.DestinationTableName = source.TableName;

                    // DataTable column name, DB table column name.
                    bulk.ColumnMappings.Add(new SqlBulkCopyColumnMapping("IP", "IP"));
                    bulk.ColumnMappings.Add(new SqlBulkCopyColumnMapping("DateTime", "DateTime"));
                    bulk.ColumnMappings.Add(new SqlBulkCopyColumnMapping("QueryTerm", "QueryTerm"));
                    // The DataTable column is "UserRole"; the original mapped a
                    // non-existent "UserRoles" source column, which fails at run
                    // time. Adjust the destination name if your table differs.
                    bulk.ColumnMappings.Add(new SqlBulkCopyColumnMapping("UserRole", "UserRole"));
                    bulk.ColumnMappings.Add(new SqlBulkCopyColumnMapping("Results", "Results"));
                    bulk.WriteToServer(source);
                }   // the using block disposes the SqlBulkCopy; no explicit Close() needed

                source.Rows.Clear();
                return true;
            }//try
            catch (Exception ex)
            {
                // Log the failure instead of discarding ex.Message.
                File.AppendAllText(logFilePath, ex.Message + "\n");
                return false;
            }//catch
        }//BulkInsert
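For requirement 2, note that SqlBulkCopy does not report which row in a batch caused a failure. A common workaround, sketched below, is to retry a failed batch one row at a time so the good rows still get in and only the offenders are logged. This is a sketch, not tested against a live server; `FallbackInsert` is a hypothetical helper name, and it assumes the same DataTable layout and `logFilePath` as the code above.

```csharp
// Sketch: call this with the batch DataTable when BulkInsert returns false.
// Each row is re-sent individually; rows that still fail are logged and skipped.
void FallbackInsert(DataTable batch, SqlConnection cn, string logFilePath)
{
    foreach (DataRow r in batch.Rows)
    {
        DataTable single = batch.Clone();   // same columns, no rows
        single.ImportRow(r);
        try
        {
            using (var bulk = new SqlBulkCopy(cn))
            {
                bulk.DestinationTableName = batch.TableName;
                // Column mappings omitted for brevity; add the same
                // mappings as in BulkInsert above.
                bulk.WriteToServer(single);
            }
        }
        catch (Exception ex)
        {
            // Log the failing row's key and the error, then keep going.
            File.AppendAllText(logFilePath,
                string.Format("Failed row: IP={0}, DateTime={1} - {2}{3}",
                    r["IP"], r["DateTime"], ex.Message, Environment.NewLine));
        }
    }
}
```

Row-by-row retry is slow, so it should only run on the (hopefully rare) batches that fail; the normal path stays block-sized.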
Question has a verified solution.
