trevor1940
asked on
C#: How do you read a text file into an array, removing lines that are already in a second log file?
Hi,
I have a list of URLs in a file that I need to process, and a list of finished URLs in another log file.
After I've finished with a URL, I write/append it to the log file.
How do I check that the current URL isn't in the log file, i.e. hasn't already been processed?
using System;
using System.IO;
using System.Linq;

public class Program
{
    public static void Main()
    {
        string LogFile = System.Environment.CurrentDirectory + "\\Log.txt";
        var URLFromLog = File.ReadAllLines(LogFile).Select(l => l.Trim());
        // This file may contain duplicates and changes each time the program runs
        var URLFromURLs = File.ReadAllLines(@"Path\To\URLs.txt").Select(l => l.Trim());

        // Only add to URLs if unique and not already in the log file.
        // This doesn't look right but I'm unsure what it should be.
        var URLs = URLFromLog.Distinct().Concat(URLFromURLs).ToList();

        foreach (string sURL in URLs)
        {
            var URL = sURL;
            // Do stuff with URL
            File.AppendAllText(LogFile, sURL + Environment.NewLine);
        }
    }
}
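One common way to do this (a sketch under the question's assumptions, not a tested drop-in; `Path\To\URLs.txt` is the placeholder path from the question) is to load the log into a `HashSet<string>` and keep only the pending URLs that are not in it, instead of concatenating the two lists. The filtering is pulled into a separate `PendingUrls` method so it can be reasoned about independently of the file I/O:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

public class Program
{
    // Returns the URLs still to be processed: trimmed, de-duplicated,
    // and with anything already present in the log removed.
    public static List<string> PendingUrls(IEnumerable<string> urls,
                                           IEnumerable<string> logged)
    {
        // HashSet gives O(1) membership checks against the log.
        var processed = new HashSet<string>(
            logged.Select(l => l.Trim()), StringComparer.OrdinalIgnoreCase);

        return urls.Select(u => u.Trim())
                   .Where(u => u.Length > 0)           // skip blank lines
                   .Distinct(StringComparer.OrdinalIgnoreCase)
                   .Where(u => !processed.Contains(u)) // skip already-logged URLs
                   .ToList();
    }

    public static void Main()
    {
        string logFile = Path.Combine(Environment.CurrentDirectory, "Log.txt");

        // The log may not exist on the very first run.
        var logged = File.Exists(logFile)
            ? File.ReadAllLines(logFile)
            : Array.Empty<string>();

        var urls = File.Exists(@"Path\To\URLs.txt")    // placeholder path
            ? File.ReadAllLines(@"Path\To\URLs.txt")
            : Array.Empty<string>();

        foreach (string url in PendingUrls(urls, logged))
        {
            // Do stuff with url ...
            File.AppendAllText(logFile, url + Environment.NewLine);
        }
    }
}
```

Whether the `StringComparer.OrdinalIgnoreCase` comparisons are appropriate depends on the URLs: host names are case-insensitive but paths may not be, so switch to the default comparer if exact matches are required.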