php sort for dups in two files

OK, here's the problem: I have two files, block.txt and edit.txt.
They contain domain entries, one per line (.aol.com).
I would like to merge the two files together but not have any dups. Both files have different numbers of entries (2400 and 420). Here's what I have so far:

set_time_limit(50);

$MAX=99999;
$free=0;
$empty="empty\n";

$array = @file($type);   // $type should hold the filename of the first list
$total = count($array);

$free = $total;

// Time to open the entry file

$filename = "Edit2.txt";
$fp = fopen($filename, "r") or die("Cannot open file");

// Read the file into an array

if ($fp) {  $array = explode("\n", fread($fp, filesize($filename))); }

// Close the file

fclose($fp);

// Echo the second entry in the file

echo "second entry in new file: $array[1]<br>\n";


// Count the number of entries in the file

$entry=count($array);


// Echo the number of entries.

echo "entry count total: $entry<br>\n";



// Open the blocking file

$fp = fopen("block", "r") or die("Cannot open the block file");
if ($fp) { $arrayb = explode("\n", fread($fp, filesize("block"))); }
fclose($fp);

echo "first entry in block file: $arrayb[0]<br>\n";
$blocking=count($arrayb);
echo "total blocking: $blocking\n<br>";


$outputfile = "";

// Keep an edit entry only if it matches none of the blocking entries.
for ($k = 0; $k < $entry; $k++) {
      $t = 1;
      for ($d = 0; $d < $blocking; $d++) {
            if (trim($array[$k]) == trim($arrayb[$d])) {
                  $t = 0;   // found in the block list, so skip it
                  break;
            }
      }
      if ($t == 1) {
            echo "file ok $k add: $array[$k]<br>";
            $outputfile .= "$array[$k]\n";
      }
}

echo $outputfile;

I know it's not right. Right now I only need it to print to a web page; I can change it to write files later.

TIA
jscart (LVL 1) Asked:
snoyes_jw Commented:
You could read everything from both files into an array, then use array_unique() to delete all the duplicates, and write the results back out to a file.
shmert Commented:
Just an implementation of snoyes' post:

$array = array_unique(array_merge(file('block.txt'), file('edit.txt')));

// Note: the line breaks will still be there for each element, so you may want to iterate through and trim() each entry.
// This should be a very speedy way to do it, though.

foreach ($array as $key => $value) {
    $array[$key] = rtrim($value);
}
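To go from the echoed page back to files, as the question mentions, the same merge-and-dedup result can be written out in one call. A minimal sketch wrapping the approach above in a function; `merge_unique` and the output name `merged.txt` are placeholders, not from the original post:

```php
<?php
// Hypothetical helper: merge two domain lists, drop duplicates
// (file() keeps the trailing newline on each element, so rtrim
// first), and write the result back out one entry per line.
function merge_unique(string $a, string $b, string $out): void {
    $lines = array_map('rtrim', array_merge(file($a), file($b)));
    $lines = array_unique($lines);
    file_put_contents($out, implode("\n", $lines) . "\n");
}
```

array_unique() keeps the first occurrence of each entry, so the merged file preserves the original order: block entries first, then any edit entries not already present.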
wide_awake Commented:
If you're running in a unix environment, you can use the "sort" command to do it for you.

$array = explode("\n", `cat block.txt > tmp.txt; cat edit.txt >> tmp.txt; sort -u tmp.txt`);

-mark

wide_awake Commented:
even easier:

$array = explode("\n", `sort -u block.txt edit.txt`);
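One thing to watch with the backtick version: the command output ends in a newline, so explode() leaves an empty string as the last array element, which would throw off a count(). A sketch of filtering that out, using the filenames from the question:

```php
<?php
// `sort -u` output ends with "\n", so explode() produces a
// trailing empty element; drop empty strings before counting.
$array = array_filter(explode("\n", `sort -u block.txt edit.txt`), 'strlen');
```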
jscart (Author) Commented:
Thanks for the suggestions. I'll start trying them and then score.