Solved

urgent help required on hashes

Posted on 2003-03-01
Medium Priority
127 Views
Last Modified: 2010-03-05
hi all,


I have three hashes, described below, which already contain the data I have stored.

What I really want to do is combine these three hashes so that I get a single resulting hash, or output of the form: website address, followed by description, followed by frequency, followed by the words.

The three hashes have their key in common, i.e. the website address is the same in all three.

Is there any way I could combine these three hashes to get the desired output?

Should I use a hash of hashes? I am not really sure how to use them.

%urlHash = ();  # contains website address and description
%urlHash1 = (); # contains website address and some words
%site = ();     # contains website address and frequency

I want the resulting hash

%result = (); # which will contain the website address, followed by the description from %urlHash, followed by the frequency from %site, then the words from %urlHash1.

Then I want to sort this so that the website with the highest frequency comes first.
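To show what I mean, here is a rough sketch with made-up data of what I am hoping for (I am not sure whether a hash of hashes is the right way to write it):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Made-up data standing in for my real three hashes
my %urlHash  = ( 'http://www.a.com' => 'site A',  'http://www.b.com' => 'site B' );
my %urlHash1 = ( 'http://www.a.com' => 'foo bar', 'http://www.b.com' => 'baz qux' );
my %site     = ( 'http://www.a.com' => 3,         'http://www.b.com' => 7 );

# Combine the three hashes into one hash of hashes keyed by website address
my %result;
for my $url (keys %site) {
    $result{$url} = {
        description => $urlHash{$url},
        frequency   => $site{$url},
        words       => $urlHash1{$url},
    };
}

# Print the sites sorted by frequency, highest first
for my $url (sort { $result{$b}{frequency} <=> $result{$a}{frequency} } keys %result) {
    print "$url $result{$url}{description} $result{$url}{frequency} $result{$url}{words}\n";
}
```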

Is this possible? Please help - any help would be very much appreciated.


many thanks,
tie
Question by:ironlady
9 Comments
 

Author Comment

by:ironlady
ID: 8047667
Please help.

Thanks
 
LVL 22

Expert Comment

by:pjedmond
ID: 8048019
#!/usr/bin/perl
use strict;
use warnings;

my %site;
my %nos;

# Site is effectively hash{site}=hits
$site{a}=1;
$site{b}=1;
$site{c}=10;
$site{d}=41;
$site{e}=5;
$site{f}=11;
$site{g}=3;
$site{h}=1;
$site{i}=1;
$site{j}=8;
$site{k}=8;


# First collect the hit numbers relating to the pages.
# Effectively gets rid of duplicates:

foreach my $value (values %site) {
        $nos{$value} = $value;
}

# Now check each of the hit numbers that we have.
# The inner loop searches through all sites to see whether there is a site
# that has that hit number - if so then print it

foreach my $key (sort {$a <=> $b} keys %nos) {
        foreach my $key2 (keys %site) {
                if ($key == $site{$key2}) {   # numeric comparison of hit counts
                        print "$key2 website, hashusingkey2 description, $site{$key2} which is no of hits, another hash etc\n";
                }
        }
}


This will work quite happily. Of course, you may wish to improve the sort routines with a customised sort, or go for a better algorithm if speed is important.
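For example, one possible customised sort (a hypothetical sketch, using dummy data in the same shape as %site above) orders the sites by hit count directly, highest first, with no temporary %nos hash:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Dummy hit counts, as in the example above
my %site = ( a => 1, c => 10, d => 41, f => 11 );

# Sort the site keys by their hit counts, highest first
for my $key2 (sort { $site{$b} <=> $site{$a} } keys %site) {
    print "$key2 website, $site{$key2} hits\n";
}
```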
 

Author Comment

by:ironlady
ID: 8048101
hi pjedmond,

Thanks for the answer, but unfortunately, as I am new to Perl (or maybe because you use different variables), I am not following what is going on, and I cannot really work out how to change my variables so as to run your code.

I would be very grateful if you could help me further with this.

If possible, could you use %urlHash, %urlHash1 and %site as the hash names?


kind regards,

tie
 

Author Comment

by:ironlady
ID: 8048589
Please help.


Help would be really appreciated.


thanks,
tie
 
LVL 22

Accepted Solution

by:
pjedmond earned 160 total points
ID: 8048849
I have! And I've commented it. The only bit that is different is the dummy data that I've put in. I can't help any further with this, and neither can anyone else: this is a fully working script, complete with dummy data!


----------------------8X-------------------------------
#!/usr/bin/perl
use strict;
use warnings;

my %site;
my %urlHash;   # must already hold website => description pairs
my %urlHash1;  # must already hold website => words pairs
my %nos;

# Site is effectively hash{site}=hits
#
#--------------------Here starteth the dummy data!
$site{'http://www.yahoo.com'}=1;
$site{'http://www.hotmail.com'}=1;
$site{'http://www.aol.com'}=10;
$site{d}=41;     # etc - just change the letters for sites.
$site{e}=5;
$site{f}=11;
$site{g}=3;
$site{h}=1;
$site{i}=1;
$site{j}=8;
$site{k}=8;
#--------------------Here endeth the dummy data!

# First collect the hit numbers relating to the pages.
# Effectively gets rid of duplicates:
# %nos is a temporary hash that stores the valid numbers of hits
# found in the %site hash

foreach my $value (values %site) {
       $nos{$value} = $value;
}

# Now check each of the hit numbers that we have.
# The outer loop is sorted by number of hits;
# the inner loop finds the sites with that number of hits.
# In the inner loop, instead of printing to the screen,
# you could assign to a hash %result

foreach my $key (sort {$a <=> $b} keys %nos) {
       foreach my $key2 (keys %site) {
               if ($key == $site{$key2}) {   # numeric comparison of hit counts
                       print "$key2 website, $urlHash{$key2} description, $site{$key2} which is no of hits, $urlHash1{$key2}, another hash etc\n"; # Note - you will need to have data in %urlHash and %urlHash1 for this line to work!!!!!
               }
       }
}

# Changed the hash, another hash etc, but that's it.
#

----------------------8X-------------------------------
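As the comments in the script suggest, the print inside the inner loop could instead populate a %result hash of hashes. A sketch of that variation, with dummy entries standing in for your real %urlHash and %urlHash1 data:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Dummy data - in the real script these hashes are already populated
my %site     = ( 'http://www.aol.com' => 10,          'http://www.yahoo.com' => 1 );
my %urlHash  = ( 'http://www.aol.com' => 'AOL portal', 'http://www.yahoo.com' => 'Yahoo search' );
my %urlHash1 = ( 'http://www.aol.com' => 'some words', 'http://www.yahoo.com' => 'other words' );

# Build %result instead of printing inside the loop
my %result;
for my $key2 (keys %site) {
    $result{$key2} = {
        description => $urlHash{$key2},
        frequency   => $site{$key2},
        words       => $urlHash1{$key2},
    };
}

# %result can then be sorted and printed in one go, highest frequency first
for my $key2 (sort { $result{$b}{frequency} <=> $result{$a}{frequency} } keys %result) {
    print "$key2, $result{$key2}{description}, $result{$key2}{frequency}, $result{$key2}{words}\n";
}
```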

If, with the alterations and additions to the comments, you still have difficulty understanding it, then I suggest copying it completely and running it with extra print commands inserted. Try changing some of the demo data. Once you've understood the above, I suspect you will understand a little of how hashes work. I hope that helps, because what you are now asking me to do is to teach
 
LVL 22

Expert Comment

by:pjedmond
ID: 8048866
you perl (from scratch?) rather than help with the question.

Your best bet is to sit down and learn about hashes properly before trying this exercise, otherwise you get into the situation you are now, where you don't understand the information that is being passed to you. Work hard on the basics and the rest is easy:)
 

Author Comment

by:ironlady
ID: 8049100
Thanks a lot pjedmond,

I have figured it out and now it is working properly.

I think I just panicked because I have this as an assignment, but I sat down with your code and I got it.

And by the way, I am not too bad at hashes!

Thanks for your advice; I am really grateful for your help! I really appreciate it.

thanks,
tie
 

Author Comment

by:ironlady
ID: 8049103
thanks a lot
