
Wget

Hi,

I came across fetch.com, which offers a pretty good solution, but the price is high and their offering is geared mostly toward enterprises. Is it possible to achieve what they offer with wget?

Here's the functionality I need:

I'm looking for a script where I can specify a list of domains (20-50K) and have all of the site content downloaded into one main zipped file. I have limited space (1 TB), so I want only the text for each site and want to exclude images, Flash, and other site files so the downloads are quick. The final output can be in any format.
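For reference, here is roughly what I picture for this part. It's an untested sketch; domains.txt (one domain per line), the sites/ folder, and the extension list are just placeholders:

#!/bin/bash
# Untested sketch: read domains.txt (one domain per line, e.g. unix.org),
# crawl each site a couple of levels deep, and reject binary content so
# that mostly HTML/text is kept.
while read -r domain; do
    wget --recursive --level=2 --no-parent \
         --reject jpg,jpeg,png,gif,bmp,ico,swf,flv,zip,gz,rar,pdf,mp3,avi,mov,exe \
         --directory-prefix="sites/$domain" \
         --wait=1 --tries=2 --timeout=15 \
         "http://$domain/"
done < domains.txt

I'm not sure whether a --reject list like this is enough to keep the downloads small, or whether an --accept list (e.g. html,htm,txt) would be the better way to limit it to text only.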

I'm also looking for a script that crawls the URLs of domains I specify for specific keywords; whenever there is a match, the matching URLs should be written to a central file.
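Again, just a rough, untested idea of what I mean (the KEYWORDS pattern, matches.txt, and domains.txt are placeholders, and the URL-extraction step would probably need adjusting to the exact wget log format):

#!/bin/bash
# Untested sketch: spider each domain to discover URLs, fetch each page,
# and append its URL to matches.txt if the page contains any keyword.
KEYWORDS='keyword1|keyword2'
while read -r domain; do
    wget --recursive --level=2 --spider --no-verbose "http://$domain/" 2>&1 \
      | grep -o 'http://[^ ]*' \
      | sort -u \
      | while read -r url; do
            if wget -q -O - --timeout=15 "$url" | grep -qiE "$KEYWORDS"; then
                echo "$url" >> matches.txt
            fi
        done
done < domains.txt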

Lastly, I have a file with 100k domains, and I want to append the most recent site title to each domain to create a directory. Is there a way to fetch this information from search engines? (A rough sketch of what I have in mind follows the example below.)

Example input:

unix.org
etc..

Example output:
unix.org            The UNIX System, UNIX System
etc..
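
The rough sketch I mentioned above, pulling the title straight from each site's homepage rather than from a search engine (untested; domains_100k.txt and directory.txt are placeholders):

#!/bin/bash
# Untested sketch: fetch only the homepage of each domain and extract the
# contents of its <title> tag into a tab-separated directory file.
while read -r domain; do
    title=$(wget -q -O - --timeout=10 "http://$domain/" \
            | tr -d '\n' \
            | grep -o -i '<title>[^<]*' \
            | head -1 \
            | sed 's/^<[^>]*>//')
    printf '%s\t%s\n' "$domain" "$title" >> directory.txt
done < domains_100k.txt

No idea whether this scales sensibly to 100k domains, or whether a search-engine API would be the better route.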



Thank you very much in advance.

Best,
faithless1
1 Solution
 
wls3 commented:
As far as I know, wget (on Windows) outputs a folder for each domain scanned. This makes your requirement for a single zip somewhat difficult without additional scripting.
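A rough sketch of that extra step (assuming the crawl wrote each domain into its own folder under a sites/ directory, as in your first snippet): once the crawl has finished, roll everything into one archive from the parent directory, e.g.

tar -czf all_sites.tar.gz sites/

or, if you specifically want a .zip and have the zip utility installed:

zip -r all_sites.zip sites/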
 
faithless1 (Author) commented:
Thanks, writing to a directory works as well.

Thanks again,
Tom