Solved

Questions about designing a data mining website and crawler

Posted on 2008-10-15
767 Views
Last Modified: 2013-12-09
I have a few questions about a project I am working on. Being fairly new to the whole idea, I decided to read up and found a great deal of information. The project has been designed and I have started writing the code for it, but there are some issues that keep coming up.

Firstly, the bulk of the code comes in the form of class libraries that contain AI, rule processing and inference, database access, compression, etc. The crawler is also a class library that references the other libraries. The crawler will most likely be started from a console application or a WinForms app so that it runs outside of the ASP.NET session (any thoughts on running it from the ASP.NET website itself?).
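To give an idea of what I mean by the console host, here is a rough, untested sketch. CrawlerEngine is just a placeholder name for the entry point class in my crawler library, not something that exists yet:

using System;

namespace CrawlerHost
{
    class Program
    {
        static void Main(string[] args)
        {
            // CrawlerEngine is a placeholder for the crawler library's entry point.
            CrawlerEngine engine = new CrawlerEngine();
            engine.Start();   // begin waiting for jobs

            Console.WriteLine("Crawler running. Press Enter to stop.");
            Console.ReadLine();

            engine.Stop();    // finish current work and shut down cleanly
        }
    }
}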

So the first question is:
How can I control, manage, and communicate with the web crawler while it is running, without using remoting or a TCP client/server? Would I have to use a web service?
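For context, one approach I have been considering instead of remoting or a raw TCP listener is to let the site and the crawler talk only through a shared SQL Server table: the website inserts a job row and the crawler polls for pending rows. Here is a rough sketch of the website side (the CrawlerJobs table and its columns are made up for illustration):

using System.Data.SqlClient;

public static class CrawlerQueue
{
    // Queues a crawl job by inserting a row that the crawler will pick up later.
    // The table and column names (CrawlerJobs, SeedUrl, Status, CreatedOn) are placeholders.
    public static void EnqueueJob(string connectionString, string seedUrl)
    {
        const string sql =
            "INSERT INTO CrawlerJobs (SeedUrl, Status, CreatedOn) " +
            "VALUES (@url, 'Pending', GETUTCDATE())";

        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@url", seedUrl);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}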

Second question is:
Is there a better approach to this design?

As it stands now, I would like the crawler to sit waiting for jobs to come in and then store the information in the database. I do not want the website to have to reference the libraries, but it still needs to be able to access the data from the crawler and manage it as well.
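Continuing the shared-table idea, the crawler side would be a simple polling loop along these lines (again untested, with the same made-up table and a placeholder ProcessJob method that would hand off to the crawler library):

using System;
using System.Data.SqlClient;
using System.Threading;

public class JobPoller
{
    private readonly string _connectionString;
    private volatile bool _running = true;

    public JobPoller(string connectionString)
    {
        _connectionString = connectionString;
    }

    public void Run()
    {
        while (_running)
        {
            int? jobId = null;
            string seedUrl = null;

            // Look for the oldest pending job (CrawlerJobs is the same placeholder table).
            using (SqlConnection conn = new SqlConnection(_connectionString))
            using (SqlCommand cmd = new SqlCommand(
                "SELECT TOP 1 JobId, SeedUrl FROM CrawlerJobs " +
                "WHERE Status = 'Pending' ORDER BY CreatedOn", conn))
            {
                conn.Open();
                using (SqlDataReader reader = cmd.ExecuteReader())
                {
                    if (reader.Read())
                    {
                        jobId = reader.GetInt32(0);
                        seedUrl = reader.GetString(1);
                    }
                }
            }

            if (jobId.HasValue)
            {
                ProcessJob(jobId.Value, seedUrl);   // crawl, store results, mark the row done
            }
            else
            {
                Thread.Sleep(TimeSpan.FromSeconds(10));   // nothing queued; wait and poll again
            }
        }
    }

    public void Stop()
    {
        _running = false;
    }

    private void ProcessJob(int jobId, string seedUrl)
    {
        // Placeholder: call into the crawler class library here and
        // update the job row's Status when finished.
    }
}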

My main concern is that if I use the scheduler I wrote to schedule the jobs and start the crawler, the crawler will close out when the session from the site has ended. I am somewhat lost on how to continue with this part.
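One option I am weighing to get around the lifetime problem is to host the crawler in a Windows Service instead of launching it from the site, so the process keeps running regardless of what happens to the ASP.NET session. A bare-bones, untested skeleton of what that host might look like (CrawlerEngine is the same placeholder as above):

using System.ServiceProcess;

public class CrawlerService : ServiceBase
{
    private CrawlerEngine _engine;   // placeholder for the crawler library's entry point

    public CrawlerService()
    {
        ServiceName = "DataMiningCrawler";
    }

    protected override void OnStart(string[] args)
    {
        _engine = new CrawlerEngine();
        _engine.Start();   // begin polling for jobs
    }

    protected override void OnStop()
    {
        _engine.Stop();    // finish current work and shut down
    }

    static void Main()
    {
        ServiceBase.Run(new CrawlerService());
    }
}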

I appreciate any help I can get, and if I am being too vague just let me know and I will try to explain it in more detail and/or provide code snippets. Just as a side note, I am running SQL Server 2008, Windows Server 2008 (IIS 7), and .NET 3.5 (using Visual Studio 2008 to write it).

Thanks
Joe Wood
Question by: JoeDW
2 Comments
 
Accepted Solution

by: wickedpassion (LVL 5) earned 500 total points
ID: 22739193
 
Author Comment

by: JoeDW (LVL 1)
ID: 22749598
Wow, I really like the first link. It had tons of good information and I am sure it will keep me busy for a while. Thanks!!
