Sorry for the novel; I'm a newbie, may have bitten off more than I can chew, and wanted to capture everything...
I have an ambitious client with contacts in the marketplace who operate websites selling variants of a niche product - circa 100 websites, each with its own product database. Some of these sites run MySQL databases, most are MS SQL, but there are also sites using Oracle and more niche database technologies.
I have been tasked with building a 'price comparison' style website where the visitor can input a search term and receive a combined set of distinct results drawn from the 100 other sites.
The schema and technology behind each of the [source] databases may differ, but the price comparison site will only show three fields: "Price", "Product Title" and "Product ID". In addition, the results page would need a fourth field, "Site ID", identifying the site the price was sourced from. Product Title is the only field searched.
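To show the shape I have in mind (this is only a sketch; the class, interface and property names are all my own invention, not anything that exists yet):

```php
<?php
// Hypothetical normalized record for the comparison site.
// Only the four fields themselves are fixed requirements;
// the names here are illustrative. Assumes PHP 8+.
class PriceResult
{
    public function __construct(
        public string $siteId,       // which source site the price came from
        public string $productId,    // the source site's own product ID
        public string $productTitle, // the only field that is searched
        public float  $price
    ) {}
}

// Each source site would supply an adapter like this, however its
// own schema/technology stores the data underneath.
interface ProductSource
{
    /** @return PriceResult[] matches for a title search term */
    public function search(string $term): array;
}
```

The idea being that whatever sits behind each site, it only has to map its own schema onto that one record shape.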
The price comparison will be done with the co-operation and permission of the source sites offering the products; however, any development on those sites to accommodate this would be at our expense, and we're on a budget.
I think that, given the number of databases (other websites) to query, page scraping would be messy. I also get the feeling that some sites may be unwilling to provide views.
I'm looking for an answer that tells me the best method to query the third-party data and consolidate it, as quickly as possible, into a single set of search results on the price comparison site.
Minimum outlay, as simple as possible, and something that works across different datasource types. Ideally, once built, it could be re-used/distributed to other datasources/websites using that technology, so the portfolio of websites being queried can be expanded to include even more sites and variants of the niche product. Datafeeds are not an option: for every search performed, the resulting data must be live.
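To make the question concrete, this is roughly what I picture on the host site, assuming each co-operating site exposes something PDO can reach (PDO has drivers for MySQL, SQL Server, Oracle, etc.). Every DSN, credential, table and column name below is a placeholder, not a real site:

```php
<?php
// Sketch only: fan one live search term out across several PDO
// connections and merge the rows into a single result set.
// $sites maps a site ID to that site's connection details and
// its own SQL for the title search - all invented placeholders.
function searchAllSites(array $sites, string $term): array
{
    $results = [];
    foreach ($sites as $siteId => $cfg) {
        try {
            $pdo  = new PDO($cfg['dsn'], $cfg['user'], $cfg['pass']);
            // e.g. "SELECT id, title, price FROM products WHERE title LIKE ?"
            $stmt = $pdo->prepare($cfg['sql']);
            $stmt->execute(['%' . $term . '%']);
            foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
                $results[] = [
                    'site_id'       => $siteId,
                    'product_id'    => $row['id'],
                    'product_title' => $row['title'],
                    'price'         => (float) $row['price'],
                ];
            }
        } catch (PDOException $e) {
            // One slow or offline site shouldn't kill the whole search.
            continue;
        }
    }
    // Cheapest first seems natural for a price comparison page.
    usort($results, fn($a, $b) => $a['price'] <=> $b['price']);
    return $results;
}
```

I realise querying ~100 remote databases sequentially per search may be too slow in practice; this is just to show the kind of consolidation I mean, not a finished design.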
For the host site displaying the results, the existing sample is PHP, which is also my own specialism. If another technology were cheaper/easier I'd go with it, though, as nothing is set in stone.
I have a feeling I may have to outsource the dev of this site's inner workings, but I'm really looking for a clear sense of direction. Thanks, experts.