jls33fsls asked:
Best way to store large number of keywords
I am working on a site where users will submit thousands of URLs. For each URL, 50-75 keywords will be generated automatically and stored in the database for use with targeted advertising. I need to be able to check with a MySQL query whether at least one of an advertiser's keywords matches a keyword for a given URL. What is the best way to store this large amount of data and access it at high speed?
For each URL you could store the keywords as a comma-separated list in a single text column. Then set up a MySQL FULLTEXT index on that keywords column and use the MATCH ... AGAINST syntax to rank the URLs by how well their keywords match. Let me know if you need more detail; otherwise, Google "fulltext mysql index" and "mysql match against".
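As a rough sketch of that approach (the table and column names here are assumptions, not from the original post):

```sql
-- Hypothetical table: one row per URL, with its keywords stored
-- as a comma-separated string in a single column.
CREATE TABLE urls (
    id       INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    url      VARCHAR(2048) NOT NULL,
    keywords TEXT NOT NULL,
    FULLTEXT KEY ft_keywords (keywords)
) ENGINE=MyISAM;  -- InnoDB also supports FULLTEXT from MySQL 5.6 on

-- Rank URLs against one advertiser's keyword list.
SELECT id, url,
       MATCH(keywords) AGAINST('shoes running fitness') AS relevance
FROM   urls
WHERE  MATCH(keywords) AGAINST('shoes running fitness')
ORDER  BY relevance DESC;
```

In natural language mode, MATCH ... AGAINST returns any row where at least one of the searched words appears, which lines up with the asker's "at least one keyword matches" requirement.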
ASKER
routinet, thanks for the great explanation! I have one final question. For the FK's, do I put anything for an action on update or delete?
Not unless you believe you have a need for it. I only showed the FK to draw your attention to the relationships between the tables; you do not actually need to define the foreign-key relationships explicitly, and no enforcement is necessary. If you're using MyISAM tables, those relationships will only ever be implicit anyway. Do make sure you index the related fields, though; it will speed things up quite a bit for you.
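For reference, a normalized layout along these lines might look as follows. The accepted answer's actual schema is not shown above, so every table and column name here is an assumption:

```sql
-- Hypothetical normalized schema: keywords live in their own table,
-- linked to URLs and advertisers through join tables.
CREATE TABLE keywords (
    id      INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    keyword VARCHAR(100) NOT NULL,
    UNIQUE KEY uq_keyword (keyword)
) ENGINE=MyISAM;

CREATE TABLE url_keywords (
    url_id     INT UNSIGNED NOT NULL,
    keyword_id INT UNSIGNED NOT NULL,
    PRIMARY KEY (url_id, keyword_id),
    KEY idx_keyword (keyword_id)   -- index the related field
) ENGINE=MyISAM;

CREATE TABLE advertiser_keywords (
    advertiser_id INT UNSIGNED NOT NULL,
    keyword_id    INT UNSIGNED NOT NULL,
    PRIMARY KEY (advertiser_id, keyword_id),
    KEY idx_keyword (keyword_id)
) ENGINE=MyISAM;

-- "At least one keyword matches" becomes a simple join:
SELECT DISTINCT uk.url_id
FROM   url_keywords uk
JOIN   advertiser_keywords ak ON ak.keyword_id = uk.keyword_id
WHERE  ak.advertiser_id = 42;  -- hypothetical advertiser id
```

With MyISAM the foreign keys are implicit, so no ON UPDATE or ON DELETE actions apply; the composite primary keys and secondary indexes are what matter for lookup speed.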
ASKER
So do you recommend using MyISAM or InnoDB? Which will be faster with a higher volume?