Hi all, I'm after advice rather than a technical answer, so the best advice gets the points (a reason why your advice is good would be very helpful).
I have 1 database with customer data in it.
I have 18 customers who each want to use BI to analyse their data.
Should I create one cube and use logins and roles to split the data between customers, or should I create one cube per customer?
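For context, the single-cube option I have in mind would use SSAS dimension data security: one role per customer, with an allowed member set restricting the customer dimension. A rough sketch of what the ASSL might look like (all the IDs, names, and the member key here are placeholders, not my real schema):

```xml
<!-- DimensionPermission attached to the customer dimension,
     tying a per-customer role to one customer key.
     IDs/names below are placeholders. -->
<DimensionPermission>
  <ID>DimensionPermission Customer42</ID>
  <Name>Customer42</Name>
  <RoleID>Role Customer42</RoleID>
  <Read>Allowed</Read>
  <AttributePermissions>
    <AttributePermission>
      <AttributeID>Customer Id</AttributeID>
      <!-- Members outside this set are invisible to the role -->
      <AllowedSet>{[Customer].[Customer Id].&amp;[42]}</AllowedSet>
    </AttributePermission>
  </AttributePermissions>
</DimensionPermission>
```

So "adding a customer" would mean adding a role and one of these permissions, rather than deploying a whole new cube.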
My concerns are mainly speed, recovery and security, starting with the following:
Would it be more efficient to build lots of small cubes or 1 big one?
If I build one cube and it fails, it takes ages to reprocess. It also takes ages to restore from a backup.
If I build lots of cubes, every time a new customer is added it opens up a massive security-risk window: going through all the tick boxes and making sure they're ticked correctly. Would changing the fact table criteria when building each customer's cube be a safer option (albeit still a security risk if someone gets it wrong)?
Currently the SSIS package that builds the one big cube fails about 30% of the time, although the package has been processing the cube in one go. Tonight's run will process each partition of the cube; if that doesn't work I'll break it down to data and index level, but that's diverting from this question a bit!
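For anyone wondering what I mean by per-partition processing: instead of one ProcessFull on the whole database, the package would issue one XMLA command per partition, something like this (database/cube/measure group/partition IDs are placeholders for my actual objects):

```xml
<!-- Process a single partition's data only; IDs are placeholders.
     A follow-up command with <Type>ProcessIndexes</Type> would then
     rebuild aggregations/indexes, which is the "data & index level"
     split mentioned above. -->
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Type>ProcessData</Type>
  <Object>
    <DatabaseID>CustomerBI</DatabaseID>
    <CubeID>Sales</CubeID>
    <MeasureGroupID>Fact Sales</MeasureGroupID>
    <PartitionID>Fact Sales 2012</PartitionID>
  </Object>
</Process>
```

The idea is that a failure then only forces a retry of one partition rather than the whole cube.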
Any help/tips/advice would be much appreciated.