• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 231

Complete local cache of DB

Hey guys, I've got a big report I need to run, and the database is on the network. That means the calls always take very long, and the multiple round trips are a penalty. How do I create a local cache of the relevant tables when I run this mammoth report at the end of every week?

I'm thinking of 3 ways:

1) Access's synchronisation
2) DIY synchronisation
3) just copy the whole database to the local machine, change the table links to point at the local database, run the report, link the tables back to the network database, then delete the local database

What do y'all think? Looking forward to your guidance and sharing! = )
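
For illustration, here is a minimal VBA sketch of option 3, assuming a single .accdb back-end on a network share. The paths, the report name, and the procedure name are placeholders, not details from this thread:

Public Sub RunReportAgainstLocalCopy()
    ' Placeholder paths and report name, for illustration only.
    Const NETWORK_DB As String = "\\server\share\Backend.accdb"
    Const LOCAL_DB As String = "C:\Temp\Backend_local.accdb"

    Dim db As DAO.Database
    Dim tdf As DAO.TableDef
    Set db = CurrentDb

    ' 1. Copy the whole back-end locally. FileCopy fails if anyone
    '    still has the file open, so this assumes the db is not in use.
    FileCopy NETWORK_DB, LOCAL_DB

    ' 2. Repoint every linked table at the local copy.
    For Each tdf In db.TableDefs
        If Len(tdf.Connect) > 0 Then        ' linked tables only
            tdf.Connect = ";DATABASE=" & LOCAL_DB
            tdf.RefreshLink
        End If
    Next tdf

    ' 3. Run the mammoth report modally, so code resumes when it closes.
    DoCmd.OpenReport "rptWeekly", acViewPreview, , , acDialog

    ' 4. Point the links back at the network back-end.
    For Each tdf In db.TableDefs
        If Len(tdf.Connect) > 0 Then
            tdf.Connect = ";DATABASE=" & NETWORK_DB
            tdf.RefreshLink
        End If
    Next tdf

    ' 5. Delete the local copy.
    Kill LOCAL_DB
End Sub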
Asked by: developingprogrammer
2 Solutions
 
Rey Obrero (Capricorn1)Commented:
If the records are not going to change, or the db is not in use at the end of the week,
the best way is option 3.
 
DatabaseMX (Joe Anderson - Microsoft MVP, Access and Data Platform)Commented:
I vote for #3 also.
 
developingprogrammerAuthor Commented:
Cool, thanks guys! But if I hadn't suggested number 3, how would y'all have done it? I think your methods should definitely be better than mine, with all the experience y'all have! = )
 
DatabaseMX (Joe Anderson - Microsoft MVP, Access and Data Platform)Commented:
Yes, I would have suggested it, because that is exactly what I do (in an automated fashion) in one of our reporting KPI databases.

:-)
 
Dale FyeCommented:
The 4th way would be to use temporary tables, creating only the necessary tables locally (see my article on temp tables).
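
For illustration, a minimal VBA sketch of the temp-table approach, assuming a hypothetical linked table tblOrders and a local snapshot table tmpOrders. A single SELECT ... INTO pulls only the rows the report needs across the network in one trip:

Public Sub LoadTempTable()
    Dim db As DAO.Database
    Set db = CurrentDb

    ' Drop last week's snapshot if it is still around.
    On Error Resume Next
    db.Execute "DROP TABLE tmpOrders;"
    On Error GoTo 0

    ' One SELECT ... INTO makes a single trip across the network,
    ' instead of the report hitting the linked table repeatedly.
    db.Execute "SELECT * INTO tmpOrders FROM tblOrders " & _
               "WHERE OrderDate >= DateAdd('ww', -1, Date());", dbFailOnError
End Sub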
 
developingprogrammerAuthor Commented:
Hrmm, nice ideas, guys. I super like them!! = ))

Hrmm, maybe a silly question, but if a db has one table with 100,000 records, which is faster in terms of "localising" it:

1) copying the whole database, or
2) reading the linked table and copying it to a temp table?

One is wholesale, one is pinpoint (so to speak, if there were more tables we wanted to ignore).
 
Dale FyeCommented:
I generally just copy exactly what I need into the temp table. It's not scientific, just the way I do things.
 
developingprogrammerAuthor Commented:
Yup, I agree. It looks more elegant that way as well. What do you think, MX? And how about from a performance angle? = ) Thanks guys!!
 
DatabaseMX (Joe Anderson - Microsoft MVP, Access and Data Platform)Commented:
What I do in the case I posted is what fyed does. I load a local temp table with records from approx 12 different dbs on the server, then run reports from that. Toward the end of the year, the local table can have well over a million records. It takes less than a minute to load the temp table over our network.
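
A sketch of that pattern as described, assuming a pre-created local table tmpKPI and placeholder back-end paths (the real setup has approx 12 of them). Jet's IN clause lets a query read a table directly out of another database file:

Public Sub LoadKpiSnapshot()
    Dim db As DAO.Database
    Dim backends As Variant
    Dim i As Long

    Set db = CurrentDb
    ' Hypothetical back-end files; substitute the real server paths.
    backends = Array("\\server\share\Region1.accdb", _
                     "\\server\share\Region2.accdb")

    ' Clear the local table, then append from each back-end in turn.
    ' Assumes tmpKPI already exists with the same columns as tblKPI.
    db.Execute "DELETE FROM tmpKPI;", dbFailOnError

    For i = LBound(backends) To UBound(backends)
        db.Execute "INSERT INTO tmpKPI SELECT * FROM tblKPI " & _
                   "IN '" & backends(i) & "';", dbFailOnError
    Next i
End Sub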
 
Dale FyeCommented:
Joe,

That's because you have a blazingly fast network!

;-)
 
DatabaseMX (Joe Anderson - Microsoft MVP, Access and Data Platform)Commented:
Yep ... But it's **not** the only one in the world, contrary to popular belief :-)
 
Dale FyeCommented:
I'm just jealous.
 
developingprogrammerAuthor Commented:
Superb!!! Thanks guys!! = ))
