I inherited a table that has 271 million rows of data (and it's growing every day). My first approach to making this more sustainable is to convince the VP of IT that we need to change how we store the statistical data. Unfortunately, that could be a tough sell.
For auditing purposes our customers need to know exactly when a file played (we are a digital signage company). It isn't enough to know that file X played 10 times in a 30-minute window (or 1 hour, or 15 minutes, or whatever time frame I come up with). Admittedly, if we changed over to a count-per-time-bucket model, the number of rows would decrease drastically.
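Just to make that trade-off concrete, the count/time version would look roughly like this rollup (the table and column names here are made up, not our actual schema):

```sql
-- Hypothetical rollup: collapse per-play rows into 30-minute buckets.
-- dbo.PlayLog, PlayerID, FileID, PlayedAt stand in for the real schema.
SELECT
    PlayerID,
    FileID,
    DATEADD(MINUTE, (DATEDIFF(MINUTE, 0, PlayedAt) / 30) * 30, 0) AS BucketStart,
    COUNT(*) AS PlayCount
FROM dbo.PlayLog
GROUP BY
    PlayerID,
    FileID,
    DATEADD(MINUTE, (DATEDIFF(MINUTE, 0, PlayedAt) / 30) * 30, 0);
```

That shrinks the row count enormously, but it throws away the exact play timestamps our customers require for auditing, which is why it's a tough sell.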
Each "player" has an ID. One thought I came up with was making a table for each player. then use dynamic sql to determine which table to insert the data into and which to select from. I could create tables by accounts and then each table would have that account's players in it. . But I have to say that that just seems hackish.
I do wonder, though, how Google manages data sets this large. They must be breaking the data up into manageable pieces and then routing queries to the right piece based on the logged-in user. Maybe my idea isn't so bad.