
  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 350
  • Last Modified:

Best Method for Temporarily Storing and Reading Logs

Hello,

We are currently writing the business requirements for a logging application used with a specialist in-house CRM application. The logging application records the various screen views within the CRM application, using a PHP POST script on a data collection server that passes variables and values to a database server.

The question I have is: what is the best way to cache and queue the logs locally on the data collection server in the event that the connection to the database server goes down? As the variables and values can change from screen view to screen view within the CRM application, the data collected will not always be the same. That makes me think a local MySQL server wouldn't be a good way to go, as there is no way to predict the database schema.

In the scenario above, what would be the best way to temporarily cache the logs before they are passed to the database server?

Thanks
PlumInternet
Asked:
2 Solutions
 
gheistCommented:
Why are you reinventing the wheel? Web servers write logs, and applications can add their own entries to that log (and that log in turn can be piped to a locally running program, e.g. Apache's rotatelogs).
 
Aaron TomoskyTechnology ConsultantCommented:
Agreed, don't reinvent the wheel. If you want more than web server logs, look at Google Analytics or something similar.
 
gheistCommented:
You can run your own analytics too, on the logs after the fact, etc.

 
PlumInternetAuthor Commented:
Thanks to everyone for responding.

For reasons I can't go into here, using web server logs (that is server-side reporting with no client-side reporting, so server logs will miss critical user behaviors) and Google Analytics (data ownership concerns, plus our data volumes would cause the data to become sampled) is not a viable option.

We have moved past the question of why we need to solve the problem this way, as there are very valid reasons, and are now trying to figure out how. Any suggestions on the best way to solve this problem would be greatly appreciated.

Thanks
 
 
PlumInternetAuthor Commented:
Ok, I am guessing that nobody knows the answer to this question then.

If it will help move things along, let's say hypothetically we want to create another competitor to the solutions listed. Or let's say I am a curious guy and want to know how the solutions listed solve this problem. Forget the why; the question I am asking is how.
 
Aaron TomoskyTechnology ConsultantCommented:
Alright then, I would start with Elasticsearch as my go-to log storage. One cool thing about ES is that you send data with a simple JSON web request. So if the ES server isn't around, just save the request locally (file, db, object, whatever). This is perfectly fine for lower-volume web traffic.
If you have more demanding data storage needs, look at message queues. The most popular one is RabbitMQ.
http://www.elasticsearch.org/guide/en/elasticsearch/rivers/current/
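The fallback Aaron describes can be sketched in PHP, since that is what the collection server runs. This is a minimal sketch, not a definitive implementation; the endpoint URL, spool-file path, and function name are hypothetical. Try the ES index request, and on failure append the JSON body to a local spool file so it can be replayed once the server is reachable again.

```php
<?php
// Ship one log event to Elasticsearch; spool it locally if ES is unreachable.
// One JSON document per line in the spool file makes later replay trivial.
function shipLogEvent(array $event, string $esUrl, string $spoolFile): bool {
    $json = json_encode($event);
    $ctx = stream_context_create([
        'http' => [
            'method'  => 'POST',
            'header'  => "Content-Type: application/json\r\n",
            'content' => $json,
            'timeout' => 2,
        ],
    ]);
    // @ suppresses the warning when the connection fails
    $result = @file_get_contents($esUrl, false, $ctx);
    if ($result === false) {
        file_put_contents($spoolFile, $json . "\n", FILE_APPEND | LOCK_EX);
        return false;
    }
    return true;
}
```

Because each event is free-form JSON, this sidesteps the schema problem from the original question: a screen view can carry whatever variables it likes.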
 
Slick812Commented:
Greetings PlumInternet. I have been able to save many data sets in files: stacked, residual, in series, and in various formats, as text or as binary. In PHP I have found that an array, either a plain array or a key-value data set, makes a good container, giving you tree-like access much like a file system. You place into it any and all the data you need to store, and then:
$store2file = serialize($array_Data);
and write that string to a file.

BUT, you will need to KNOW how, why, and when you need to access this file and READ the portions of data stored there.

For me, this works time and time again.
serialize() is native to PHP, while JSON and other formats (XML) are foreign to it.
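Slick812's serialize-to-file idea can be sketched like this (the function names and file handling are my own illustration, not his code): each log entry is a key-value array carrying whatever fields that screen view produced, so there is no fixed schema to predict; base64-encoding keeps each serialized blob on one line so entries stay separable.

```php
<?php
// Append one variable-shape log entry to a local cache file.
function cacheLogEntry(array $entry, string $cacheFile): void {
    $line = base64_encode(serialize($entry)) . "\n";
    file_put_contents($cacheFile, $line, FILE_APPEND | LOCK_EX);
}

// Read every cached entry back, e.g. once the database server is up again.
function readCachedEntries(string $cacheFile): array {
    $entries = [];
    foreach (file($cacheFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
        $entries[] = unserialize(base64_decode($line));
    }
    return $entries;
}
```

A replay job would call readCachedEntries(), push each array into the database, and truncate the file on success.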
 
gheistCommented:
While you keep hiding what "critical behaviour" the web server is unable to log, and even keep us guessing which web server(s) you have, I would say nobody can tell you more than that the BEST way is to use existing logging capabilities.
 
PlumInternetAuthor Commented:
Thanks gheist, that's precisely my point. The question is not what's the best way to use existing logging capabilities, is it? To recap, the question was:

"question I have, is what is the best way to cache and queue the logs (collected post request data, not web logs) locally on the data collection server in the event that the connection to the database server goes down."

I think slick812 has been right on the money with the help we are looking for. Thanks Slick812 :)
 
gheistCommented:
It would be nice to know whether we are talking about IIS, Apache, nginx, or log4j here... Capabilities and options depend on that.
 
gheistCommented:
In general:
Get a log analysis tool that shows what you want.
Make the web server log the required input (with the reservation that it will be impossible to recover missing data from the past).
Will it be OK to do hourly or nightly log transfers? -> transfer the logs.
No -> try syslog or SQL (a slowdown, but hopefully for a good purpose).
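For the syslog option above, the PHP collector can hand entries to the local syslog daemon directly; the daemon can then be configured to forward to a remote collector and to queue locally while the network is down. A short sketch (the ident string and payload fields are made up for illustration):

```php
<?php
// Emit one screen-view event to the local syslog daemon.
// LOG_LOCAL0 is a facility conventionally left free for site-local use.
openlog('crm-screenlog', LOG_PID, LOG_LOCAL0);
$sent = syslog(LOG_INFO, json_encode(['screen' => 'invoice_view', 'user' => 42]));
closelog();
```

This keeps the "what if the database is down" problem out of the application entirely and delegates it to infrastructure that already solves it.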
