Solved

memory, disk - scaling, performance - application architecture question

Posted on 2013-11-24
233 Views
Last Modified: 2013-11-29
I am re-architecting an application that focuses on group collaboration around interactively building and annotating a complex diagram.

Average total data size for one group session will probably be somewhere between 150KB and 1MB.

Currently the group session data is stored entirely in memory in a large object containing datasets and supporting variables.  This data is synced to disk during diagram update/writes.  

The data is kept in memory for fast diagram form refreshes (one participant does a small write, which might change the relationships on the diagram, which then has to be refreshed for all other participants).
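
The current design is roughly this shape (a simplified sketch; SessionStore and its method names are illustrative, not the real code):

using System;
using System.Collections.Concurrent;
using System.Data;

public class SessionStore
{
    // One in-memory DataSet per active group session, keyed by session id.
    private readonly ConcurrentDictionary<string, DataSet> _sessions =
        new ConcurrentDictionary<string, DataSet>();

    public DataSet Get(string sessionId)
    {
        return _sessions.GetOrAdd(sessionId, id => LoadFromDisk(id));
    }

    public void ApplyUpdate(string sessionId, Action<DataSet> edit)
    {
        DataSet ds = Get(sessionId);
        lock (ds)                       // serialize writers within one session
        {
            edit(ds);                   // mutate the in-memory copy
            SaveToDisk(sessionId, ds);  // write-through on every diagram update
        }
        // Refreshes for the other participants read from memory only.
    }

    private DataSet LoadFromDisk(string sessionId) { /* query SQL Server */ return new DataSet(); }
    private void SaveToDisk(string sessionId, DataSet ds) { /* persist the changes */ }
}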

My question has to do with performance versus scaling:
>  If 1 GB can store 1000 concurrent sessions (in my dreams), it doesn't seem like that would be a scaling bottleneck i.e. the server would bog down first, or I could add servers if I ran out of memory.
>  If I had to retrieve say 50KB from disk per diagram refresh (say one per second for each group session), it seems like that would be a big disk performance/scaling constraint.
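
To make the arithmetic behind those two bullets explicit (all numbers are my own estimates):

// Memory: 1000 concurrent sessions at ~1MB each
long memoryBytes = 1000L * 1000000;       // ~1 GB of RAM -- feasible

// Disk: 1000 sessions, each doing one ~50KB read per second
long diskBytesPerSecond = 1000L * 50000;  // ~50 MB/s of scattered reads --
                                          // a real constraint for disk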

So it seems like keeping the data in memory is almost necessary, to achieve scaling and maintain performance.

I'm trying to figure out if I've made some basic error in this thinking.

Any comments on these thoughts and assumptions would be appreciated.

Thanks!
Question by:codequest
9 Comments
 
LVL 44

Assisted Solution

by:AndyAinscow
AndyAinscow earned 100 total points
I tend to agree with you - keep it in memory if possible.
 
LVL 42

Assisted Solution

by:EugeneZ
EugeneZ earned 225 total points
It depends on how much RAM you have and how much of it is set aside for SQL Server.

Please clarify SQL Server's part in this process:

"group collaboration around interactively building and annotating a complex diagram."

How did you calculate this?
"Average total data size for one group session will probably be somewhere between 150KB and 1MB."

On what server/PC is this stored?
"the group session data is stored entirely in memory in a large object containing datasets and supporting variables.  This data is synced to disk during diagram update/writes."

What method are you using to ensure that it is in memory?

-------------------------

In any case, you should use PerfMon, SQL Profiler, and the SQL Server DMVs; they will help you see the real numbers.
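
For example, one quick DMV check (illustrative; the connection string is a placeholder, and querying sys.dm_os_performance_counters requires VIEW SERVER STATE permission):

using System;
using System.Data.SqlClient;

class SqlMemoryCheck
{
    static void Main()
    {
        const string query =
            @"SELECT counter_name, cntr_value
              FROM sys.dm_os_performance_counters
              WHERE counter_name IN ('Total Server Memory (KB)',
                                     'Target Server Memory (KB)');";

        using (var conn = new SqlConnection(@"Server=.\SQLEXPRESS;Integrated Security=true"))
        using (var cmd = new SqlCommand(query, conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
                while (reader.Read())
                    Console.WriteLine("{0}: {1} KB",
                        ((string)reader[0]).Trim(), reader[1]);
        }
    }
}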
 
LVL 2

Author Comment

by:codequest
@EugeneZ

Thanks for input.  

1) I believe I considerably overestimated the amount of data that would be actively worked on and presented.  A better estimate would be 10KB.  This was calculated by considering table rows, fields, field usage and field sizes (the higher estimates did not account for null fields).

2) Data is currently maintained in memory in an ADO.NET DataSet that has 5 linked tables.
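
Roughly this shape (a simplified sketch showing 2 of the 5 tables; the table and column names are placeholders, not the real schema):

using System.Data;

var ds = new DataSet("GroupSession");

DataTable diagrams = ds.Tables.Add("Diagram");
diagrams.Columns.Add("DiagramId", typeof(int));
diagrams.Columns.Add("Title", typeof(string));

DataTable nodes = ds.Tables.Add("Node");
nodes.Columns.Add("NodeId", typeof(int));
nodes.Columns.Add("DiagramId", typeof(int));
nodes.Columns.Add("Label", typeof(string));

// DataRelations keep the tables linked so the multi-table
// business-logic queries can run entirely in memory.
ds.Relations.Add("Diagram_Node",
    diagrams.Columns["DiagramId"], nodes.Columns["DiagramId"]);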

Unfortunately I have only a prototype and am unable to test high volumes in order to use the utilities you recommend.
 
LVL 16

Assisted Solution

by:Gerald Connolly
Gerald Connolly earned 175 total points
If you are holding updates in memory, what are you doing to guard against equipment and/or power failure and the subsequent corruption of the DB on disk?
 
LVL 42

Accepted Solution

by:EugeneZ
EugeneZ earned 225 total points
There are many more elements that you need to consider. Check:

"out of memory exception ado.net dataset"
http://social.msdn.microsoft.com/Forums/en-US/41e1b19a-b5ee-4cf2-ac1e-ff0c9a35b961/out-of-memory-exception-adonet-dataset?forum=adodotnetdataset

Also review the possibility of using stored procedures to do the calculations and SQL Server to hold the data.
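
For example (illustrative only -- the procedure name and parameter are made up):

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.usp_GetDiagramView", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.AddWithValue("@SessionId", sessionId);

    // SQL Server does the multi-table work; the app only renders the result.
    var view = new DataSet();
    new SqlDataAdapter(cmd).Fill(view);
}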

Again, there are not many details here about your app's architecture/tiers.
 
LVL 2

Author Comment

by:codequest
From a similar question I posted on another site:

I'm re-architecting an asp.net application from web forms into MVC, moving from 2006 to 2013 asp.net technologies. The primary function of the app is group collaborative construction of a complex graphic/text data set. The plan is to run multiple concurrent SaaS group work sessions from a cloud.

The core graphic/text set ("data set") would consist of about 5 related tables, that need fairly complex business logic and associated multi-table queries to turn them into useful display information. The content of this data set needs to be sent to all participants, in a slightly customized way for each participant, every time it is updated by any participant.

In terms of volume, say 10 participants per group, one data set change every several seconds (in one session), the entire data set building up to approximately 10KB by the end of the session, so say an average of 5KB to retrieve the entire core data set from disk (if that were the path) for each send to the browser. That may be high but there could be a wide range of volumes.

The resulting pattern for a single session is then relatively infrequent, small updates to disk, followed by 10 times as many relatively large sends to the browser.
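
In code shape, one update cycle would be roughly (illustrative names throughout; store is the in-memory session cache sketched in the question, and Participant, GetParticipants, RenderFor and Send are placeholders):

void OnParticipantUpdate(string sessionId, Action<DataSet> edit)
{
    store.ApplyUpdate(sessionId, edit);   // small, infrequent write, synced to disk

    DataSet ds = store.Get(sessionId);    // current in-memory data set
    foreach (Participant p in GetParticipants(sessionId))
        Send(p, RenderFor(p, ds));        // ~5KB send, slightly customized per participant
}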
 
LVL 2

Author Comment

by:codequest
@connollyg

Re memory failure: the app currently uses DataSet/TableAdapter; the writes are all written to disk at the time they occur.
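
In shape, the write path is (simplified; a generic SqlDataAdapter stands in for the designer-generated TableAdapter, and the table name is a placeholder):

var adapter = new SqlDataAdapter("SELECT * FROM Node", conn);
var builder = new SqlCommandBuilder(adapter);  // generates INSERT/UPDATE/DELETE

// ... a participant's edit mutates the in-memory DataSet ...
nodes.Rows[0]["Label"] = "Revised label";

adapter.Update(ds, "Node");   // pushes only the changed rows to disk immediately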
 
LVL 16

Assisted Solution

by:Gerald Connolly
Gerald Connolly earned 175 total points
Re memory - I covered that under equipment failure! Although you could go with a server that has RAID memory!

Re updates - You implied in your first post that some kind of write-gathering was taking place. If your in-memory DB is really read-only, it's fine.
 
LVL 2

Author Closing Comment

by:codequest
I've concluded that performance and scaling questions are completely non-trivial, and so my question can't really be conclusively answered.  The inputs here have been valuable in reaching that conclusion, so points are awarded accordingly.
