Entity Framework versus Datasets

(Related to another question)  
I am re-architecting an application that focuses on group collaboration around interactively building and annotating a complex diagram.

Average total data size for one group session will probably be somewhere between 150 KB and 1 MB.

Currently the group session data is stored entirely in memory, in a large object containing DataSets and supporting variables.  This data is synced to disk during diagram updates/writes.  There are about a dozen tables in the data object, and I make extensive use of TableAdapters.

The data is kept in memory for fast diagram form refreshes (one participant could make a small write, which might change the relationships on the diagram, which then has to be refreshed for all other participants).

Let's assume that keeping the data in memory is the correct approach for performance/scaling concerns (that was the other question).

The question here is, is there a benefit to using Entity Framework in this situation?  

My understanding is that datasets work well for in-memory access of small data volumes, which is what I have, whereas Entity Framework always reads from/writes to disk.  So maybe I could use EF for the disk syncing, but I'd still need all the dataset logic anyway for the in-memory data handling.  Why would I want to add EF in that case (except for easier DB maintenance, which may be plenty of justification)?

Or is there some way of managing "dataset-like" information in memory that is complementary to EF?  My diagram display form used to have GridViews; however, those have all been replaced with custom div collections, so there's no real dependency on datasets on the GUI side.  The business logic does make extensive use of DataTables; however, I'm assuming they could be loaded from something other than TableAdapters (though that's very convenient, until I bump into the DB table definition update issue).
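For reference, here is a rough sketch of how the in-memory tables could be built by hand, with no TableAdapters involved.  The table and column names are hypothetical stand-ins for the real schema:

```csharp
using System;
using System.Data;

class InMemorySessionData
{
    // Build a small "dataset-like" structure entirely in memory.
    public static DataSet Build()
    {
        var ds = new DataSet("GroupSession");

        var nodes = ds.Tables.Add("Nodes");
        nodes.Columns.Add("NodeId", typeof(int));
        nodes.Columns.Add("Label", typeof(string));
        nodes.PrimaryKey = new[] { nodes.Columns["NodeId"] };

        var annotations = ds.Tables.Add("Annotations");
        annotations.Columns.Add("AnnotationId", typeof(int));
        annotations.Columns.Add("NodeId", typeof(int));
        annotations.Columns.Add("Text", typeof(string));

        // Relationship used for diagram refresh lookups.
        ds.Relations.Add("Node_Annotations",
            nodes.Columns["NodeId"], annotations.Columns["NodeId"]);

        return ds;
    }

    static void Main()
    {
        var ds = Build();
        var node = ds.Tables["Nodes"].Rows.Add(1, "Start");
        ds.Tables["Annotations"].Rows.Add(100, 1, "First comment");
        ds.Tables["Annotations"].Rows.Add(101, 1, "Second comment");

        // Navigate the relation entirely in memory.
        DataRow[] related = node.GetChildRows("Node_Annotations");
        Console.WriteLine(related.Length); // prints 2
    }
}
```

The rows could just as easily be filled from any disk-sync layer (EF included), since nothing here depends on a TableAdapter.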

So, to summarize: would it be worth introducing Entity Framework in this situation for DB management, and if so, would I just keep the dataset object for in-memory use, or is there some alternative in-memory data management approach?

Any input on this would be appreciated.

Bob Learned commented:
There are plenty of little (but important) things that you can research and experiment with.

Here is a good example:

Instantly Increase ASP.NET Scalability - Thru ThreadPool

Just remember that it is important to measure any changes to test their effectiveness.  Start with a baseline, then make a change and test again.
Bob Learned commented:
What did you find out about performance and scaling in the previous question?

Scaling Strategies for ASP.NET Applications

The trick to optimizing server code is to use testing to be sure you're actually making a difference. You should use profiling tools to analyze your application and find out where the application is spending the most time.


In the long run, however, affinity creates grief. Keeping session data in-process may be fast, but if the ASP.NET worker process recycles, all those sessions are dead. And worker processes recycle for a lot of reasons. Under high load …

It is important to balance everything when working towards a scalable application.  You don't want to keep database connections open, since you will hit the connection maximum faster as you add users.  You don't want to use too much server memory, since that forces the worker process to recycle more often as you add users.
Bob Learned commented:
Entity Framework works on the principle of lazy loading: related data is not pulled into memory until it is needed.

Demystifying Entity Framework Strategies: Loading Related Data
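To illustrate, here is a minimal EF6 code-first sketch; the entity and context names are hypothetical, and the code assumes a configured database connection:

```csharp
using System.Collections.Generic;
using System.Data.Entity; // EF6

public class Node
{
    public int NodeId { get; set; }
    public string Label { get; set; }

    // "virtual" lets EF create a proxy that loads the annotations
    // from the database the first time the collection is touched.
    public virtual ICollection<Annotation> Annotations { get; set; }
}

public class Annotation
{
    public int AnnotationId { get; set; }
    public int NodeId { get; set; }
    public string Text { get; set; }
}

public class SessionContext : DbContext
{
    public DbSet<Node> Nodes { get; set; }
    public DbSet<Annotation> Annotations { get; set; }
}

public static class LazyLoadDemo
{
    public static void Run()
    {
        using (var db = new SessionContext())
        {
            var node = db.Nodes.Find(1);         // one query, for the node
            var count = node.Annotations.Count;  // second query, issued lazily here

            // For a full-diagram refresh, eager loading is usually cheaper:
            // db.Nodes.Include(n => n.Annotations).ToList();  // single query
        }
    }
}
```

The trade-off to measure is query count: lazy loading is convenient for touching a few rows, but refreshing a whole diagram lazily can issue one query per node.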

codequest (Author) commented:

Good links, thanks for the input.  No real progress yet on the scaling and performance question.  Your responses above are very helpful, and are tilting me back toward Entity Framework.  I'll need to research Lazy Loading more.  

It may come down to what Lazy Loading would do in this scenario:
>  the dataset required to refresh the user screens consists of 4 or 5 linked tables, from which data is pulled in a complex set of operations.  
> The entire dataset is built by user interaction in a primarily additive way, i.e. there are only a few edits of existing rows.  
> If the entire data set is repeatedly re-accessed by the business/controller layer in order to compute the form/views, would EF + Lazy Loading maintain most of the data rows in the dataset in memory, and only fetch the recent adds and edits from disk?
> If that's the case, then the need for in-memory datasets is removed, and THAT part of the decision about whether to go with EF is removed.
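A sketch of the behavior I'm asking about, as I understand it (hypothetical Node entity and SessionContext DbContext, assuming EF6 and a configured database):

```csharp
using System.Data.Entity; // EF6; all type names here are hypothetical

public class Node
{
    public int NodeId { get; set; }
    public string Label { get; set; }
}

public class SessionContext : DbContext
{
    public DbSet<Node> Nodes { get; set; }
}

public static class RefreshDemo
{
    public static void Run()
    {
        using (var db = new SessionContext())
        {
            db.Nodes.Load();              // one query: materialize all nodes
            var cached = db.Nodes.Local;  // live in-memory view of tracked rows

            // Already-tracked rows are served from memory, not re-queried:
            var start = db.Nodes.Find(1);

            // Pending adds show up in Local before SaveChanges():
            db.Nodes.Add(new Node { NodeId = 2, Label = "New" });
            int visible = cached.Count;   // includes the pending row
        }
        // Caveat: a context is normally short-lived (per request); keeping
        // one alive as a session-wide cache is possible, but that is a
        // separate design decision from lazy loading itself.
    }
}
```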

I asked the DataSet vs EF question in a more evolved form here:  


which might provide more insight into what I'm doing, however I believe the scenario above about Lazy Loading is the crux of the question.
Bob Learned commented:
What is the maximum number of users that you expect at one time (then add about 20% to that)?  

Do you expect any growth over time?
codequest (Author) commented:
Thanks for your question.  The goal is a commercial subscription SaaS service, which could be -very- roughly compared to a cross between Webex and Survey Monkey.  For this exercise, let's say I'm re-architecting for at minimum 1000 concurrent users, at which point  (in my dreams) hopefully people more capable than I would be dealing with performance and scaling issues.
Question has a verified solution.