Solved

Entity Framework versus Datasets

Posted on 2013-11-24
2,088 Views
Last Modified: 2013-11-29
(Related to another question)  
I am re-architecting an application that focuses on group collaboration around interactively building and annotating a complex diagram.

Average total data size for one group session will probably be somewhere between 150 KB and 1 MB.

Currently the group session data is stored entirely in memory in a large object containing datasets and supporting variables.  This data is synced to disk during diagram update/writes.    There are about a dozen tables in the data object.  I make extensive use of tableadapters.

The data is kept in memory for fast diagram form refreshes (one participant could do a small write, which might change the relationships on the diagram, which then has to be refreshed to all other participants).
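For context, the current in-memory structure looks roughly like this (a simplified sketch with invented table and column names; the real object holds about a dozen tables):

```csharp
using System.Data;

public class GroupSession
{
    // One in-memory DataSet per session, synced to disk on diagram writes.
    public DataSet Data { get; } = new DataSet("SessionData");

    public GroupSession()
    {
        // Two of the ~dozen tables, with a parent/child relation between them.
        var nodes = Data.Tables.Add("DiagramNodes");
        nodes.Columns.Add("NodeId", typeof(int));
        nodes.Columns.Add("Label", typeof(string));
        nodes.PrimaryKey = new[] { nodes.Columns["NodeId"] };

        var annotations = Data.Tables.Add("Annotations");
        annotations.Columns.Add("AnnotationId", typeof(int));
        annotations.Columns.Add("NodeId", typeof(int));
        annotations.Columns.Add("Text", typeof(string));

        Data.Relations.Add("NodeAnnotations",
            nodes.Columns["NodeId"], annotations.Columns["NodeId"]);
    }
}
```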

Let's assume that keeping the data in memory is the correct approach for performance/scaling concerns (that was the other question).

The question here is, is there a benefit to using Entity Framework in this situation?  

My understanding is that DataSets work well for in-memory access of small data volumes, which is what I have, whereas Entity Framework always reads from and writes to disk.  So maybe I could use EF for the disk syncing, but I'd still need all the DataSet logic anyway for the in-memory data handling.  Why would I want to add EF in that case (except for easier DB maintenance, which may be plenty of justification)?

Or is there some way of managing "dataset-like" information in memory that is complementary to EF?    My diagram display form used to have GridViews; however, that's all been replaced with custom div collections, so there's no real dependency on DataSets on the GUI side.  The business logic does make extensive use of DataTables, but I'm assuming they could be loaded from something other than TableAdapters (though TableAdapters are very convenient, until I bump into the DB table definition update issue).
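For what it's worth, a DataTable can be filled without a TableAdapter from any plain ADO.NET reader (a sketch; table and column names are invented):

```csharp
using System.Data;
using System.Data.SqlClient;

static DataTable LoadDiagramNodes(string connectionString)
{
    // No TableAdapter involved: fill the table straight from a data reader.
    var table = new DataTable("DiagramNodes");
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("SELECT NodeId, Label FROM DiagramNodes", conn))
    {
        conn.Open();
        table.Load(cmd.ExecuteReader());  // columns inferred from the reader's schema
    }
    return table;
}
```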

So, to summarize: would it be worth introducing Entity Framework in this situation for DB management?  And if so, would I keep the DataSet object for in-memory data, or is there some alternative in-memory data management approach?

Any input on this would be appreciated.

Thanks!
Question by:codequest
6 Comments
 
LVL 96

Assisted Solution

by:Bob Learned
Bob Learned earned 500 total points
ID: 39674533
What did you find out about performance and scaling in the previous question?

Scaling Strategies for ASP.NET Applications
http://msdn.microsoft.com/en-us/magazine/cc500561.aspx

The trick to optimizing server code is to use testing to be sure you're actually making a difference. You should use profiling tools to analyze your application and find out where the application is spending the most time.

...

In the long run, however, affinity creates grief. Keeping session data in-process may be fast, but if the ASP.NET worker process recycles, all those sessions are dead. And worker processes recycle for a lot of reasons. Under high load ...

It is important to balance everything when working toward a scalable application.  You don't want to keep database connections open, since you will hit the connection maximum faster as you add more users.  You don't want to use too much server memory, since you will force the worker process to recycle more often as you add more users.
 
LVL 96

Assisted Solution

by:Bob Learned
Bob Learned earned 500 total points
ID: 39674545
Entity Framework works on the principle of "laziness": related data is not pulled into memory until it is needed.

Demystifying Entity Framework Strategies: Loading Related Data
http://msdn.microsoft.com/en-us/magazine/hh205756.aspx
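In EF 5/6 the mechanics look roughly like this (a sketch using invented entity names): marking a navigation property "virtual" lets EF generate a proxy that issues the second query only when the property is first touched.

```csharp
using System.Collections.Generic;
using System.Data.Entity;  // EF 5/6

public class DiagramNode
{
    public int NodeId { get; set; }
    public string Label { get; set; }
    // "virtual" enables the lazy-loading proxy for this collection.
    public virtual ICollection<Annotation> Annotations { get; set; }
}

public class Annotation
{
    public int AnnotationId { get; set; }
    public int NodeId { get; set; }
    public string Text { get; set; }
}

public class DiagramContext : DbContext
{
    public DbSet<DiagramNode> Nodes { get; set; }
    public DbSet<Annotation> Annotations { get; set; }
}

// using (var db = new DiagramContext())
// {
//     var node = db.Nodes.Find(1);     // query 1: the node itself
//     int n = node.Annotations.Count;  // query 2: issued lazily, right here
// }
```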
 
LVL 2

Author Comment

by:codequest
ID: 39675511
@LearnedOne

Good links, thanks for the input.  No real progress yet on the scaling and performance question.  Your responses above are very helpful, and are tilting me back toward Entity Framework.  I'll need to research Lazy Loading more.  

It may come down to what Lazy Loading would do in this scenario:
>  the dataset required to refresh the user screens consists of 4 or 5 linked tables, from which data is pulled in a complex set of operations.  
> The entire dataset is built by user interaction in primarily an additive way, i.e. there are only a few edits of existing rows.  
> If the entire data set is repeatedly re-accessed by the business/controller layer in order to compute the form/views, would EF + Lazy Loading maintain most of the data rows in the dataset in memory, and only fetch the recent adds and edits from disk?
> If that's the case, then the need for in-memory datasets is removed, and THAT part of the decision about whether to go with EF is removed.

I asked the DataSet vs EF question in a more evolved form here:  

stackoverflow.com/questions/20184319/options-for-syncronizing-in-memory-tables-with-entity-framework

which might provide more insight into what I'm doing, however I believe the scenario above about Lazy Loading is the crux of the question.
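If I understand the EF 6 model correctly, the change tracker may answer part of this: a context keeps every entity it has materialized, and DbSet&lt;T&gt;.Local exposes that as an in-memory collection.  A sketch (DiagramContext/DiagramNode are my hypothetical model types; the catch is that this requires keeping one context alive per session, which cuts against the usual short-lived-context guidance):

```csharp
using System.Data.Entity;  // EF 6

using (var db = new DiagramContext())
{
    db.Nodes.Load();              // one round trip: materialize every node into the context
    var cached = db.Nodes.Local;  // ObservableCollection<DiagramNode>, backed by the change tracker

    // Repeated reads by the business layer can go against "cached" without
    // touching the database; SaveChanges() writes only the adds/edits back.
    cached.Add(new DiagramNode { Label = "New idea" });
    db.SaveChanges();
}
```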

 
LVL 96

Expert Comment

by:Bob Learned
ID: 39676370
What is the maximum number of users that you expect at one time (adding about 20% to that)?  

Do you expect any growth over time?
 
LVL 2

Author Comment

by:codequest
ID: 39676432
Thanks for your question.  The goal is a commercial subscription SaaS service, which could be -very- roughly compared to a cross between WebEx and SurveyMonkey.  For this exercise, let's say I'm re-architecting for a minimum of 1,000 concurrent users, at which point (in my dreams) people more capable than I will hopefully be dealing with performance and scaling issues.
 
LVL 96

Accepted Solution

by:Bob Learned
Bob Learned earned 500 total points
ID: 39676619
There are plenty of little (but important) things that you can research and experiment with.

Here is a good example:

Instantly Increase ASP.NET Scalability - Thru ThreadPool
http://arun-ts.blogspot.com/2013/10/instantly-increase-aspnet-scalability.html

Just remember that it is important to measure any change to test its effectiveness.  Start with a baseline, then make a change and test again.
