Solved

Entity Framework versus Datasets

Posted on 2013-11-24
1,973 Views
Last Modified: 2013-11-29
(Related to another question)  
I am re-architecting an application that focuses on group collaboration around interactively building and annotating a complex diagram.

Average total data size for one group session will probably be somewhere between 150KB and 1MB.

Currently the group session data is stored entirely in memory in a large object containing datasets and supporting variables.  This data is synced to disk during diagram updates/writes.  There are about a dozen tables in the data object.  I make extensive use of TableAdapters.
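
Roughly, the current write path looks like this (all names here are simplified stand-ins, not the real typed DataSet/TableAdapter names):

using System.Data;

// in-memory session object holding the ~12 tables (hypothetical cache lookup)
DataSet sessionData = SessionCache.Get(sessionId);

// a participant's small write lands in the in-memory table first
DataRow row = sessionData.Tables["Annotations"].NewRow();
row["NodeId"] = nodeId;
row["Text"] = annotationText;
sessionData.Tables["Annotations"].Rows.Add(row);

// then the change is synced to disk through the generated TableAdapter
annotationsTableAdapter.Update(new DataRow[] { row });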

The data is kept in memory for fast diagram form refreshes (one participant could do a small write, which might change the relationships on the diagram, which then has to be refreshed to all other participants).

Let's assume that keeping the data in memory is the correct approach for performance/scaling concerns (that was the other question).

The question here is, is there a benefit to using Entity Framework in this situation?  

My understanding is that datasets work well for in-memory access to small data volumes, which is what I have, whereas Entity Framework always reads/writes from/to disk.  So maybe I could use EF for the disk syncing, but I'd still need all the dataset logic anyway for the in-memory data handling.  In that case, why would I want to add EF at all (except for easier DB maintenance, which may be plenty of justification)?

Or is there some way of managing "dataset-like" information in memory that is complementary to EF?  My diagram display form used to have GridViews, but that's all been replaced with custom div collections, so there's no real dependency on datasets on the GUI side.  The business logic does make extensive use of DataTables; however, I'm assuming those could be loaded from something other than TableAdapters (though TableAdapters are very convenient, until I bump into the DB table definition update issue).
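
For instance, I imagine I could fill those DataTables from EF query results instead of from TableAdapters, along these lines (DiagramContext and Nodes are made-up names):

using System.Data;
using System.Linq;

using (var db = new DiagramContext())   // hypothetical DbContext
{
    var nodes = db.Nodes.Where(n => n.SessionId == sessionId).ToList();

    var table = new DataTable("Nodes");
    table.Columns.Add("NodeId", typeof(int));
    table.Columns.Add("Label", typeof(string));

    foreach (var n in nodes)
        table.Rows.Add(n.NodeId, n.Label);

    // hand "table" to the existing business logic that expects DataTables
}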

So, to summarize, the question is: would it be worth introducing Entity Framework in this situation for DB management, and if so, would I keep the dataset object for the in-memory side, or is there some alternative in-memory data management approach?

Any input on this would be appreciated.

Thanks!
Question by:codequest
6 Comments
 
LVL 96

Assisted Solution

by:Bob Learned
Bob Learned earned 500 total points
ID: 39674533
What did you find out about performance and scaling in the previous question?

Scaling Strategies for ASP.NET Applications
http://msdn.microsoft.com/en-us/magazine/cc500561.aspx

The trick to optimizing server code is to use testing to be sure you're actually making a difference. You should use profiling tools to analyze your application and find out where the application is spending the most time.

...

In the long run, however, affinity creates grief. Keeping session data in-process may be fast, but if the ASP.NET worker process recycles, all those sessions are dead. And worker processes recycle for a lot of reasons. Under high load ...

It is important to balance everything when working towards a scalable application.  You don't want to hold database connections open, since you will hit the maximum number of connections faster as you add users.  You don't want to use too much server memory, since you will force the worker process to recycle more often as you add users.
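
As a generic example (the table and variable names are just placeholders), keep the connection scoped to a single operation so it goes back to the pool immediately:

using System.Data.SqlClient;

// Open late, close early: the connection returns to the ADO.NET pool
// as soon as the using blocks end, instead of being held per user/session.
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand(
        "SELECT COUNT(*) FROM DiagramNode WHERE SessionId = @id", connection))
{
    command.Parameters.AddWithValue("@id", sessionId);
    connection.Open();
    int nodeCount = (int)command.ExecuteScalar();
}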
 
LVL 96

Assisted Solution

by:Bob Learned
Bob Learned earned 500 total points
ID: 39674545
Entity Framework works with the principle of "laziness", where related data is not pulled into memory until it is needed.

Demystifying Entity Framework Strategies: Loading Related Data
http://msdn.microsoft.com/en-us/magazine/hh205756.aspx
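
In code, lazy loading mostly comes down to marking navigation properties virtual so EF can defer the query until the property is first touched (the entity and context names below are only an illustration):

using System.Collections.Generic;
using System.Linq;

public class DiagramNode
{
    public int Id { get; set; }
    public string Label { get; set; }

    // "virtual" lets EF generate a proxy that loads the related rows
    // only when this property is first accessed.
    public virtual ICollection<Annotation> Annotations { get; set; }
}

public class Annotation
{
    public int Id { get; set; }
    public int DiagramNodeId { get; set; }
    public string Text { get; set; }
}

// usage (DiagramContext is a hypothetical DbContext):
using (var db = new DiagramContext())
{
    var node = db.Nodes.First();            // query #1: just the node
    var notes = node.Annotations.ToList();  // query #2 fires here, on first access
}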
 
LVL 2

Author Comment

by:codequest
ID: 39675511
@LearnedOne

Good links, thanks for the input.  No real progress yet on the scaling and performance question.  Your responses above are very helpful, and are tilting me back toward Entity Framework.  I'll need to research Lazy Loading more.  

It may come down to what Lazy Loading would do in this scenario:
>  the dataset required to refresh the user screens consists of 4 or 5 linked tables, from which data is pulled in a complex set of operations.  
> The entire dataset is built by user interaction in primarily an additive way, i.e. there are only a few edits of existing rows.  
> If the entire data set is repeatedly re-accessed by the business/controller layer in order to compute the form/views, would EF + Lazy Loading keep most of the data rows in memory, and only fetch the recent adds and edits from disk? (There's a sketch of this access pattern after this list.)
> If that's the case, then the need for in-memory datasets is removed, and THAT part of the decision about whether to go with EF is removed.
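
To make that concrete, here is the access pattern I mean (all names made up); the open question is what EF actually does on the second pass:

using System.Linq;

using (var db = new DiagramContext())   // hypothetical context kept for the group session
{
    // initial build: pull the 4-5 linked tables for this session
    var nodes = db.Nodes.Where(n => n.SessionId == sessionId).ToList();

    // later, after one participant adds or edits a row, every other
    // participant's view has to be recomputed from (almost) the same data:
    var refreshed = db.Nodes.Where(n => n.SessionId == sessionId).ToList();

    // Question: does the second query re-read everything from disk, or does
    // EF serve the rows it is already tracking and fetch only the new/changed ones?
}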

I asked the DataSet vs EF question in a more evolved form here:  

stackoverflow.com/questions/20184319/options-for-syncronizing-in-memory-tables-with-entity-framework

which might provide more insight into what I'm doing; however, I believe the Lazy Loading scenario above is the crux of the question.

 
LVL 96

Expert Comment

by:Bob Learned
ID: 39676370
What is the maximum number of users that you expect at one time (add about 20% to that)?

Do you expect any growth over time?
 
LVL 2

Author Comment

by:codequest
ID: 39676432
Thanks for your question.  The goal is a commercial subscription SaaS service, which could be -very- roughly compared to a cross between Webex and Survey Monkey.  For this exercise, let's say I'm re-architecting for at minimum 1000 concurrent users, at which point  (in my dreams) hopefully people more capable than I would be dealing with performance and scaling issues.
 
LVL 96

Accepted Solution

by:Bob Learned
Bob Learned earned 500 total points
ID: 39676619
There are plenty of little (but important) things that you can research and experiment with.

Here is a good example:

Instantly Increase ASP.NET Scalability - Thru ThreadPool
http://arun-ts.blogspot.com/2013/10/instantly-increase-aspnet-scalability.html

Just remember that it is important to measure any change to test its effectiveness.  Start with a baseline, then make a change, and test again.
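
As one concrete illustration of the kind of setting that article covers (the numbers below are placeholders; measure before and after changing them), you can raise the thread pool minimums at application startup:

using System;
using System.Threading;

// Global.asax.cs -- raise the worker/IOCP thread floor so a burst of
// requests doesn't wait on the pool's slow thread-injection ramp-up.
protected void Application_Start(object sender, EventArgs e)
{
    ThreadPool.SetMinThreads(workerThreads: 50, completionPortThreads: 50);
}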
