Solved

.Net Remoting - Use, Design & Architecture questions

Posted on 2004-10-04
301 Views
Last Modified: 2010-04-17
I am caught in the middle of a development effort that I think is going in the wrong direction.  However I have been wrong in the past, so I would like to get some additional input before I make a final determination.

The Scenario:

The development involves a client/server application that processes bid quotes and requests as well as contact management.  The data is served up on an older Unix box and is stored in indexed flat files.  Roughly 20-year-old technology.  It's been a while since I've used flat files, but the last time I did, my understanding was that they are always accessed sequentially and are opened and closed individually when accessing more than one file.  Has this changed?  Is there some new buffering or caching that I'm unaware of?
From what I've read, remoting is best used between .NET machines in the same or different application domains, and it should not be used across platforms.  Am I right?

There are several servers that handle load distribution between the calling clients and the single backend box.  (This just seems wrong to me)

The application's performance is extremely slow, in some cases upwards of 15-20 seconds to display a single screen that has no more than 1 flex grid with around 20 rows and 20 cells.

The infrastructure has been updated to facilitate higher bandwidth but the app isn't running any faster.

This is the second version of this type of application, the first being a web services app.  It was determined that it needed to move to .NET Remoting in order to gain speed and functionality that the web app didn't have.  It is interesting to note that the first app, although missing a great amount of functionality, is now running faster than before and is outperforming the .NET Remoting app.

Question(s):

Is this the right development direction?
Should this be re-directed to a migration of the backend to newer technology?
How can this project be salvaged, if at all, and gain the performance necessary to make this the real time app that is required?

I know that this is more of a design and architecture question than a specific issue with a definitive resolution.  However I am looking for opinions from people that have experience in .Net Remoting that may have run across this type of project before.
Question by:alexhogan
1 Comment
 
LVL 3

Accepted Solution

by:
KeithWatson earned 500 total points
ID: 12224588
In terms of performance, if your current system isn't performing as required then you'll have to make a change I guess. I think the key issues in terms of whether or not you would perform a migration of the back end would be:

- Are you sure the indexed file back end is the bottleneck? They're not necessarily slow; many databases use ISAM files as their implementation mechanism.
- If so, how many client applications currently use the indexed files? If it's only this one application, then a migration should be relatively simple. If it's multiple applications, you'll need to modify all of them (assuming you don't have a convenient API wrapping the data access for all the files).

Load balancing of multiple clients against a single server certainly seems wrong; you would surely need more than one server to load balance too. In your current scenario, load is not being balanced at all. I don't follow what your boxes are doing here at all.

As for remoting, you mention multiple platforms. When you say Unix, I assume you don't mean Linux. I don't know how you would make remote calls to .NET objects on a Unix platform; .NET is not available for Unix platforms, apart from Linux via the Mono project (all respect to the Mono guys, and I'm sure it will come on in leaps and bounds, but I think you'd be brave to include it in a mission-critical system at this point in time).
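For what it's worth, when both endpoints really are .NET, the binary TCP channel is the fastest remoting transport. A minimal server-side App.config fragment might look like this (the type, assembly, and port names here are hypothetical, just to show the shape):

```xml
<!-- Sketch of a remoting host configuration: exposes one server-activated
     ("well-known") singleton object over the binary TCP channel. -->
<configuration>
  <system.runtime.remoting>
    <application>
      <service>
        <!-- BidServer.QuoteService is a placeholder type name -->
        <wellknown mode="Singleton"
                   type="BidServer.QuoteService, BidServer"
                   objectUri="QuoteService.rem" />
      </service>
      <channels>
        <channel ref="tcp" port="8085" />
      </channels>
    </application>
  </system.runtime.remoting>
</configuration>
```

The host process would load this with RemotingConfiguration.Configure, and clients would connect to tcp://host:8085/QuoteService.rem. But none of this helps if the bottleneck is the flat-file back end rather than the wire.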

Another approach would be buying your way out of the problem; if I/O on the Unix box is the bottleneck, then there are plenty of Unix boxes available with massive I/O capability.

A common contemporary pattern for a cross-platform solution would be to host web services in an application server as a middle tier containing the business logic, with a separate logical tier for data access. That logical data tier lets you change the database implementation without changing all of your clients' code.
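As a rough sketch of that data tier (assuming C# on the .NET side; all names here are hypothetical): if the business logic codes against a repository-style interface, it never sees whether the data comes from the indexed flat files or a future database.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical domain type -- stands in for whatever a bid quote holds.
public class Quote
{
    public string QuoteId;
    public string ContactId;
    public decimal Amount;
}

// The data-tier contract. Business logic and web services code against
// this interface only, so the storage behind it can change freely.
public interface IQuoteRepository
{
    Quote GetQuote(string quoteId);
    IList<Quote> GetQuotesForContact(string contactId);
    void SaveQuote(Quote quote);
}

// Today's implementation would wrap the existing indexed flat files;
// a later SqlQuoteRepository could replace it without touching clients.
public class IsamQuoteRepository : IQuoteRepository
{
    public Quote GetQuote(string quoteId)
    {
        // Read the record from the indexed file here.
        throw new NotImplementedException();
    }

    public IList<Quote> GetQuotesForContact(string contactId)
    {
        throw new NotImplementedException();
    }

    public void SaveQuote(Quote quote)
    {
        throw new NotImplementedException();
    }
}
```

The point of the extra layer is that a back-end migration then becomes a new implementation of one interface rather than a rewrite of every client.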

Hope that helps,

Keith.
