Solved

.Net Remoting - Use, Design & Architecture questions

Posted on 2004-10-04
Medium Priority
307 Views
Last Modified: 2010-04-17
I am caught in the middle of a development effort that I think is going in the wrong direction.  However I have been wrong in the past, so I would like to get some additional input before I make a final determination.

The Scenario:

The development involves a client/server application that processes bid quotes and requests as well as contact management.  The data is served up on an older Unix box and is stored in indexed flat files.  Roughly 20-year-old technology.  It's been a while since I've used flat files, but the last time I did, my understanding was that they were always accessed sequentially and are opened and closed individually if accessing more than one file.  Has this changed?  Is there some new buffering or caching that I'm unaware of?
From what I've read about remoting, it is best used between .NET machines in the same or different application domains, and it should not be used across platforms.  Am I right?

There are several servers that handle load distribution between the calling clients and the single backend box.  (This just seems wrong to me.)

The application's performance is extremely slow, in some cases upwards of 15-20 seconds to display a single screen that has no more than 1 flex grid with around 20 rows and 20 cells.

The infrastructure has been updated to facilitate higher bandwidth but the app isn't running any faster.

This is the second version of this type of application, the first being a web services app.  It was determined that it needed to move to .NET Remoting in order to gain speed and functionality that the web app didn't have.  It is interesting to note that the first app, although missing a great amount of functionality, is running faster than before and is outperforming the .NET Remoting app.

Question(s):

Is this the right development direction?
Should this be re-directed to a migration of the backend to newer technology?
How can this project be salvaged, if at all, and gain the performance necessary to make this the real time app that is required?

I know that this is more of a design and architecture question than a specific issue with a definitive resolution.  However I am looking for opinions from people that have experience in .Net Remoting that may have run across this type of project before.
Question by:alexhogan
1 Comment
 
LVL 3

Accepted Solution

by:
KeithWatson earned 2000 total points
ID: 12224588
In terms of performance, if your current system isn't performing as required then you'll have to make a change I guess. I think the key issues in terms of whether or not you would perform a migration of the back end would be:

- Are you sure the indexed file back end is the bottleneck? They're not necessarily slow; many databases use ISAM files as their implementation mechanism.
- If so, how many client applications currently use the indexed files? If it's only this single application, then a migration should be relatively simple. If it's multiple applications, you'll need to modify all these applications (assuming you don't have a convenient API wrapping the data access for all the files).
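On the first point, "indexed" is the key word: indexed flat files are not limited to sequential scans. A minimal sketch (illustrative only; real ISAM files keep the index on disk, usually as a B-tree, but the access pattern is the same) of keyed lookup over a flat file of pipe-delimited records:

```python
import io

def build_index(f):
    """Map each record's key (first field) to its byte offset."""
    index = {}
    f.seek(0)
    offset = f.tell()
    for line in f:
        key = line.split("|", 1)[0]
        index[key] = offset
        offset += len(line)
    return index

def lookup(f, index, key):
    """Fetch one record by key with a direct seek -- no sequential scan."""
    f.seek(index[key])
    return f.readline().rstrip("\n")

# Usage, with an in-memory stand-in for the data file:
data = io.StringIO("A100|widget|9.95\nA200|gadget|4.50\nA300|gizmo|2.25\n")
idx = build_index(data)
print(lookup(data, idx, "A200"))  # A200|gadget|4.50
```

If lookups against the real files behave like this, the 15-20 second screen times are more likely coming from somewhere else in the stack.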

Load balancing of multiple clients against a single server certainly seems wrong; you would surely need more than one server to load balance too. In your current scenario, load is not being balanced at all. I don't follow what your boxes are doing here at all.

As for remoting, you mention multiple platforms. When you say Unix, I assume you don't mean Linux. I don't know how you would make remote calls to .NET objects on a Unix platform; .NET is not available for Unix, apart from Linux via the Mono project (all respect to the Mono team, and I'm sure it will come on in leaps and bounds, but I think you'd be brave to include it in a mission-critical system at this point in time).

Another approach would be buying your way out of the problem; if I/O on the Unix box is the bottleneck, then there are plenty of Unix boxes available with massive I/O capability.

A common contemporary pattern for a cross-platform solution is to host web services in an application server as a middle tier containing the business logic, with a separate logical tier for data access. That logical data tier lets you change the database implementation without changing all of your clients' code.
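That separation can be sketched as an interface the business logic codes against, with one implementation per storage backend. All class and method names below are hypothetical, for illustration only:

```python
from abc import ABC, abstractmethod

class QuoteStore(ABC):
    """The logical data tier: business logic depends only on this."""
    @abstractmethod
    def get_quote(self, quote_id: str) -> dict: ...

class FlatFileQuoteStore(QuoteStore):
    """Today's backend: the indexed flat files on the Unix box."""
    def __init__(self, records: dict):
        self._records = records  # stand-in for the real file access
    def get_quote(self, quote_id):
        return self._records[quote_id]

class SqlQuoteStore(QuoteStore):
    """Tomorrow's backend: same interface, different storage."""
    def __init__(self, conn):
        self._conn = conn
    def get_quote(self, quote_id):
        row = self._conn.execute(
            "SELECT id, amount FROM quotes WHERE id = ?", (quote_id,)
        ).fetchone()
        return {"id": row[0], "amount": row[1]}

def quote_summary(store: QuoteStore, quote_id: str) -> str:
    """Business logic: written once, against the interface."""
    q = store.get_quote(quote_id)
    return f"Quote {q['id']}: {q['amount']}"

print(quote_summary(FlatFileQuoteStore({"Q1": {"id": "Q1", "amount": 250}}), "Q1"))
```

Swapping `FlatFileQuoteStore` for `SqlQuoteStore` changes nothing in `quote_summary` or any client that calls it, which is exactly what makes a later backend migration tractable.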

Hope that helps,

Keith.