Solved

low latency connection

Posted on 2010-09-09
465 Views
Last Modified: 2012-06-22
For a low-latency connection, why do we prefer RMI over IIOP and synchronous web service invocation?
Question by:cofactor
14 Comments
 
LVL 40

Expert Comment

by:gurvinder372
ID: 33634205
Synchronous web service invocation is an expensive process. Look at it this way:
1) On the client side, your proxy stubs (generated with wsdl2java) are serialized into an XML SOAP request (marshalling, XML construction).
2) That XML is deserialized back into objects on the server side (unmarshalling, XML parsing).
3) The corresponding method is then invoked on the server.
4) Its response is serialized again on the server side (marshalling, XML construction).
5) And deserialized again on the client side to get the response into the proxy objects (unmarshalling, XML parsing).

That naturally adds to the latency; XML parsing and marshalling are expensive.

With IIOP too, the request is sent in a wire format that has to be marshalled and unmarshalled on both sides.
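
To make those steps concrete, here is a minimal sketch of a synchronous call with the standard JAX-WS Dispatch API. The service name, port and endpoint URL are made-up placeholders (nothing from the question); the comments mark where the XML work happens and where the client simply blocks.

import javax.xml.namespace.QName;
import javax.xml.soap.MessageFactory;
import javax.xml.soap.SOAPMessage;
import javax.xml.ws.Dispatch;
import javax.xml.ws.Service;
import javax.xml.ws.soap.SOAPBinding;

public class SyncSoapCall {
    public static void main(String[] args) throws Exception {
        // Placeholder service/port names and endpoint URL -- take the real ones from your WSDL.
        QName serviceName = new QName("http://example.com/orders", "OrderService");
        QName portName = new QName("http://example.com/orders", "OrderPort");

        Service service = Service.create(serviceName);
        service.addPort(portName, SOAPBinding.SOAP11HTTP_BINDING,
                "http://example.com/orders/endpoint");
        Dispatch<SOAPMessage> dispatch =
                service.createDispatch(portName, SOAPMessage.class, Service.Mode.MESSAGE);

        // Step 1: build (marshal) the SOAP request -- XML construction happens here on the client.
        SOAPMessage request = MessageFactory.newInstance().createMessage();
        request.getSOAPBody().addChildElement("listOrders", "ns", "http://example.com/orders");

        // Steps 2-4 happen on the server; invoke() blocks until the response comes back.
        SOAPMessage response = dispatch.invoke(request);

        // Step 5: parse (unmarshal) the response XML back on the client.
        response.writeTo(System.out);
    }
}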


 
LVL 26

Expert Comment

by:ksivananth
ID: 33634235
I don't think RMI-IIOP and low latency are related in any way as far as better throughput goes!

Maybe the preference is because of the additional overhead (data) in RMI-IIOP.

A low-latency connection is preferred for synchronous communication because on a high-latency connection the chance of a failure while transferring the result is higher, which can make the whole request look as if it failed...
 
LVL 26

Expert Comment

by:ksivananth
ID: 33634252
I can't see how XML parsing impacts the latency, because the parsing happens only at the client or the server, not in the network... it shouldn't matter, IMHO.
 
LVL 40

Expert Comment

by:gurvinder372
ID: 33634267
<<the parsing happens only at the client or the server, not in the network>>
Yes, it happens only at the client or the server, but it is expensive. Just try marshalling a collection of more than 100 items and you will see the impact.
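
For a rough feel of that cost, here is a small JAXB sketch that marshals a 100-element list and times it. ItemList is a hypothetical payload type invented only for this illustration; the absolute numbers will vary, the point is that the work is not free.

import java.io.StringWriter;
import java.util.ArrayList;
import java.util.List;
import javax.xml.bind.JAXBContext;
import javax.xml.bind.Marshaller;
import javax.xml.bind.annotation.XmlElement;
import javax.xml.bind.annotation.XmlRootElement;

public class MarshalCost {

    // Hypothetical payload type, used only to measure marshalling cost.
    @XmlRootElement
    public static class ItemList {
        @XmlElement
        public List<String> items = new ArrayList<String>();
    }

    public static void main(String[] args) throws Exception {
        ItemList payload = new ItemList();
        for (int i = 0; i < 100; i++) {
            payload.items.add("item-" + i);
        }

        JAXBContext ctx = JAXBContext.newInstance(ItemList.class);
        Marshaller marshaller = ctx.createMarshaller();

        long start = System.nanoTime();
        StringWriter xml = new StringWriter();
        marshaller.marshal(payload, xml);   // object graph -> XML text
        long micros = (System.nanoTime() - start) / 1000;

        System.out.println("Marshalled " + payload.items.size() + " items in "
                + micros + " microseconds, " + xml.getBuffer().length() + " chars of XML");
    }
}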

 
LVL 26

Expert Comment

by:ksivananth
ID: 33634278
>>Just try marshalling a collection of more than 100 items and you will see the impact.

What I am trying to say is that the impact is on the client/server, not on the network; the latency question is about the size of the data and the time it takes to transfer it!
 
LVL 40

Expert Comment

by:gurvinder372
ID: 33634289
It all adds to the latency.
When the web service is invoked, all five of the steps mentioned above happen. In those five steps I am not even including the network latency; that comes on top.


 

Author Comment

by:cofactor
ID: 33634307
Could you guys tell me what exactly a low-latency connection is? Can I call a slow GPRS connection a low-latency connection, and DSL broadband a high-latency connection?
>>> on a high-latency connection the chance of a failure while transferring the result is higher
Failure? Why do you think so?
 
LVL 26

Expert Comment

by:ksivananth
ID: 33634313
>> In those five steps I am not even including the network latency

The question is just about network latency (why a low-latency connection is preferred for...), not the other extra stuff you mentioned!

Marshalling/unmarshalling and serialization/deserialization happen in asynchronous requests too... they are specific to the client/server implementation...
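
For what it is worth, here is the same kind of JAX-WS Dispatch sketch as above but with invokeAsync(); the names and endpoint are placeholders again. The XML marshalling/unmarshalling does not go away, only the blocking does.

import javax.xml.namespace.QName;
import javax.xml.soap.MessageFactory;
import javax.xml.soap.SOAPMessage;
import javax.xml.ws.Dispatch;
import javax.xml.ws.Response;
import javax.xml.ws.Service;
import javax.xml.ws.soap.SOAPBinding;

public class AsyncSoapCall {
    public static void main(String[] args) throws Exception {
        // Same placeholder service as in the synchronous sketch above.
        QName serviceName = new QName("http://example.com/orders", "OrderService");
        QName portName = new QName("http://example.com/orders", "OrderPort");

        Service service = Service.create(serviceName);
        service.addPort(portName, SOAPBinding.SOAP11HTTP_BINDING,
                "http://example.com/orders/endpoint");
        Dispatch<SOAPMessage> dispatch =
                service.createDispatch(portName, SOAPMessage.class, Service.Mode.MESSAGE);

        SOAPMessage request = MessageFactory.newInstance().createMessage();
        request.getSOAPBody().addChildElement("listOrders", "ns", "http://example.com/orders");

        // The request is still marshalled to XML when it is submitted...
        Response<SOAPMessage> pending = dispatch.invokeAsync(request);

        // ...the client thread is simply free to do other work while it is in flight.

        // The response is still unmarshalled when the result is consumed.
        SOAPMessage response = pending.get();
        response.writeTo(System.out);
    }
}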
 
LVL 26

Expert Comment

by:ksivananth
ID: 33634322
>> Can I call a slow GPRS connection a low-latency connection, and DSL broadband a high-latency connection?

It's the other way around: low latency goes with the fast connection, so a slow GPRS link is a high-latency connection and DSL broadband is a low-latency one!
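
Latency is about delay rather than raw transfer speed, so a crude way to see it is to time a single round trip. The sketch below times a TCP connect from plain Java; example.com and port 80 are placeholders. On a GPRS link this kind of round trip is typically several hundred milliseconds, on DSL usually a few tens.

import java.net.InetSocketAddress;
import java.net.Socket;

public class LatencyProbe {
    public static void main(String[] args) throws Exception {
        String host = "example.com";   // placeholder target
        int port = 80;

        Socket socket = new Socket();
        long start = System.nanoTime();
        // A TCP connect needs a full round trip, so its duration is a rough latency measure.
        socket.connect(new InetSocketAddress(host, port), 5000);
        long rttMillis = (System.nanoTime() - start) / 1000000;
        socket.close();

        System.out.println("TCP connect round trip: ~" + rttMillis + " ms");
    }
}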
 
LVL 26

Accepted Solution

by:
ksivananth earned 150 total points
ID: 33634329
>> Failure? Why do you think so?

Because on a slow connection the possibility of timeouts is higher, depending on the traffic... even a slight increase in traffic leads to more timeouts!
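
As a sketch of that failure mode: a synchronous caller usually blocks with a fixed timeout, so extra delay on the link pushes the request past it and the whole call looks failed. The URL and timeout values below are only illustrative.

import java.net.HttpURLConnection;
import java.net.SocketTimeoutException;
import java.net.URL;

public class TimeoutDemo {
    public static void main(String[] args) throws Exception {
        HttpURLConnection conn =
                (HttpURLConnection) new URL("http://example.com/service").openConnection();
        conn.setConnectTimeout(2000); // ms allowed to establish the connection
        conn.setReadTimeout(3000);    // ms allowed to wait for the response

        try {
            int status = conn.getResponseCode(); // blocks until a response arrives or the timeout fires
            System.out.println("HTTP " + status);
        } catch (SocketTimeoutException e) {
            // On a high-latency or congested link this branch fires far more often,
            // and to the caller the whole synchronous request appears to have failed.
            System.out.println("Request timed out: " + e.getMessage());
        } finally {
            conn.disconnect();
        }
    }
}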
 
LVL 40

Expert Comment

by:gurvinder372
ID: 33634535
<<the question is just about network latency>>
Can you show that to me in the original question?
 
LVL 26

Expert Comment

by:ksivananth
ID: 33635132
>>Can you show that to me in the original question?

Read the question:

>> For a low-latency connection, why do we prefer RMI over IIOP and synchronous web service invocation?
 
LVL 40

Expert Comment

by:gurvinder372
ID: 33635252
Why am I not able to see "network latency" in his question?
Are you saying latency is always related to the network?

I do agree with your point, though, that being synchronous is a big part of the reason, which I had missed.
 
LVL 26

Expert Comment

by:ksivananth
ID: 33635289
>> Why am I not able to see "network latency" in his question?

Read the question fully:

>> RMI over IIOP and synchronous web service

Distributed applications communicate over a network.

>> For a low-latency connection

That is the network connection for these apps!
