  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 316

Does HTTP Protocol support random access?

When I use InternetSetFilePointer to read a block of data at position 100000, I found it only works after all the data before position 100000 has been downloaded, and that is not what I want. Can anyone tell me why?
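Roughly what I am doing now (just a sketch of the calls involved; the URL is a placeholder and error handling is left out):

// Sketch only -- the URL is a placeholder, error handling omitted.
HINTERNET hNet  = InternetOpen("myapp", INTERNET_OPEN_TYPE_PRECONFIG, NULL, NULL, 0);
HINTERNET hFile = InternetOpenUrl(hNet, "http://www.example.com/bigfile.dat",
                                  NULL, 0, INTERNET_FLAG_RELOAD, 0);

DWORD dwPos = InternetSetFilePointer(hFile, 100000, NULL, FILE_BEGIN, 0);

char  buffer[4096];
DWORD dwRead = 0;
// This read only succeeds after everything before offset 100000
// has been downloaded as well -- which is the problem.
InternetReadFile(hFile, buffer, sizeof(buffer), &dwRead);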
Asked by: linyf
1 Solution
 
snoegler Commented:
DWORD dwStartFrom = 100000;                      // byte offset to start reading at
CString temp;
temp.Format("Range: bytes=%lu-", dwStartFrom);   // ask for everything from that offset on
pFile->AddRequestHeaders(temp, HTTP_ADDREQ_FLAG_COALESCE);
pFile->SendRequest();

Insert this code after a successful OpenRequest() call.
This works with most servers; some won't support it, though.

The reason your method doesn't work lies within the
Internet API; I can't tell you exactly why :(
But this snippet is part of a working application and
shouldn't cause any problems.
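In context, the whole request could look something like this (just a sketch; the host name, object path and offset are placeholders, and error handling/cleanup is omitted):

// Sketch only: host/path/offset are placeholders, cleanup omitted.
CInternetSession session;
CHttpConnection* pConn = session.GetHttpConnection("www.example.com");
CHttpFile* pFile = pConn->OpenRequest(CHttpConnection::HTTP_VERB_GET,
                                      "/bigfile.dat");

CString strRange;
strRange.Format("Range: bytes=%lu-", (DWORD)100000);
pFile->AddRequestHeaders(strRange, HTTP_ADDREQ_FLAG_COALESCE);
pFile->SendRequest();

DWORD dwStatus = 0;
pFile->QueryInfoStatusCode(dwStatus);
if (dwStatus == HTTP_STATUS_PARTIAL_CONTENT)     // 206: server honored the range
{
    char buf[4096];
    UINT nRead = pFile->Read(buf, sizeof(buf));  // data starts at offset 100000
}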
 
lucidity Commented:
You can tell whether a server supports this by checking whether it supports resuming downloads.
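One rough way to check from code (a sketch only, assuming pFile is the CHttpFile the request above was sent on) is to look at the Accept-Ranges response header; note that some servers honor ranges without sending it, so the 206 status check is the more reliable signal:

// Sketch: query the Accept-Ranges header after the request has been sent.
// With HTTP_QUERY_CUSTOM the buffer holds the header name on input and
// receives its value on output.
char  szValue[64] = "Accept-Ranges";
DWORD dwLen = sizeof(szValue);
BOOL  bRangesOk =
    HttpQueryInfoA((HINTERNET)*pFile, HTTP_QUERY_CUSTOM, szValue, &dwLen, NULL)
    && strcmp(szValue, "bytes") == 0;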
 
linyf (Author) Commented:
Thank you very much! The method you described really works.
But I still have a bigger problem: what I really want to do is read data randomly from a file on an Internet server, just like reading from a local file with Seek(). I tried, and found that HTTP is not an appropriate protocol because it is too slow. I want to know whether there is another protocol that meets my need.
Such a protocol doesn't need many functions; it just has to provide a service I can use to read/write data like a random-access file.



 
snoegler Commented:
Umm ...
Internet protocols aren't designed for random access.

One approach could be to split the file you want to access
into 1-4 KB portions (choose a size that won't leave unused
space on your hard disk, i.e. the cluster size).

Another approach might be to use server-side CGI scripts,
which do the random access for you and deliver HTTP
responses containing just the information you need.
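For example (a sketch only; "readblock.cgi" and its offset/length parameters are made up, you would have to write that script yourself):

// Sketch: ask a hypothetical server-side script for one block of the file.
CInternetSession session;
CString strUrl;
strUrl.Format("http://www.example.com/cgi-bin/readblock.cgi?offset=%lu&length=%d",
              (DWORD)100000, 4096);
CStdioFile* pResp = session.OpenURL(strUrl, 1, INTERNET_FLAG_TRANSFER_BINARY);

char buf[4096];
UINT nRead = pResp->Read(buf, sizeof(buf));   // the requested block
pResp->Close();
delete pResp;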

If you used sockets directly, maybe FTP would be fast
enough for you. A command sequence like this can perform
'random' access (a rough sketch follows the list):

(1.) TYPE I   (binary mode)

(2.) REST (random access position)
(3.) RETR my_file.dat
...
Read as long as you need, then
(4.) ABOR

Then proceed with step (2.) until you're done
(5.) QUIT
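
A rough sketch of that sequence over an already-connected control socket (Winsock; logging in and the PASV/data connection that actually carries the bytes are left out, this only shows the command side):

#include <winsock2.h>
#include <stdio.h>
#include <string.h>

// Sketch only: 'ctrl' is a connected, logged-in FTP control socket (port 21).
void SendCmd(SOCKET ctrl, const char* cmd)
{
    send(ctrl, cmd, (int)strlen(cmd), 0);
    char reply[512];
    recv(ctrl, reply, sizeof(reply), 0);          // read the server's reply line
}

void ReadBlockAt(SOCKET ctrl, unsigned long offset)
{
    SendCmd(ctrl, "TYPE I\r\n");                  // (1.) binary mode

    char restCmd[64];
    sprintf(restCmd, "REST %lu\r\n", offset);
    SendCmd(ctrl, restCmd);                       // (2.) restart offset

    SendCmd(ctrl, "RETR my_file.dat\r\n");        // (3.) transfer starts at 'offset'
    // ... read as much as you need from the data connection ...
    SendCmd(ctrl, "ABOR\r\n");                    // (4.) then abort the transfer
}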

Let me know if this is interesting; I've got some links
somewhere to an FTP wrapper class that makes it easier to implement.
 
lucidity Commented:
If you are using servers where you have access rights, you may be better off writing a server-side script which does the file I/O; you just tell it what to do.
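A minimal sketch of such a CGI (the parameter names and the data file name are placeholders, and there is no validation or error handling; on Windows you would also have to put stdout into binary mode):

// Sketch of a CGI that returns 'length' bytes starting at 'offset'
// from a fixed data file.
#include <stdio.h>
#include <stdlib.h>

int main()
{
    unsigned long offset = 0;
    unsigned long length = 0;
    const char* query = getenv("QUERY_STRING");   // e.g. "offset=100000&length=4096"
    if (query)
        sscanf(query, "offset=%lu&length=%lu", &offset, &length);

    printf("Content-Type: application/octet-stream\r\n\r\n");

    FILE* fp = fopen("mydata.dat", "rb");         // the file being queried
    if (fp)
    {
        fseek(fp, (long)offset, SEEK_SET);
        char buf[4096];
        while (length > 0)
        {
            size_t chunk = length < (unsigned long)sizeof(buf)
                               ? (size_t)length : sizeof(buf);
            size_t got = fread(buf, 1, chunk, fp);
            if (got == 0) break;
            fwrite(buf, 1, got, stdout);          // send the bytes back to the client
            length -= (unsigned long)got;
        }
        fclose(fp);
    }
    return 0;
}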

Jason
 
linyf (Author) Commented:
Thank you!
Your comment is right, but I can't split my file into small pieces, because I format the file like a database and want to query data from it over the network.
I don't know whether FTP is fast enough for random access; if it isn't, it seems I will have to redesign my file format from scratch.
I am very interested in the FTP wrapper class you've mentioned. Can you tell me where to get it?
