VB.NET - How do I download a webpage that requires cookies, etc?

I've got a program that directly queries URLs, but I get junk results from web pages that require cookies, among other things.

Is there some way to simulate cookies programmatically, or is there some way to use a browser daemon or something to do the querying?

AkisC commented:
' Requires a COM reference to Microsoft XML (MSXML2)
Dim strPost As String = ""
' Build your POST body here if the page requires a login or other form data.

Dim postURL As String = ""

Dim vReportvFileToChk As String = ""   ' will hold the page HTML
Dim MyCookie As String = ""            ' will hold the Set-Cookie header

Dim objHTTP As New MSXML2.ServerXMLHTTP
objHTTP.open("POST", postURL, False)
objHTTP.setRequestHeader("Content-Type", "text/xml; charset=utf-8")
objHTTP.send(strPost)   ' use objHTTP.send() if you have nothing to post

If objHTTP.readyState = 4 Then
    If objHTTP.status = 200 Then
        vReportvFileToChk = objHTTP.responseText
        Dim aCookie As String = objHTTP.getResponseHeader("Set-Cookie")
        MyCookie = aCookie
    End If
End If
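
Once you have captured the cookie, you need to send it back on the follow-up request. Here is a minimal sketch, assuming MyCookie holds the value captured above and that pageURL (a made-up name) is the page you actually want. Note that replaying the raw Set-Cookie value is crude, since it can carry attributes like Path and Expires, but it is often enough for simple sites.

Dim pageURL As String = "http://example.com/report.aspx"   ' hypothetical target page

Dim objHTTP2 As New MSXML2.ServerXMLHTTP
objHTTP2.open("GET", pageURL, False)
objHTTP2.setRequestHeader("Cookie", MyCookie)   ' replay the captured session cookie
objHTTP2.send()

If objHTTP2.status = 200 Then
    Dim pageHtml As String = objHTTP2.responseText   ' the HTML that previously came back as junk
End If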
Generally: I'm interested to know the answer to this one as well... I'd assign additional points to the correct answer, if I could :-)

To the OP: what type of information is in the cookies whose absence gives you junk results? I recently got around an autologin cookie by having my page complete the login process itself... That answer is here: http://www.experts-exchange.com/Programming/Languages/.NET/Visual_Basic.NET/Q_23785739.html

Obviously, if your cookie does more than handle the login, then you need another answer... good luck
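
For reference, the same "complete the login yourself" idea can be done with the plain .NET classes instead of the MSXML COM object. This is only a sketch, with hypothetical URLs and form-field names you would replace with whatever your target site actually uses; the point is that one CookieContainer is shared by both requests, so the session cookies from the login POST are sent automatically on the second request.

Imports System.Net
Imports System.IO
Imports System.Text

Module LoginFetchSketch
    Sub Main()
        Dim cookies As New CookieContainer()

        ' 1. POST the login form; the CookieContainer captures any Set-Cookie headers.
        Dim loginReq As HttpWebRequest = CType(WebRequest.Create("http://example.com/login"), HttpWebRequest)
        loginReq.Method = "POST"
        loginReq.ContentType = "application/x-www-form-urlencoded"
        loginReq.CookieContainer = cookies

        Dim postData As Byte() = Encoding.ASCII.GetBytes("username=me&password=secret")
        loginReq.ContentLength = postData.Length
        Using reqStream As Stream = loginReq.GetRequestStream()
            reqStream.Write(postData, 0, postData.Length)
        End Using
        loginReq.GetResponse().Close()

        ' 2. Fetch the real page with the same CookieContainer, so the
        '    session cookies from step 1 ride along automatically.
        Dim pageReq As HttpWebRequest = CType(WebRequest.Create("http://example.com/report"), HttpWebRequest)
        pageReq.CookieContainer = cookies

        Using resp As HttpWebResponse = CType(pageReq.GetResponse(), HttpWebResponse)
            Using reader As New StreamReader(resp.GetResponseStream())
                Console.WriteLine(reader.ReadToEnd())
            End Using
        End Using
    End Sub
End Module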
hamlin11 (Author) commented:
I'll let you both know how this works asap