sunhux

asked:

Help to get wget syntax for Windows (translated from a wget command on Linux)

On my Linux box, I used to be able to issue the following command
to download an MRTG graph:

wget --no-check-certificate --convert-links -p --user=myloginid --password=\123456789 https://businessexprezz.com/mywebsite/bandwidth/latest.shtml

I have a wget build for Windows (see below for its full help output), but
I could not get the equivalent of the Linux parameters/options to work.

When I tried to issue wget as follows:
wget -p --user=myloginid --password=\123456789  www.businessexprezz.com/pim1/bandwidth/latest.shtml

it just showed the message below (IP address altered for security reasons) and then
paused there without saving anything:


Connecting to www.businessexprezz.com|202.6.161.25|:80... connected.
HTTP request sent, awaiting response...



I need someone to help me with the right syntax, or point me to a free
Windows version that can do what my Linux wget does.
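
For reference, based on the help output below, I would have expected the equivalent command
on Windows to look roughly like this (just a sketch; the login, password, and URL are the
same placeholders as above, and the backslash before the password should not be needed
under cmd.exe):

wget --no-check-certificate --convert-links -p --user=myloginid --password=123456789 https://businessexprezz.com/mywebsite/bandwidth/latest.shtml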


============== wget for Windows (help output) =================

c:\wgetchk> wget --help
GNU Wget 1.10, a non-interactive network retriever.
Usage: wget [OPTION]... [URL]...

Mandatory arguments to long options are mandatory for short options too.

Startup:
  -V,  --version           display the version of Wget and exit.
  -h,  --help              print this help.
  -b,  --background        go to background after startup.
  -e,  --execute=COMMAND   execute a `.wgetrc'-style command.

Logging and input file:
  -o,  --output-file=FILE    log messages to FILE.
  -a,  --append-output=FILE  append messages to FILE.
  -d,  --debug               print lots of debugging information.
  -q,  --quiet               quiet (no output).
  -v,  --verbose             be verbose (this is the default).
  -nv, --no-verbose          turn off verboseness, without being quiet.
  -i,  --input-file=FILE     download URLs found in FILE.
  -F,  --force-html          treat input file as HTML.
  -B,  --base=URL            prepends URL to relative links in -F -i file.

Download:
  -t,  --tries=NUMBER            set number of retries to NUMBER (0 unlimits).
       --retry-connrefused       retry even if connection is refused.
  -O,  --output-document=FILE    write documents to FILE.
  -nc, --no-clobber              skip downloads that would download to
                                 existing files.
  -c,  --continue                resume getting a partially-downloaded file.
       --progress=TYPE           select progress gauge type.
  -N,  --timestamping            don't re-retrieve files unless newer than
                                 local.
  -S,  --server-response         print server response.
       --spider                  don't download anything.
  -T,  --timeout=SECONDS         set all timeout values to SECONDS.
       --dns-timeout=SECS        set the DNS lookup timeout to SECS.
       --connect-timeout=SECS    set the connect timeout to SECS.
       --read-timeout=SECS       set the read timeout to SECS.
  -w,  --wait=SECONDS            wait SECONDS between retrievals.
       --waitretry=SECONDS       wait 1..SECONDS between retries of a retrieval.

       --random-wait             wait from 0...2*WAIT secs between retrievals.
  -Y,  --proxy                   explicitly turn on proxy.
       --no-proxy                explicitly turn off proxy.
  -Q,  --quota=NUMBER            set retrieval quota to NUMBER.
       --bind-address=ADDRESS    bind to ADDRESS (hostname or IP) on local host.

       --limit-rate=RATE         limit download rate to RATE.
       --no-dns-cache            disable caching DNS lookups.
       --restrict-file-names=OS  restrict chars in file names to ones OS allows.

       --user=USER               set both ftp and http user to USER.
       --password=PASS           set both ftp and http password to PASS.

Directories:
  -nd, --no-directories           don't create directories.
  -x,  --force-directories        force creation of directories.
  -nH, --no-host-directories      don't create host directories.
       --protocol-directories     use protocol name in directories.
  -P,  --directory-prefix=PREFIX  save files to PREFIX/...
       --cut-dirs=NUMBER          ignore NUMBER remote directory components.

HTTP options:
       --http-user=USER        set http user to USER.
       --http-password=PASS    set http password to PASS.
       --no-cache              disallow server-cached data.
  -E,  --html-extension        save HTML documents with `.html' extension.
       --ignore-length         ignore `Content-Length' header field.
       --header=STRING         insert STRING among the headers.
       --proxy-user=USER       set USER as proxy username.
       --proxy-password=PASS   set PASS as proxy password.
       --referer=URL           include `Referer: URL' header in HTTP request.
       --save-headers          save the HTTP headers to file.
  -U,  --user-agent=AGENT      identify as AGENT instead of Wget/VERSION.
       --no-http-keep-alive    disable HTTP keep-alive (persistent connections).

       --no-cookies            don't use cookies.
       --load-cookies=FILE     load cookies from FILE before session.
       --save-cookies=FILE     save cookies to FILE after session.
       --keep-session-cookies  load and save session (non-permanent) cookies.
       --post-data=STRING      use the POST method; send STRING as the data.
       --post-file=FILE        use the POST method; send contents of FILE.

HTTPS (SSL/TLS) options:
       --secure-protocol=PR     choose secure protocol, one of auto, SSLv2,
                                SSLv3, and TLSv1.
       --no-check-certificate   don't validate the server's certificate.
       --certificate=FILE       client certificate file.
       --certificate-type=TYPE  client certificate type, PEM or DER.
       --private-key=FILE       private key file.
       --private-key-type=TYPE  private key type, PEM or DER.
       --ca-certificate=FILE    file with the bundle of CA's.
       --ca-directory=DIR       directory where hash list of CA's is stored.
       --random-file=FILE       file with random data for seeding the SSL PRNG.
       --egd-file=FILE          file naming the EGD socket with random data.

FTP options:
       --ftp-user=USER         set ftp user to USER.
       --ftp-password=PASS     set ftp password to PASS.
       --no-remove-listing     don't remove `.listing' files.
       --no-glob               turn off FTP file name globbing.
       --no-passive-ftp        disable the "passive" transfer mode.
       --retr-symlinks         when recursing, get linked-to files (not dir).
       --preserve-permissions  preserve remote file permissions.

Recursive download:
  -r,  --recursive          specify recursive download.
  -l,  --level=NUMBER       maximum recursion depth (inf or 0 for infinite).
       --delete-after       delete files locally after downloading them.
  -k,  --convert-links      make links in downloaded HTML point to local files.
  -K,  --backup-converted   before converting file X, back up as X.orig.
  -m,  --mirror             shortcut option equivalent to -r -N -l inf -nr.
  -p,  --page-requisites    get all images, etc. needed to display HTML page.
       --strict-comments    turn on strict (SGML) handling of HTML comments.

Recursive accept/reject:
  -A,  --accept=LIST               comma-separated list of accepted extensions.
  -R,  --reject=LIST               comma-separated list of rejected extensions.
  -D,  --domains=LIST              comma-separated list of accepted domains.
       --exclude-domains=LIST      comma-separated list of rejected domains.
       --follow-ftp                follow FTP links from HTML documents.
       --follow-tags=LIST          comma-separated list of followed HTML tags.
       --ignore-tags=LIST          comma-separated list of ignored HTML tags.
  -H,  --span-hosts                go to foreign hosts when recursive.
  -L,  --relative                  follow relative links only.
  -I,  --include-directories=LIST  list of allowed directories.
  -X,  --exclude-directories=LIST  list of excluded directories.
  -np, --no-parent                 don't ascend to the parent directory.

Mail bug reports and suggestions to <bug-wget@gnu.org>.
sunhux

ASKER


After about 1-2 minutes, I got another message and wget reattempted the download:

HTTP request sent, awaiting response... Read error (Connection reset by peer) in headers.
Retrying.
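
To keep it from retrying indefinitely while I test, I'm thinking of capping the retries and
timeout with something like this (a sketch only, using the -t and -T options from the help
output above; same placeholders as before):

wget -t 2 -T 30 -p --user=myloginid --password=123456789 www.businessexprezz.com/pim1/bandwidth/latest.shtml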
sunhux

ASKER


I've tried Internet access via a proxy as well as without a proxy (direct via 3G broadband),
but it did not help.
sunhux

ASKER

I've also checked the other EE thread (below) but no luck:
 https://www.experts-exchange.com/questions/21489458/wget-errors.html
Note that I've also tried a direct Internet connection (without going through the proxy) just to reach gmail.

While running wget, if I check the netstat connections, I get:

C:\>netstat -ano 1 | find/i "74."
  TCP    192.168.205.107:4891   74.125.235.23:443      SYN_SENT        1304
  TCP    192.168.205.107:4891   74.125.235.23:443      SYN_SENT        1304
  TCP    192.168.205.107:4891   74.125.235.23:443      SYN_SENT        1304
  TCP    192.168.205.107:4891   74.125.235.23:443      SYN_SENT        1304
  TCP    192.168.205.107:4891   74.125.235.23:443      SYN_SENT        1304

where 74.125.235.23 is gmail.com's IP address and 192.168.x.y is my PC's IP address
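
As a quick sanity check on whether outbound port 443 is reachable at all, something like the
following could be tried (a sketch, assuming the Windows telnet client is available); if it
never connects, that would be consistent with the SYN_SENT entries above meaning the
connection attempts are being blocked upstream:

telnet gmail.com 443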
sunhux

ASKER


OK, my apologies, it was a proxy issue: it's fixed after I entered the correct password
and used a direct Internet connection.  But now I get a different error:

wget-1.10.2.exe -p --user=myloginid --password=mypasswd --ignore-length --convert-links --no-check-certificate https://www.businessexpress.com/qim1/bandwidth/latest.shtml
--17:30:04--  https://www.businessexpress.com/qim1/bandwidth/latest.shtml
           => `www.businessexpress.com/pim1/bandwidth/latest.shtml'
Resolving www.businessexpress.com... 202.6.160.24
Connecting to www.businessexpress.com|202.6.161.25|:443... connected.
WARNING: Certificate verification error for www.businessexpress.com: unable to get local issuer certificate
HTTP request sent, awaiting response... 401 Unauthorized

FINISHED --17:30:06--
Downloaded: 0 bytes in 0 files
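
I suppose the next step is to find out which authentication scheme the server is actually
asking for in that 401 response; a rough sketch of what I could run (same placeholders),
using -S from the help output above to print the server's response headers, including its
WWW-Authenticate header:

wget-1.10.2.exe -S --no-check-certificate --user=myloginid --password=mypasswd https://www.businessexpress.com/qim1/bandwidth/latest.shtml

But I'm not sure what to look for in that output.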


So what do I do next?
ASKER CERTIFIED SOLUTION
Shalom Carmel

This content is only available to members.
To access this content, you must be a member of Experts Exchange.
sunhux

ASKER

Can you point me to a good wget mailing list with active
discussions?


>If you can get to this web page with firefox, get the Tamper Data firefox addon, and
>use it to trace the session and see if there are any unusual HTTP headers involved.

Can you elaborate on how I get the Tamper Data add-on, and how to install and use it?
SOLUTION
This content is only available to members.
To access this content, you must be a member of Experts Exchange.
sunhux

ASKER


I'm getting nowhere.  I'm still getting that error, and the Firefox add-on approach
is a bit too involved for me to troubleshoot with.
sunhux

ASKER

ok