sakytech asked:

How to write a batch script that goes to a webpage and saves that page as .csv

Hi,
I am looking for a script which can save a webpage as a .csv file.
This particular page has text content.
What I do now is go to that intranet site, click Save As name.csv, convert it into a text file, and then send that to the ETL team.
Is there a way I can automate this process, up to and including converting it into a txt file, using a batch script?
Any help is appreciated.
knightEknight:

Google for and download a tool called httpget.exe.

Then in your script you can do this:

httpget.exe  "http://server.com/webpage.ext"  >  result.csv
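
Wrapped in a batch file, that might look something like this; just a sketch, assuming httpget.exe sits on the PATH or next to the script, and with a placeholder URL:

@echo off
REM Fetch the page and save whatever the server returns as a .csv file
REM (httpget.exe and the URL below are placeholders - adjust for your environment)
httpget.exe "http://server.com/webpage.ext" > result.csv
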
sakytech (Asker):

Thanks for the reply, Knight.
But I can't download a 3rd-party tool on the system.
I tried wget, but that doesn't work; it errors out. I tried this: [wget -l 1 c:\test.csv http://<URL>]. That does pretty much what I want, but it won't work for my intranet URL; it throws error 404. Any insights?

Any suggestions?

Thanks,
Hmm, wget should work just as well.  Try putting quotes around the URL ... other than that, I can't offer much more, sorry.
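
For example, something like this; the URL is just a placeholder, and -O tells wget which file to write the page into:

wget -O result.csv "http://yourserver/yourpage"
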
@knight, well, I tried that and it won't work; the batch script does nothing. So I tried it from the command prompt, which gives the error. Does it have anything to do with the webpage? This site is accessible only within the company's network.

Anyway, thanks for trying.
Are you running this command on a computer inside the company, or outside the company?

~bp
Thanks, Bill, for replying.

Inside the company. Just trying to automate the process.
Do you have any special characters in the URL string you are trying to invoke, like >, <, &, ^, or | perhaps?  Those can cause problems since they are special characters to the command shell.
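
For illustration, with a made-up URL that has an & in its query string, these are the two usual ways to keep cmd from interpreting it:

REM Quote the whole URL so the shell leaves the special characters alone
wget -O result.csv "http://yourserver/report?dept=etl&format=csv"

REM Or, without quotes, escape each special character with a caret
wget -O result.csv http://yourserver/report?dept=etl^&format=csv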

~bp
I was thinking the same thing Bill ... I thought maybe quoting the url would resolve that.
Agreed, knight, quoting often works in general; I haven't done a lot with wget, so I wasn't sure. And without seeing the specific URL it's hard to offer many thoughts on what might be causing the problem.

sakytech is it possible for you to share the full command line exactly as you are executing it?  Since the site is internal to your network I don't think sharing the URL would be a security risk, but that's best for you to decide.

Also, if you just do the following, does that open the page in your default browser?

START "http://yoururl.html"
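
One caveat on that test: START treats the first quoted argument as a window title, so if the quoted URL by itself does nothing, try passing an empty title first:

START "" "http://yoururl.html"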

~bp
Thanks for the response.
The link looks like this: http://icwebprod.wv.xyz.com/scripts/saky/

Yes, it works just fine when I do start "http://icwebprod.wv.xyz.com/scripts/saky".
As you mentioned, placing the quotes worked in the command prompt, but not in the batch script:
[@echo off
cls
wget "http://icwebprod.wv.xyz.com/scripts/saky"]
Am I missing something?
Also, the conversion to .csv and then to .txt: I have no clue how to do that in a script!
Does anyone know how?

Thanks, Bill, Knight

wget / httpget will create the file in the source format ... so if you wget a .jpg file it will create a .jpg file, if you wget a text file it will create a text file (by any name or extension you give it, e.g.  .txt or .csv or whatever).   In other words, wget will not "convert" from one format to another for you -- it all depends on what is returned from the specified URL.
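
So if that page already returns comma-separated text, the "conversion" is really just a question of what file name you save it under. A rough sketch, assuming wget is available and using the URL posted above:

@echo off
REM Save the page contents straight into the .csv file
wget -q -O name.csv "http://icwebprod.wv.xyz.com/scripts/saky/"
REM The .txt copy for the ETL team is the same content under a different extension
copy /Y name.csv name.txt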
Have you tried adding the -d option to the command line? That should provide some additional debugging info from wget.
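
For example, with a placeholder URL; the redirection captures wget's messages, which go to stderr, in a log file you can review:

wget -d -O result.csv "http://yourserver/yourpage" > wget-debug.log 2>&1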

~bp
ASKER CERTIFIED SOLUTION
Bill Prew

Yeah Bill, sorry, I was caught up with different work. It works just fine; I missed that syntax!

Thanks for the help! It is solved!
==> sakytech

If this is solved can you accept and close?

~bp