How to copy all links displayed on a specific site?

Hi Experts,

I would like to copy all links displayed on a specific page, including sub-links.
For example the following
https://howto.caspio.com/tech-tips-and-articles/
How can I accomplish that?

Commented:
Export/Print the page to PDF.  The links should be embedded in the PDF doc.
Hi,

I need it to be in a notepad file, with each link on a separate line.

Thanks,
Ben
Hi,
I suggest a simple solution using Google Sheets: open a new Google Sheet and paste this function =IMPORTXML("https://howto.caspio.com/tech-tips-and-articles/","//a/@href")

It will automatically return all the links on the webpage.
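For anyone who prefers a script over a spreadsheet, the same //a/@href idea can be sketched with Python's standard library. This is only an illustration, not part of the thread's solution; names like extract_links are mine, and you would still need to fetch the page HTML yourself (e.g. with urllib.request):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects the href attribute of every <a> tag, mirroring //a/@href."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html: str) -> list:
    """Return every hyperlink target found in the given HTML string."""
    parser = LinkCollector()
    parser.feed(html)
    return parser.links

# To get the asker's notepad file, one link per line:
# with open("links.txt", "w") as f:
#     f.write("\n".join(extract_links(page_html)))
```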
Hi,

Sorry for asking this but I never used google sheets before.
Would you be able to guide me?

Thanks,
Ben
Commented:
I just tried abbas abdulla's method and it's very cool.

Just sign into Google; if Sheets isn't listed as an app, go to "Docs". Once in Docs, open the menu and select "Sheets".

I was going to suggest Cygwin and using "wget" but the Sheets method is much cleaner.
Hi Experts,

This seems to be working!
Just wondering, what's the meaning of "//a/@href"?

Thanks,
Ben
Thank you experts!
"//a/@href" is xpath syntax in google sheet and in HTML it means hyperlink tag. With importxml function you are just telling Google sheet import everything has this tag from the website.