Solved

Syntax issue with wget to include only 1 domain

Posted on 2006-05-10
Medium Priority
387 Views
Last Modified: 2010-03-18
Hi,
I am trying to restrict wget to a single domain, but when I use the --domains=mydomain.com parameter it doesn't seem to limit the download to just that domain.

Any ideas what I'm doing wrong?
Question by:jodyglidden
6 Comments
 
LVL 19

Expert Comment

by:Gabriel Orozco
ID: 16653743
Try adding -l 1:

       -l depth
       --level=depth
           Specify recursion maximum depth level depth.  The default maximum depth is 5.
Also:
--------------------
 Actually, to download a single page and all its requisites (even if they exist
 on separate websites), and make sure the lot displays properly locally, this author likes
 to use a few options in addition to -p:

 wget -E -H -k -K -p http://<site>/<document>

and I like this option:
------------------------
       -np
       --no-parent
           Do not ever ascend to the parent directory when retrieving recursively.  This is a useful
           option, since it guarantees that only the files below a certain hierarchy will be
           downloaded.
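A minimal sketch combining the options above (the URL is just a placeholder, not from this thread):

       wget -r -l 1 -np -p -k http://www.example.com/somedir/

Here -r turns on recursion, -l 1 caps the depth at one level, -np keeps wget from climbing above the starting directory, -p pulls in page requisites, and -k converts links for local viewing.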

 
LVL 1

Author Comment

by:jodyglidden
ID: 16653977
Hi, I'm trying to download the whole site, though.
 
LVL 19

Expert Comment

by:Gabriel Orozco
ID: 16655245
wget -E -H -k -K -p http://<site>/<document>
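
For reference, a sketch of the same command with the long option names from the wget manual of that era (the <site>/<document> placeholders are kept as-is):

       wget --html-extension --span-hosts --convert-links --backup-converted --page-requisites http://<site>/<document>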

 
LVL 1

Author Comment

by:jodyglidden
ID: 16658709
Hi,
I still can't seem to limit the download to just the domain I want using these parameters.
 
LVL 1

Author Comment

by:jodyglidden
ID: 16824398
I appreciate the attempts, but I can definitively tell you that these are not the answer.

 
LVL 19

Accepted Solution

by:
Gabriel Orozco earned 2000 total points
ID: 16830412
Sorry, what if you add -L?

From the manual:

When only relative links are followed (option `-L'), recursive retrieving will never span hosts. No time-expensive DNS-lookups will be performed, and the process will be very fast, with the minimum strain of the network. This will suit your needs often, especially when mirroring the output of various x2html converters, since they generally output relative links.

You can also limit which domains are followed, for cases with absolute links, say where two or three domains are hosted on the same server and you want all of them. It would look like this:

wget -r -H -Ddomain1.com,domain1.edu http://www.domain1.edu/

This will span hosts (-H) but limit the crawl to domain1.com and domain1.edu (-D), all recursive (-r).

Maybe this is what you were looking for.
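
Applying the same pattern to the single domain from the original question (mydomain.com is the asker's placeholder; the exact start URL here is an assumption), a sketch would be:

       wget -r -H -Dmydomain.com http://www.mydomain.com/

As in the example above, -H allows host spanning and -D then restricts which hosts are followed, so the two switches go together.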