wget
Joel Jaeggli
joelja at darkwing.uoregon.edu
Fri Apr 16 04:21:26 UTC 2004
On Thu, 15 Apr 2004, Tom 'Needs A Hat' Mitchell wrote:
> On Thu, Apr 15, 2004 at 10:41:39AM -0700, Gunnar vS Kramm wrote:
> > Original e-mail from: Matthew Benjamin (msbenjamin at fedex.com):
> >
> > > Does anyone know how to use wget to drill down to all of the folders and
> > > subdirectories in a website. I can mirror my website however it does not
> > > grab all of the folders which contain data that the links go to. The
> > > site is password protected.
> > >
> > > mattB.
> ....
>
> > You should be able to use the -r switch to wget, as such:
> > wget -r http://YourWebSite
>
> Also, does his web site have a robots file?
>
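Since the original question said the site is password protected, -r alone may not be enough; wget also needs credentials, and the robots file can block recursion. A sketch along those lines (the URL, user name, and password are placeholders):

```shell
# Recursive mirror of a password-protected site (HTTP basic auth).
# -r      recurse into linked folders/subdirectories
# -np     don't ascend to parent directories
# -e robots=off  ignore the robots file (only do this on a site you own)
wget -r -np -e robots=off \
     --user=yourname --password='yourpass' \
     http://YourWebSite/
```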
The other thing that I don't think came up in this thread: if you have
control over the machine on which the website sits, it's a heck of a lot
more efficient to use rsync to mirror all the files than it is to use
wget.
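For the rsync route, a minimal sketch assuming you have SSH access to the box (the host name and paths are placeholders):

```shell
# Mirror the document root over SSH.
# -a       archive mode: preserve permissions, times, symlinks
# -z       compress data in transit
# --delete remove local files that no longer exist on the server
rsync -az --delete user@YourWebServer:/var/www/html/ ./mirror/
```

Unlike wget, rsync only transfers files that changed since the last run, so repeated mirrors are cheap.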
--
--------------------------------------------------------------------------
Joel Jaeggli Unix Consulting joelja at darkwing.uoregon.edu
GPG Key Fingerprint: 5C6E 0104 BAF0 40B0 5BD3 C38B F000 35AB B67F 56B2
More information about the fedora-list
mailing list