
Status: Not open for further replies.
1 - 5 of 5 Posts

Registered · 1 Posts · Discussion Starter #1
Hi,

I have a question about wget; I hope this is the right forum for it.

I want to download a number of files from a website. They are located in this kind of structure:
http://www.host/mainDir/dir001/[files]

There are directories from dir001 to dir090 or so, and no other directories under mainDir. The HTML files, however, sit directly under mainDir, named index001.htm through index090.htm.

How do I download all the files from that area?

I've tried something like:
wget -r -np -A [filetype] http://host/mainDir/subDir/
with no success.

There is also an access issue: the HTML pages with the links are directly under http://www.host, but the files are located under http://user:[email protected] The HTML files are not available under members (404 error).

Please help me with this...
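For a layout like that, recursion plus HTTP authentication might be enough. A sketch, where www.host, [filetype], user, and password are the placeholders from the post, not real values:

```shell
# Sketch only: host, file type, and credentials are placeholders.
# -r -l 2  : recurse, at most two levels below mainDir (index pages -> files)
# -np      : never climb above mainDir
# -A       : keep only files matching the wanted type
# --http-user/--http-password : credentials for the protected files
wget -r -l 2 -np -A '[filetype]' \
     --http-user=user --http-password=password \
     "http://www.host/mainDir/"
```

Since the index pages and the files appear to be on the same host, recursion starting at mainDir plus the credentials should reach both; -np keeps wget from wandering into the rest of the site.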
 

Registered · 1 Posts
Hi,

I started using wget a few days ago. To understand how it works, I'm trying to read through the freely available source code, but I don't know how to proceed.

Can anyone suggest how to get started?
Thanks in advance.

Regards,
Ram.
 

Manager, Alternative Computing, Design · 6,502 Posts
Hi,

I would recommend a couple of things to learn more about wget. First, open a terminal and type:
Code:
man wget
This will open the manual page for wget. To quit man, just hit "q" (without quotes). I would also recommend looking at your distro's documentation to see what it has on wget.
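Besides the man page, wget ships its own quick reference; assuming a typical installation, any of these works:

```shell
man wget            # full manual page; press q to quit
wget --help | less  # condensed list of every command-line option
info wget           # GNU info manual, with longer worked examples
```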

Cheers!
 

Registered · 1 Posts
Hi,

I want to download all pages in a directory, say http://www.example.com/directory1/[files] and not the rest of http://www.example.com

However, to display these pages correctly, I need the images, which are not contained in that directory but in http://images.example.com/[files]. I don't want all the images in the images directory, only those necessary to display those pages.

Put another way, I want something like "page requisites", but applied to every page in a directory, without entering each one individually.

I have read the manual, but I have no experience with wget, and don't know how to put the commands together.
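Combining recursion with --page-requisites and restricted host spanning might do what is described here. A sketch using the example URLs from the post:

```shell
# -r -np : recurse within directory1 only, never up into the rest of the site
# -p     : fetch everything needed to display each page (--page-requisites)
# -H -D  : allow crossing to other hosts, but only the listed domains,
#          so images come from images.example.com without pulling it all in
wget -r -np -p -H -D www.example.com,images.example.com \
     "http://www.example.com/directory1/"
```

With this combination, only images actually referenced by pages under directory1 are fetched, because -p requests a page's requisites rather than crawling the images host on its own.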
 