I’m working on a script that downloads a page or a whole directory, and I’m having trouble recreating the structure of a site locally. For example, if I want to download Apple's website:
1 - I enter the following URL: www.apple.com/
2 - My script creates a folder on the desktop named www.apple.com
3 - Then it takes all the URLs from the page and downloads them, but I have no idea how to put each file in the right place.
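A minimal sketch of steps 1 and 2 as they stand, with the folder name taken from the host part of the entered URL (the variable names are assumptions):

set enteredURL to "www.apple.com/"
-- the host part becomes the folder name on the desktop
set AppleScript's text item delimiters to "/"
set siteFolderName to text item 1 of enteredURL -- "www.apple.com"
set AppleScript's text item delimiters to ""
set siteFolder to (path to desktop as text) & siteFolderName
do shell script "mkdir -p " & quoted form of POSIX path of siteFolder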
All ideas are welcome
Jean-Baptiste LE STANG
What are you using for downloads?
Rob
The first thing that comes to my mind is that you could play with the absolute/relative concept of a URL. This probably involves some testing and parsing of every followed URL; that's the hard part (I think), since from then on AppleScript can take over in the relative way (that is, relative to the local www root folder). If you want the local copy to be fully functional, however, you will have to rewrite every link in every downloaded page to be relative if it isn't already.
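A minimal sketch of the parsing half of that idea, mapping an absolute URL onto a local folder tree rooted on the desktop, so the host becomes the top folder (the example URL is an assumption):

-- Map an absolute URL onto a mirrored folder hierarchy under the desktop.
set theURL to "http://www.apple.com/support/downloads/index.html"
set AppleScript's text item delimiters to "://"
set urlPath to text item 2 of theURL -- "www.apple.com/support/downloads/index.html"
set AppleScript's text item delimiters to "/"
set urlParts to text items of urlPath
set AppleScript's text item delimiters to ":"
set localFolder to (path to desktop as text) & ((items 1 thru -2 of urlParts) as text)
set AppleScript's text item delimiters to ""
-- create the intermediate folders; the file name is the last URL segment
do shell script "mkdir -p " & quoted form of POSIX path of localFolder
set localFile to localFolder & ":" & item -1 of urlParts
-- note: for a directory URL ending in "/", the last segment is empty, so a default name such as index.html is needed

Each downloaded file then lands at the same relative position it had on the server, which also makes rewriting links to relative paths easier.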
Second, how deep will you go? Are you going to download every link? If so, be prepared to download terabytes; I feel recursion in my nostrils… be careful. Ah yes, you could limit the number of downloads by testing for equality of the hostname part of the URL; relative URLs won't be a problem then.
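That hostname test could look something like this; a minimal sketch, where the handler name and example URLs are assumptions:

-- Return the host part of an absolute URL, e.g. "www.apple.com".
-- Assumes an absolute URL; relative links have no host to test.
on hostOfURL(theURL)
	set AppleScript's text item delimiters to "://"
	set theRemainder to text item 2 of theURL
	set AppleScript's text item delimiters to "/"
	set theHost to text item 1 of theRemainder
	set AppleScript's text item delimiters to ""
	return theHost
end hostOfURL

set startHost to hostOfURL("http://www.apple.com/")
if hostOfURL("http://store.apple.com/something.html") is startHost then
	-- same site: download it and recurse
else
	-- different host: skip it to keep the crawl bounded
end if

Relative links can be followed unconditionally, since by definition they stay on the same host.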
greets, karl.
There could be, but I'm not familiar with any.
Rob
Rob, I'm using URL Access. Is there a scripting addition that does the same job?
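For reference, the URL Access Scripting call in question looks roughly like this; a minimal sketch, where the target path is an assumption and the enclosing folder must already exist:

tell application "URL Access Scripting"
	download "http://www.apple.com/" to file ((path to desktop as text) & "www.apple.com:index.html") replacing yes
end tell

(If no scripting addition turns up, the same transfer could also be done by shelling out to curl with do shell script.)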
Jean-Baptiste LE STANG
OK, thanks.