I would like to go to a website, enter a specific filename into a search field, click a linked image in the search results, and then click a link to download an HQX of the specified file.
Let me detail the manual process for downloading these files after logging in to the website:
Enter a specific unique number, e.g. CX234566, in the search field and hit Search.
–result: a page with a thumbnail of the image, which is a link to that image's download page
Click the thumbnail of the image.
–result: a page with the HQX file download available
Click the HQX file to start the download.
–result: the file is downloaded to your computer
Repeat.
Note: There is no process available for multiple downloads on this website.
Our subscription allows us full access to unlimited downloads from this website.
I have created a list of files I need to download.
My goal is to create a repeat loop and download the needed files from that list.
What I need is direction on which scripting approach to take to achieve this:
Safari, URL scripting, or JavaScript.
There are plenty of options. I would use “curl” (the *nix command-line tool) together with AppleScript, so the whole process can run in the “background”. Something like this (pseudo-code follows):
set searchResults to (do shell script "curl 'http://www.site.com/search.php?str=XXXXXXXX'")
--> searchResults now contains the HTML
--> parse it and find the next link
set downloadPage to (do shell script "curl " & quoted form of foundLinkInSearchResults)
--> downloadPage now contains the HTML
--> parse it and find d/l link
do shell script "cd ~/Desktop; curl -O " & quoted form of finalDownloadURLHQX
--> this downloads the file to your desktop (-O, capital, saves it under the remote file name)
Simply enclose it all in a repeat loop and loop through the list of search strings (XXXXXXX, YYYYYYYY, ZZZZZZZZZ…).
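The same chain can also be fleshed out as a plain shell script, which AppleScript can invoke with a single `do shell script`. This is only a sketch: the search URL (`search.php?str=...`) and the `href` patterns used for parsing are assumptions, so adjust them to the site's actual HTML.

```shell
#!/bin/sh
# Sketch of the search -> thumbnail -> HQX download chain.
# The URL and href patterns below are assumptions about the site's markup.

# Pull the first matching href out of HTML read from stdin.
extract_link() {  # usage: extract_link REGEX_FRAGMENT < html
  grep -o 'href="[^"]*'"$1"'[^"]*"' | head -n 1 | sed 's/^href="//; s/"$//'
}

# Run the whole chain for each search ID passed as an argument.
download_all() {
  for id in "$@"; do
    search_html=$(curl -s "http://www.site.com/search.php?str=$id")
    detail_url=$(printf '%s' "$search_html" | extract_link 'download')
    page_html=$(curl -s "$detail_url")
    hqx_url=$(printf '%s' "$page_html" | extract_link '\.hqx')
    (cd ~/Desktop && curl -s -O "$hqx_url")  # -O keeps the remote file name
  done
}

# Example: download_all CX234566 CX234567
```

From AppleScript you could then run the whole batch with one call, e.g. `do shell script "/path/to/fetch_hqx.sh CX234566 CX234567"` (script path hypothetical). Note that a regex is a fragile way to parse HTML; it works for simple, stable pages, but if the site's markup changes, the patterns need updating.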