Scripting Downloads

Hey guys

I really love Automator, but it’s a bit buggy when it comes to handling large numbers of HTTP links, well, for me at least.
So I was wondering how I would download HTTP links in AppleScript.

Is this done with tell application "Safari" or through Terminal? I have no idea :)

Hi

there are two possibilities:

do shell script "curl http://www.mydomain.com" -- reads the entire contents of the URL

tell application "Safari"
	open location "http://www.mydomain.com"
	set {a, b} to {source, text} of document 1 -- a: the whole source, b: only the text content
end tell

Hello,
I believe I explained it wrong.
I don’t want to view the text or source of a page; I want to download the file itself.
For instance,

I would like to download links…

http://www.halkfhdlsakfj.com/d.dmg
http://www.sfuhoioiiii.com/d.dmg
http://www.iorhfefnlwdc.com/d.dmg

then save them to a directory.
Any thoughts?

Thank you,
Dan Watson

That’s no problem either:

do shell script "curl http://www.halkfhdlsakfj.com/d.dmg -o ~/desktop/d.dmg"

This downloads the file d.dmg to your desktop.
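
If you have a whole list of links, one way (just a sketch, assuming the files should land in ~/Downloads; swap in your own folder and URLs) is to loop over them in AppleScript and call curl once per URL:

-- minimal sketch: the URLs and the download folder are placeholders
set theURLs to {"http://www.halkfhdlsakfj.com/d.dmg", "http://www.sfuhoioiiii.com/d.dmg", "http://www.iorhfefnlwdc.com/d.dmg"}
set downloadFolder to "~/Downloads" -- everything gets saved here

repeat with theURL in theURLs
	-- -O keeps the remote file name; since all three example links end in d.dmg they would
	-- overwrite each other, so use -o with a unique name instead if that matters to you
	do shell script "cd " & downloadFolder & " && curl -L -O " & quoted form of (theURL as text)
end repeat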